
Multimedia Sharing over the Internet from a Mobile Phone

Rui Daniel Casaca da Trindade

Dissertation submitted for obtaining the degree of Master in Electrical and Computer Engineering

Jury:

President: Prof. José Manuel Bioucas Dias

Supervisor: Prof. Paulo Luís Serras Lobato Correia

Co-Supervisor: Eng. Miguel Pereira dos Santos

Member: Prof. Paulo Jorge Lourenço Nunes

April 2010


Acknowledgements

First, I would like to thank everyone who, in some way, supported me, not only in accomplishing this dissertation but also throughout my academic life.

To my supervisors, Prof. Paulo Lobato Correia, for his support, dedication and truly useful advice, and Eng. Miguel Santos, for providing me with the tools and for giving me the opportunity to develop this work.

To my colleagues and friends in IST, especially to Nuno Couto, André Esteves, André Chibeles, Filipa Henriques and Gonçalo Carmo, for their friendship and support.

To my friends from Odemira, especially to André Silva, Rui Silva, Diogo Mariano and Maria João, for their friendship throughout the years.

To my uncle Zé Carlos, for being, as he says, another dad to me, which is completely true.

To my girlfriend Ana Luísa, for her love, encouragement and for giving me the motivation to continuously improve myself.

To my parents and my sister, for their love, dedication and endless patience. I really believe that I could not have a better family.

Last but definitely not least, to my grandparents, all of them. Thank you for everything. I will never forget you.


Agradecimentos

Primeiro gostaria de agradecer a toda a gente que, de alguma forma, me apoiou não apenas na realização desta dissertação, mas durante toda a minha vida académica.

Aos meus orientadores, Prof. Paulo Lobato Correia, pelo seu apoio, disponibilidade e conselhos verdadeiramente úteis, e Eng. Miguel Santos, por me fornecer as condições necessárias e por me dar a oportunidade de desenvolver este trabalho.

Aos meus colegas e amigos do IST, especialmente ao Nuno Couto, André Esteves, André Chibeles, Filipa Henriques e Gonçalo Carmo, pela sua amizade e apoio.

Aos meus amigos de Odemira, especialmente ao André Silva, Rui Silva, Diogo Mariano e Maria João, pela sua amizade ao longo dos anos.

Ao meu tio Zé Carlos, por ser, como ele diz, mais um pai que eu tenho, o que é completamente verdade.

À minha namorada Ana Luísa, pelo seu amor, encorajamento e por me motivar a ser cada vez melhor.

Aos meus pais e à minha irmã, pelo seu amor, dedicação e infinita paciência. Acredito verdadeiramente que não poderia ter melhor família.

Por fim, mas mais importante, aos meus avós, todos eles. Obrigado por tudo. Nunca vos esquecerei.


Abstract

Nowadays, mobile applications are among the most popular services in the telecommunications world. The increased capabilities of mobile phones and the deployment of mobile networks providing high data rates and designed for multimedia communications have created the opportunity to develop connected and fully featured mobile applications. In this context, there are great opportunities for applications that go beyond voice or text transmission, such as video, photo and audio sharing.

This dissertation proposes a set of solutions to share multimedia content from a mobile phone, exploring the technology capabilities and constraints, as well as the usage environment. The multimedia content captured in the mobile device is made available to any client connected to the Internet.

The proposed system supports several multimedia modalities that can be selected and switched according to a set of application scenarios and usage conditions. These solutions include a multimedia uploading application for BlackBerry OS mobile phones, which allows sharing videos, photos and audio clips, and a photo streaming application for Windows Mobile OS mobile phones.

Keywords

Mobile Communications, Mobile Applications, Multimedia Uploading, Multimedia Streaming


Resumo

As aplicações móveis, ou seja, para telemóveis, são nos dias de hoje um dos serviços mais populares no mundo das telecomunicações. O aumento das capacidades dos telemóveis e a implementação de redes móveis desenhadas para comunicações multimédia e que fornecem elevadas velocidades de transmissão de dados criaram a possibilidade de explorar o desenvolvimento de aplicações móveis multifuncionais e conectadas à Internet. Neste contexto, existem grandes oportunidades para desenvolver aplicações que, para além da transmissão de voz ou texto, permitem a partilha de vídeo, fotografias e áudio.

Esta dissertação propõe um conjunto de soluções de partilha de conteúdo multimédia a partir de um telemóvel, explorando as limitações e as potencialidades da tecnologia existente, bem como o seu ambiente de utilização. O conteúdo multimédia capturado no dispositivo móvel é partilhado de forma a ficar acessível a qualquer cliente ligado à Internet.

O sistema proposto implementa várias modalidades multimédia que podem ser seleccionadas e alternadas de acordo com um conjunto de cenários de aplicação e condições de utilização. Estas soluções incluem uma aplicação de upload de multimédia para telemóveis BlackBerry, que permite partilhar vídeos, áudio e fotos, e uma aplicação de streaming de fotos para telemóveis com sistema operativo Windows Mobile.

Palavras-Chave

Comunicações Móveis, Aplicações Móveis, Upload de Multimédia, Streaming de Multimédia


Table of Contents

1. Introduction

1.1. Context and Motivation
1.2. Objectives
1.3. Related Work
1.4. Related Mobile Multimedia Sharing Applications
1.5. Main Contributions
1.6. Dissertation Outline

2. Multimedia Modalities and Application Scenarios

2.1. Factors Influencing Modality Switching
2.2. Switching Points Among Modalities
2.3. Application Scenarios

3. Mobile Networks and Data Transfer Methods

3.1. Mobile Networks
3.1.1. GPRS
3.1.2. EDGE
3.1.3. UMTS
3.1.4. Wi-Fi
3.1.5. Data Rates Comparison
3.2. Data Transfer Methods for Mobile
3.2.1. HTTP Post
3.2.2. Web Services

4. Mobile Application Development

4.1. BlackBerry Application Development
4.2. Windows Mobile Application Development
4.3. Mobile Development Best Practices
4.4. Development Guide
4.4.1. Object-Oriented Programming
4.4.2. Multimedia Development Guidelines
4.4.3. Multithreading Development Guidelines
4.4.4. Network Operations Development Guidelines

5. Sharing Multimedia from a Mobile Phone

5.1. System Modalities and Application Scenarios
5.2. MobiShare: Multimedia Uploading Application
5.2.1. MobiShare Application Architecture
5.2.2. MobiShare: Mobile Phone Component
5.2.3. MobiShare: Client Component
5.3. MobiStream: Photo Streaming Application
5.3.1. MobiStream Application Architecture
5.3.2. MobiStream: Mobile Phone Component
5.3.3. MobiStream: Client Component

6. Performance Analysis

6.1. MobiShare
6.2. MobiStream

7. Conclusions

7.1. Conclusion
7.2. Future Work


List of Figures

Figure 2.1 - OCV model of video content.
Figure 2.2 - OCV model of video content with switching points among modalities.
Figure 3.1 - GSM/GPRS architecture.
Figure 3.2 - UMTS architecture.
Figure 3.3 - Network protocols selected to transfer data in the uplink direction.
Figure 3.4 - Web Services architecture.
Figure 5.1 - Relationship between the modalities and application scenarios in the developed applications.
Figure 5.2 - MobiShare multimedia uploading application architecture.
Figure 5.3 - MobiShare home screen.
Figure 5.4 - Dialog box asking the user to input the desired file name.
Figure 5.5 - Dialog box warning not to share video in no Wi-Fi covered areas.
Figure 5.6 - Embedded audio recorder application.
Figure 5.7 - Embedded photo camera application.
Figure 5.8 - Embedded video camera application.
Figure 5.9 - Uploading screen.
Figure 5.10 - Upload succeeded screen.
Figure 5.11 - Screen with the URL selected.
Figure 5.12 - Audio clip captured by MobiShare playing on a computer web browser.
Figure 5.13 - Photo captured by MobiShare displayed on a computer web browser.
Figure 5.14 - Video captured by MobiShare playing on a computer web browser.
Figure 5.15 - MobiStream photo streaming application architecture.
Figure 5.16 - MobiStream home screen.
Figure 5.17 - MobiStream home screen while camera is running.
Figure 5.18 - Form asking for the streaming ID.
Figure 5.19 - Form containing the web browser displaying the sequence of images.
Figure 6.1 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over Wi-Fi, for captureWaitPeriod values of: a) 300 ms, b) 400 ms and c) 500 ms, for a sequence of 200 photos.
Figure 6.2 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over UMTS, for captureWaitPeriod values of: a) 1500 ms and b) 2000 ms, for a sequence of 100 photos.
Figure 6.3 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over EDGE, for captureWaitPeriod values of: a) 2500 ms and b) 3500 ms, for a sequence of 75 photos.
Figure 6.4 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over GPRS, for captureWaitPeriod values of: a) 3500 ms and b) 5000 ms, for a sequence of 50 photos.


List of Tables

Table 1.1 - Multimedia uploading applications for BlackBerry.
Table 2.1 - Proposed application scenarios according to the identified dimensions.
Table 2.2 - Relationship between the application scenarios and the multimedia modalities.
Table 3.1 - Comparison of mobile networks data rates.
Table 5.1 - Information displayed according to the network change, when video is the selected modality.
Table 5.2 - Information displayed according to the network change, when photo is the selected modality.
Table 5.3 - Information displayed according to the network change, when audio is the selected modality.
Table 6.1 - Upload times of audio clips over Wi-Fi.
Table 6.2 - Upload times of audio clips over EDGE.
Table 6.3 - Upload times of audio clips over GPRS.
Table 6.4 - Upload times of photos over Wi-Fi.
Table 6.5 - Upload times of photos over EDGE.
Table 6.6 - Upload times of photos over GPRS.
Table 6.7 - Upload times of video clips over Wi-Fi.
Table 6.8 - Average frame periods of a sequence of 200 photos over Wi-Fi.
Table 6.10 - Average frame periods of a sequence of 100 photos over UMTS.
Table 6.11 - Average frame periods of a sequence of 75 photos over EDGE.
Table 6.12 - Average frame period of a sequence of 50 photos over GPRS.


Acronyms

2G    Second Generation Mobile Communication System
3G    Third Generation Mobile Communication System
3GPP  Third Generation Partnership Project
4G    Fourth Generation Mobile Communication System
AP    Access Point
API   Application Programming Interface
BS    Base Station
BSC   Base Station Controller
BTS   Base Transceiver Station
CDMA  Code Division Multiple Access
CPU   Central Processing Unit
DMO   DirectX Media Object
DSSS  Direct-Sequence Spread Spectrum
EDGE  Enhanced Data Rates for GSM Evolution
ETSI  European Telecommunications Standards Institute
GGSN  Gateway GPRS Support Node
GPRS  General Packet Radio Service
GSM   Global System for Mobile Communications
HTTP  Hypertext Transfer Protocol
HSDPA High Speed Downlink Packet Access
HSPA  High Speed Packet Access
HSUPA High Speed Uplink Packet Access
ICT   Information and Communication Technologies
IDE   Integrated Development Environment
IEEE  Institute of Electrical and Electronics Engineers
IETF  Internet Engineering Task Force
ITU   International Telecommunication Union
MSC   Mobile Switching Center
OCV   Overlapped Content Value
OFDM  Orthogonal Frequency Division Multiplexing
OOP   Object-Oriented Programming
P2P   Peer-to-Peer
PDA   Personal Digital Assistant
PSNR  Peak Signal-to-Noise Ratio
QoE   Quality of Experience
QoS   Quality of Service
RIM   Research In Motion
RMI   Remote Method Invocation
RNC   Radio Network Controller
RNS   Radio Network Subsystem
RPC   Remote Procedure Call
RTCP  Real-Time Control Protocol
RTP   Real-Time Transport Protocol
RTSP  Real-Time Streaming Protocol
SDK   Software Development Kit
SDP   Session Description Protocol
SGSN  Serving GPRS Support Node
SOAP  Simple Object Access Protocol
TCP   Transmission Control Protocol
TDMA  Time Division Multiple Access
UDP   User Datagram Protocol
UI    User Interface
UMA   Universal Multimedia Access
UMTS  Universal Mobile Telecommunication System
UTRAN UMTS Terrestrial Radio Access Network
VoIP  Voice over Internet Protocol
W3C   World Wide Web Consortium
WAF   Wireless Access Family
Wi-Fi Wireless Fidelity
WLAN  Wireless Local Area Network
WMV   Windows Media Video


1. Introduction

Sharing multimedia content from a mobile phone is one of the mobile services with the greatest technological potential and one that users have increasingly sought. In this context, this dissertation aims to develop a set of solutions to share multimedia content from a mobile phone, exploring the technology constraints and capabilities, as well as the usage environment.

This first chapter starts by describing the dissertation’s context, introducing the main issues of the mobile development area and the social impact of mobile multimedia sharing applications. After establishing the dissertation objectives, related work and mobile applications are analyzed. At the end of the chapter, the dissertation’s contributions and outline are presented.

1.1. Context and Motivation

Nowadays, mobile applications are among the most popular services in the Information and Communication Technologies (ICT) world. The main reasons are the evolution of mobile phones in recent years and their increasing number of users, whether for professional or personal use.

The large investments made by major companies such as Nokia, Apple and Research In Motion (RIM) to increase the processing, memory and graphical capabilities of mobile phones, together with the global deployment of mobile networks designed for data and multimedia communications and providing high data rate services, such as the third generation mobile communication system (3G) [1], have created the opportunity to develop useful, diversified, connected and fully featured mobile applications. In fact, the future will bring more powerful smartphones and, with the forthcoming fourth generation mobile communication system (4G), mobile networks with even higher data rates, which makes mobile application development an area with great potential and very challenging to work in.

In this context, it is clear that there is a great opportunity for applications that go beyond voice or text transmission. Given the growing importance of multimedia in the way people get information and entertainment, applications for sharing different multimedia modalities, such as video, photos and audio clips, are expected to be among the most successful types of mobile applications in the next few years.

Therefore, based on the technological improvements in the mobile area and on the social impact of multimedia sharing applications, this work aims to develop a set of solutions to share multimedia content from a mobile phone, exploring the current capabilities and constraints of the existing technology.

In the dissertation’s context, sharing multimedia has a broad, twofold meaning, since it refers to two different ways of delivering multimedia: streaming and uploading. From the mobile phone’s perspective, streaming involves continuously capturing and sending content to one or several end users, while it is presented at those destinations. Uploading, on the other hand, refers to capturing the whole content and sending it to the end users at once, as soon as it has been completely captured. Note that, in this context, the transmission of data refers to the uplink direction, that is, from the mobile phone to the core network.


The usage environment is also considered, playing an important role during the application execution. A set of application scenarios is defined, according to a group of characteristics related to the audiovisual nature of the content to share and to the operation environment constraints. The grouping into scenarios allows defining the multimedia modalities that can be used in each particular case. Moreover, constant monitoring of the network helps the user select the most adequate multimedia modality, according to the network resources and to the set of scenarios previously defined.

The Operating System (OS) chosen to develop the mobile applications is RIM’s BlackBerry OS. This choice is based on a common interest, shared by the student and by Mobi2do, the commercial company where this dissertation is developed, in increasing the knowledge and experience in developing applications for this operating system. Moreover, the BlackBerry OS reveals some limitations in sharing multimedia content captured on the device, since, by default, the available alternatives for sharing a multimedia file are:

Bluetooth, which is not an option if the device to share the file with is not in the vicinity or is not Bluetooth enabled.

Memory card, which implies plugging the card into the device with which the file is to be shared.

E-mail/MMS, which may not allow uploading large files, such as video files.

Therefore, developing a multimedia sharing application that overcomes these limitations is an extra motivation to use the BlackBerry development environment in this dissertation.

However, due to limitations in the BlackBerry APIs (Application Programming Interfaces), which will be described in detail later, the Windows Mobile OS is also explored. A few other operating systems, such as Apple’s iPhone OS and Google’s Android, were analyzed, but Microsoft’s Windows Mobile proved to be a better choice for the dissertation’s purposes, as it provides lower-level and more flexible camera APIs.

In fact, some APIs developed by the manufacturers are not publicly available or do not provide all the functionality needed to take full advantage of the mobile phone’s capabilities. This is probably one of the main reasons why there are many more solutions to deliver multimedia content to a mobile phone than from a mobile phone.

Hence, this work intends, on the one hand, to propose an alternative solution for sharing multimedia content from a mobile phone and, on the other hand, to make available a set of guidelines and code samples for the development of mobile multimedia sharing applications.

1.2. Objectives

According to the dissertation’s context and motivation, the main goals to achieve are:

Explore and acquire knowledge in mobile application development, especially for BlackBerry OS and Windows Mobile OS, by learning the best practices and producing guidelines for the development of the main software components for both operating systems.

Investigate the main technologies and network protocols supporting data transfer from a mobile phone to a web server.


Analyze the influence of the usage environment on mobile multimedia applications, defining a set of application scenarios.

Develop the following set of applications for sharing multimedia content from a mobile phone, exploring the existing technological constraints and capabilities:

o Application for uploading videos, photos and audio clips from a mobile phone running BlackBerry OS. The usage environment should be taken into account in order to help the user in selecting the most adequate multimedia modality.

o Application for streaming photos from a mobile phone running Windows Mobile OS. The network should be monitored in order to adapt the frame rate to the network conditions.

In the applications developed, both the uploaded and the streamed content should be made available to any client connected to the Internet.

1.3. Related Work

This section briefly presents some of the research literature related to sharing multimedia content from a mobile phone over mobile networks, focusing on the video streaming modality, which is the modality that has received the most attention from the academic community.

Three main types of related work were found:

1. Peer-to-peer (P2P) video streaming between handheld devices.

2. Video coding for mobile handheld conferencing.

3. Video streaming to a mobile device.

The most relevant related work found is the system proposed by Akkanen [2], which uses a Symbian-based mobile device to generate a video stream, either directly from the phone’s camera or from a locally stored file, which is then propagated peer-to-peer over Wi-Fi to listening phones, where it can be viewed. The video stream generated by the mobile phone camera is packed in standard MPEG-4/H.264 format and UDP is used as the transmission protocol. Being a peer-to-peer approach over a WLAN, the video streaming avoids using streaming servers, but is available only to members within the local network, thus not providing a wide video broadcast.

Still regarding peer-to-peer video streaming, Davies conducted a set of experiments [3] [4] to measure the performance of several P2P video streaming applications, such as Skype and GoogleTalk, over UMTS networks. In these experiments, personal computers equipped with 3G cards played the role of the mobile devices.

The development of video codecs able to run on mobile phones has also been a topic receiving some attention. Faichney [5] developed a low-complexity video encoder for Windows CE mobile devices that makes use of G.723 audio compression and non-standard video coding algorithms, to be used in videoconferencing. Yu [6] proposed a low-complexity video coding scheme similar to H.263 for Pocket PCs. Ciaramello [7] analyzed low-complexity methods for identifying face and hand regions in a mobile videoconferencing system.


The majority of the related work found concerns live streaming of video to a mobile phone, that is, in the downstream direction, the opposite of this dissertation’s purpose. However, it is worth mentioning some of the research done in this area, since some of the discussed topics are common to video streaming in both directions, uplink and downlink:

3GPP video streaming over GPRS [8] and over WCDMA [9] networks, by Lundan.

MPEG-4 video delivery over EDGE networks [10], by Basso.

Video Streaming for fast moving users in 3G networks [11], by Kyriakidou.

Well-known and new techniques for streaming video to a mobile device [12], by Walker.

Video Streaming Performance in Live UMTS networks [13], by Weber.

Live video streaming over Wi-Fi [14], by Haratcherev.

Streaming in Mobile Networks [15], by TeliaSonera.

1.4. Related Mobile Multimedia Sharing Applications

As this dissertation is being developed in a business environment, it is useful to identify the already available applications for sharing multimedia from a mobile phone, and their main characteristics.

A search in the BlackBerry application store [16] shows eleven applications similar to the one proposed for BlackBerry, that is, eleven applications for uploading multimedia content. Table 1.1 compares the multimedia uploading modalities supported by these applications.

Table 1.1 - Multimedia uploading applications for BlackBerry.

Application Name    Video Uploading    Audio Uploading    Photo Uploading

Kyte                Yes                No                 Yes
Livecast            Yes                Yes                No
Next2Friends        Yes                No                 Yes
vPost               Yes                Yes                Yes
Avanquest           No                 No                 Yes
Memeo Share         No                 No                 Yes
ShoZu               Yes                No                 Yes
Pictavision         No                 No                 Yes
CellSpin            Yes                Yes                Yes
Show Me!            No                 No                 Yes
Fo2go               No                 No                 Yes

As shown in Table 1.1, only two of the eleven applications offer video, audio and photo uploading simultaneously in a single BlackBerry application. The proposed application provides an alternative solution by monitoring the network characteristics and providing information that helps the user select the most adequate multimedia modality.

No commercial applications for streaming photos from a Windows Mobile device were found. However, a few applications that allow streaming video do exist: Bambuser, LiveCast, LiveCliq and Qik.

1.5. Main Contributions

This dissertation’s main contribution is to propose a solution for sharing multimedia content from a mobile phone. A key feature is the analysis of the usage environment, by proposing a set of possible application scenarios and monitoring the network resources. This analysis helps the user in selecting the best multimedia modality, in the application developed for BlackBerry, and adapts the frame rate according to the network conditions, in the Windows Mobile application, in order to maximize the quality of the experience provided to the user.

Moreover, this work makes available a set of guidelines and code samples for the development of applications for sharing multimedia content from a mobile device, gathering and organizing some of the available information on key topics such as network operations, multithreading and multimedia, and making it easier to access.

An article summarizing the work developed in this dissertation was accepted for oral presentation at the 3rd International Workshop on Future Multimedia Networking, to be held in Krakow, Poland on 17-18 June 2010 [17].

1.6. Dissertation Outline

This dissertation is composed of seven chapters, including this first one, which introduces the thesis context and motivation, objectives, related applications and main contributions.

Chapter 2 starts by highlighting the importance of providing several presentations, or multimedia modalities, of the same content, especially when using mobile devices. Then, the factors influencing the decision to switch from a given modality to another are described and a set of application scenarios is proposed, in order to define the multimedia modalities that can be used in each particular case.

Chapter 3 first describes the networks typically available to the majority of mobile phones, focusing on the network parameters that have a major influence on the developed applications. It then evaluates the existing methods to transfer data from a mobile device to the core network.

Chapter 4 presents an overview of the BlackBerry and Windows Mobile operating systems, from the application developer perspective. In particular, a set of best practices and guidelines for developing mobile applications are described, the object-oriented programming model is introduced and code snippets of common multimedia, threading and data transmission tasks and protocols are provided.

Chapter 5 provides a detailed description of the implemented applications, focusing on their functional structure, architecture and main components. The description includes code snippets corresponding to key tasks and snapshots of the developed applications running at the mobile phones and at the client devices.

Chapter 6 is dedicated to evaluate the performance of the developed applications. A set of tests is performed to measure key properties such as upload time, average frame period and latency under the different mobile networks available, data transfer protocols considered and other specific variables.

Finally, chapter 7 presents the dissertation’s conclusions and identifies topics for future work.


2. Multimedia Modalities and Application Scenarios

Nowadays, the heterogeneity of networks, terminals and users’ preferences is increasing. At the same time, the growing availability and the varied characteristics of multimedia content have raised the importance of supplying different presentations of the same information, in order to provide the end user with the content presentation most suitable for the usage environment.

Mobile devices are terminals for which the ability to provide different presentations of the same content may be crucial, given their operational constraints and limited resources. One way for a mobile phone to provide different presentations of the same information is to select the most appropriate multimedia modality for each application scenario, among video, image and audio. In this dissertation, the process of changing the multimedia modality in use is called multimedia modality switching.

The concept of Universal Multimedia Access (UMA) is behind the multimedia modality switching since UMA calls for the provision of different presentations of the same content, with more or less complexity, suiting the different usage environments in which content will be accessed. “Universal” applies here to the user location (anywhere) and time (anytime) but also to the content to be consumed (anything), even if that requires some change or adaptation to occur [18].

UMA requires a general and broad understanding of the Quality of Experience (QoE) concept, since it involves not only the user’s needs and preferences but also the usage environment characteristics. In this context, QoE plays a key role, since the ultimate objective of any multimedia communication system is to provide the user with the most meaningful experience of the delivered data. The user, not the terminal, should be at the center of the multimedia experience [19].

2.1. Factors Influencing Modality Switching

The factors that may influence the QoE and, therefore, the decision to provide a different content presentation can be clustered into five main classes [20] [21]:

a. User preferences and limitations, reflecting the user’s interest in using different modalities or even his or her impairments, e.g. color blindness.

b. Surrounding environment, reflecting the influence of the usage environment on the modality selection. For example, images or text might be more suitable than audio if a user is located in a noisy place.

c. Content characteristics, reflecting the content suitability to be represented using alternative modalities. For example, a news video might be represented in textual format, while a dance video could not.

d. Terminal capabilities and limitations. The heterogeneity and the large number of terminals, with different characteristics, might influence the modality switching process:

i. Display and capture capabilities, because the mobile terminal’s ability to display and capture multimedia may determine the available modalities. For example, the modalities supported by a text-only pager are much more limited compared to those supported by a high-end mobile phone with a built-in camera.


ii. Processing capabilities, as some terminals might not be able to execute different tasks simultaneously or to operate in real time with high-complexity codecs.

iii. Memory capabilities, as some terminals might not have enough memory available for the requested content.

iv. Software capabilities, as some operating systems do not provide essential APIs to capture, stream or play a specific modality.

e. Network characteristics. The wireless medium is shared and the frequency spectrum is limited, so both transmitted power (error rate) and bandwidth (data rate) are constrained. Additionally, wireless conditions change frequently because of interference, fading, multipath, mobility and traffic load [14]. All these factors influence the following Quality of Service (QoS) parameters:

i. Latency, which measures the time delay experienced in a system. This is a key parameter in time-sensitive applications such as live streaming, which aims to provide a real-time experience.

ii. Jitter, which in a packet-switched network measures the variability over time of the packet latency across the network (a network with constant latency has no jitter). If the jitter value is high, the time difference between packets varies over time, which may reduce the QoE. For example, in a photo streaming scenario, if the time difference between images varies along time, the user’s perception of the event being represented may be affected.

iii. Packet loss, which occurs when one or more data packets traveling across a packet-switched network fail to reach their destination. Packet loss may cause loss of information, if the data transmission protocol does not provide packet retransmission, or may introduce delay, if the protocol does implement retransmission.

Usually the decision to choose one modality or to switch among modalities is not influenced by one single factor, but by a combination of the above listed factors, with some of them assuming more relevance in different application scenarios.
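As a concrete illustration of the QoS parameters listed above, the following sketch shows how per-packet latency and jitter could be estimated from send and acknowledgement timestamps collected at the sending side. It is plain Java written for illustration only; the class and method names are hypothetical and do not belong to the applications developed in this dissertation.

import java.util.List;

// Illustrative sketch: estimating mean latency and jitter from per-packet
// send/acknowledgement timestamps (all values in milliseconds).
public final class QosEstimator {

    // Mean latency: average difference between acknowledgement and send times.
    public static double meanLatencyMs(List<Long> sentAtMs, List<Long> ackedAtMs) {
        double sum = 0;
        for (int i = 0; i < sentAtMs.size(); i++) {
            sum += ackedAtMs.get(i) - sentAtMs.get(i);
        }
        return sum / sentAtMs.size();
    }

    // Jitter: average absolute variation between consecutive packet latencies
    // (a network with constant latency yields zero jitter).
    public static double jitterMs(List<Long> sentAtMs, List<Long> ackedAtMs) {
        double variation = 0;
        for (int i = 1; i < sentAtMs.size(); i++) {
            long current = ackedAtMs.get(i) - sentAtMs.get(i);
            long previous = ackedAtMs.get(i - 1) - sentAtMs.get(i - 1);
            variation += Math.abs(current - previous);
        }
        return variation / (sentAtMs.size() - 1);
    }
}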

2.2. Switching Points Among Modalities

In order to answer the questions “At what resource constraint point should modality switching occur?” and “What is the destination modality?”, that is, in order to find the switching points among modalities, the Overlapped Content Value (OCV) model can be used [20]. This model represents the relationship between the amount of resources and the content value through rate-quality curves, that is, curves that represent, for each modality, the quality of the content as a function of the available resources.

According to the OCV approach, the amount of resource refers to terminal and network characteristics, such as, for example, the spatial resolution or the transmission data rate. For instance, in a photo streaming scenario, when the data rate is too low, sending a sequence of selected key images would be more appropriate than streaming a low quality or lagging sequence of photos.

The OCV model also states that the content value is defined by two key quality aspects: the perceptual quality, which refers to a user’s satisfaction in perceiving the content, regardless of what information composes it; and the semantic quality, which refers to the amount of the information a user obtains from the content, regardless of how it is presented [20].


A commonly used method to evaluate the content value is to use the Peak Signal-to-Noise Ratio (PSNR) when dealing with images or video.

An example of an OCV model of video content is represented in Figure 2.1 and Figure 2.2.

Figure 2.1 - OCV model of video content [20].

Figure 2.2 – OCV model of video content with switching points among modalities [20].

As shown in Figure 2.2, the intersection points of the modality curves represent the optimal switching points among modalities.
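To make the idea of a switching point concrete, the sketch below takes two content-value curves sampled on the same resource grid and returns the first resource level at which the richer modality overtakes the lighter one. The curves and the sample values are illustrative assumptions, not the measured curves of the OCV model in [20].

// Illustrative sketch: locating the switching point between two modalities
// from sampled rate-quality (content value) curves.
public final class SwitchingPointFinder {

    // resource: sampled resource levels (e.g. data rate in kbit/s), in increasing order.
    // lightValue / richValue: content value of the lighter / richer modality at each level.
    // Returns the first resource level where the richer modality wins, or -1 if it never does.
    public static double switchingPoint(double[] resource, double[] lightValue, double[] richValue) {
        for (int i = 0; i < resource.length; i++) {
            if (richValue[i] >= lightValue[i]) {
                return resource[i];
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // Hypothetical sampled curves: below about 200 kbit/s photo streaming offers more
        // content value than low-quality video; above that, video streaming takes over.
        double[] rateKbps   = { 50, 100, 150, 200, 250, 300 };
        double[] photoValue = { 40,  45,  48,  50,  51,  52 };
        double[] videoValue = { 10,  25,  40,  52,  60,  65 };
        System.out.println("Switch to video above " + switchingPoint(rateKbps, photoValue, videoValue) + " kbit/s");
    }
}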

The OCV model is a useful tool to determine the modality switching points. However, given the subjectivity associated with the measurement of the content value, the decision to switch among modalities is ultimately taken by the user; even so, in the applications developed in this dissertation, suggestions are made to the user, under certain conditions, about the possibility to switch from one modality to another.

2.3. Application Scenarios

Besides the need to switch modality due to the changing amount of available resources during a transmission, there are situations, or application scenarios, in which the modality switching occurs because some modalities can be more suitable than others to represent the desired contents.

In order to identify the supported modalities for each situation and to allow an easier selection of the best modality among those available, a set of application scenarios is defined, according to the operation environment constraints and to a group of characteristics, or dimensions, closely related to the audiovisual nature of the multimedia modalities.

The four major dimensions found relevant to define the conceivable application scenarios are:

Event duration. The time length of the event to capture and share is one of the distinguishing factors among application scenarios. The duration of an event can be either instantaneous, when the situation to represent has no temporal dimension, such as in a photography scenario, or non-instantaneous, which includes situations that take place over a certain period of time, whether short or long.

Real-time transmission. The possibility to share an event live, while it occurs, or with delay, after it has occurred, has a key role in the classification of application scenarios. In general, streaming is the multimedia delivery method associated with the first case, while uploading is associated with the second. However, uploading may also be used to share events in real time, if the files to be sent in sequence are small.

Visual content. The visual meaning or information associated with an event is also a factor for classifying the different scenarios. In fact, there are scenarios in which the visual content has no meaning or is not worth sharing, scenarios where the visual content is the only type of valuable content and, finally, situations where the visual content is as relevant as other types of content.

Audio content. Similarly to the previous dimension, the audio meaning or information associated with an event allows classifying the application scenarios.

According to the identified dimensions, the proposed mobile multimedia sharing application scenarios are presented in Table 2.1.

Table 2.1 - Proposed application scenarios according to the identified dimensions.

Scenario   Event Duration      Real-Time Transmission   Visual Content   Audio Content   Application Examples

A          Instantaneous       -                        Yes              -               Photography
B          Non-instantaneous   Yes                      Yes              Yes             Sports match, music show
C          Non-instantaneous   Yes                      Yes              No              Surveillance
D          Non-instantaneous   Yes                      No               Yes             VoIP
E          Non-instantaneous   No                       Yes              Yes             Social networking
F          Non-instantaneous   No                       Yes              No              Slideshow
G          Non-instantaneous   No                       No               Yes             Audio interviews

The main characteristics of the application scenarios proposed are:

Scenario A – The first scenario corresponds to instantaneous events, with no temporal dimension. Therefore, given its nature, only visual content matters. This is the appropriate scenario for photography-related events, such as sightseeing or a gathering of friends or family, for example.

Scenario B – The second scenario corresponds to events with both visual and audio content, which occur for a certain period of time and whose sharing is intended to be done in real time. Sharing a sports game, a music concert or an entertainment show live with some friends is the type of event included in this scenario.

Scenario C – This scenario corresponds to events for which only the visual content matters, which occur for a certain period of time and which are intended to be shared in real time. This is an appropriate scenario for surveillance systems, whether for private purposes, such as providing security for companies and houses, or for public purposes, such as preventing forest fires and providing information about the weather and sea conditions on beaches.

Scenario D – The fourth scenario corresponds to events that only have audio content, occur for a certain period of time and are intended to be shared in real time. This scenario is appropriate for real-time voice transmission, such as VoIP (Voice over Internet Protocol).

Scenario E – The fifth scenario is similar to the second scenario, scenario B, except that the event is not intended to be shared in real time, that is, it can be recorded and stored or shared afterwards. Examples of events covered by this scenario are audiovisual situations intended to be posted on websites and social networks such as YouTube or Facebook.

Scenario F – This scenario is similar to scenario C, except that the event does not have to be shared in real time. Sharing videos with no audio content, such as slideshows, is an example of application of this scenario.

Scenario G – The last scenario considered is similar to scenario D, except that the event is not intended to be shared in real time. This is the proper scenario for audio interviews, for example.

Given the two different solutions considered for sharing multimedia content, that is, streaming and uploading, and the three types of multimedia content available, that is, video, photo and audio, six different multimedia modality sharing options can be defined. Table 2.2 establishes a relationship between the application scenarios and the supported modalities that may be selected for usage in those scenarios.

Table 2.2 - Relationship between the application scenarios and the multimedia modalities.

Modality \ Scenario   A    B    C    D    E    F    G

Video Streaming       x    √    √    √    x    x    x
Photo Streaming       x    x    √    x    x    x    x
Audio Streaming       x    x    x    √    x    x    x
Video Uploading       x    x    x    x    √    √    √
Photo Uploading       √    x    x    x    x    √    x
Audio Uploading       x    x    x    x    x    x    √


Video streaming is a modality able to support scenarios B, C and D, given its ability to deliver visual and audio content in real time. However, note that the best modality to represent a specific scenario may not be the most resource-consuming one. For example, photo streaming and audio streaming are probably the most adequate modalities for scenarios C and D, respectively, given their visual-only and audio-only content transmission, although video streaming can be used as well in both cases.

Video uploading can be used to support scenarios E, F and G, as it uploads audiovisual content. Photo uploading, given its snapshot nature, that is, given that it consists in taking and uploading a single photograph, supports scenarios A and F. Scenario G can also be supported by audio uploading.
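Table 2.2 can be encoded directly in software so that, once the user chooses a scenario, the application can list the modalities available for it. The sketch below is a hypothetical Java encoding of that table; the class and method names are not taken from the applications described in later chapters.

import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;

// Illustrative sketch: the scenario/modality mapping of Table 2.2 encoded as a lookup table.
public final class ModalityTable {

    public enum Scenario { A, B, C, D, E, F, G }

    public enum Modality {
        VIDEO_STREAMING, PHOTO_STREAMING, AUDIO_STREAMING,
        VIDEO_UPLOADING, PHOTO_UPLOADING, AUDIO_UPLOADING
    }

    // Table 2.2, encoded as: modality -> set of scenarios it supports.
    private static final Map<Modality, EnumSet<Scenario>> SUPPORTED =
            new EnumMap<Modality, EnumSet<Scenario>>(Modality.class);
    static {
        SUPPORTED.put(Modality.VIDEO_STREAMING, EnumSet.of(Scenario.B, Scenario.C, Scenario.D));
        SUPPORTED.put(Modality.PHOTO_STREAMING, EnumSet.of(Scenario.C));
        SUPPORTED.put(Modality.AUDIO_STREAMING, EnumSet.of(Scenario.D));
        SUPPORTED.put(Modality.VIDEO_UPLOADING, EnumSet.of(Scenario.E, Scenario.F, Scenario.G));
        SUPPORTED.put(Modality.PHOTO_UPLOADING, EnumSet.of(Scenario.A, Scenario.F));
        SUPPORTED.put(Modality.AUDIO_UPLOADING, EnumSet.of(Scenario.G));
    }

    // Returns the modalities marked as supported for the given scenario.
    public static EnumSet<Modality> modalitiesFor(Scenario scenario) {
        EnumSet<Modality> result = EnumSet.noneOf(Modality.class);
        for (Map.Entry<Modality, EnumSet<Scenario>> entry : SUPPORTED.entrySet()) {
            if (entry.getValue().contains(scenario)) {
                result.add(entry.getKey());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Scenario C (surveillance): video streaming and photo streaming.
        System.out.println(modalitiesFor(Scenario.C));
    }
}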


3. Mobile Networks and Data Transfer Methods

Transferring data between a mobile phone and a backend server, whether for watching videos, browsing the Internet or sending an e-mail, is one of the main actions performed in mobile communications nowadays. To enable the transfer of data, several types of mobile networks can be used, together with a selection of network protocols. These are the main issues addressed in this chapter.

3.1. Mobile Networks

This section introduces the networks typically available to the majority of mobile phones. The focus is on the networks specifically designed for mobile phones, such as GPRS, EDGE and UMTS, and on wireless LANs such as Wi-Fi/IEEE 802.11.

3.1.1. GPRS

General Packet Radio Service (GPRS) is the packet-switched data service supported by the Global System for Mobile Communications (GSM) [22], the most successful 2G mobile phone system. It was standardized by the European Telecommunications Standards Institute (ETSI) and by the 3rd Generation Partnership Project (3GPP).

GPRS provides data rates up to 171.2 kbit/s in both downlink and uplink directions [23]. However, typical data rates are much lower. GPRS data rate performance is limited by the number of time slots available, since its technology is based on Time Division Multiple Access (TDMA), and by the coding scheme in use, which becomes more robust as radio link conditions degrade, that is, when connections need more error protection.

Figure 3.1 shows the basic architecture of a GPRS network.

Figure 3.1 - GSM/GPRS architecture [8].

GPRS attempts to reuse the existing GSM network elements as much as possible, but some new network elements, such as the Serving GPRS Support Node (SGSN) and the Gateway GPRS Support Node (GGSN), must be added. Additionally, some new protocols for handling packet traffic are required on the existing equipment, namely the Base Transceiver Station (BTS) and the Base Station Controller (BSC). The Mobile Switching Center (MSC) is required for circuit-switched traffic only.

3.1.2. EDGE

Enhanced Data Rates for GSM Evolution (EDGE) is the evolutionary upgrade of GSM's packet-switched data service, GPRS [24]. It is a backward-compatible technology that provides improved data rates as an extension on top of the GSM/GPRS network. EDGE is considered a 3G technology, being part of the International Telecommunication Union (ITU) 3G definition [25], and is standardized by 3GPP.

EDGE adds always-on data capabilities and provides higher data rates than GPRS, up to 473.6 kbit/s in both downlink and uplink directions [26], by using a larger number of more sophisticated adaptive modulation and coding schemes [10]. As a result, high-bandwidth applications such as multimedia services take advantage of EDGE's increased data capacity.

3.1.3. UMTS

Universal Mobile Telecommunication System (UMTS) is one of the 3G mobile technologies developed under 3GPP and is part of the ITU IMT-2000 standard. As a 3G technology, UMTS was developed to offer effective delivery of data with rich multimedia content, supplying the user with services beyond the voice or simple data transmission that are characteristic of GSM [23].

The maximum peak data rates provided by UMTS are 2 Mbit/s in the downlink direction and 768 kbit/s in the uplink direction [26]. Unlike GSM, the shared resource in UMTS is power instead of frequency, and Code Division Multiple Access (CDMA) is used instead of TDMA on the radio interface [27]. Hence, mobile phones can use the same frequency band without time slots, and the power required for transmission increases with the data rate needed.

The basic architecture of a UMTS network is illustrated in Figure 3.2.

Figure 3.2 - UMTS architecture [9].


The architecture can be divided into two elements: the UMTS Terrestrial Radio Access Network (UTRAN) and the Core Network. The UTRAN is divided into Radio Network Subsystems (RNSs), each consisting of several Base Stations (BSs), also known as Nodes B, and the controlling element, the Radio Network Controller (RNC). The Core Network consists of a circuit-switched part and a packet-switched part. Figure 3.2 shows only the packet-switched part and its main components: the SGSN and the GGSN [9].

Currently, High Speed Packet Access Evolution (HSPA+) protocols for UMTS are being deployed in some locations, in order to increase the downlink and uplink data rates through High Speed Downlink Packet Access (HSDPA+) and High Speed Uplink Packet Access (HSUPA+), respectively [28]. The theoretical maximum data rates provided by these new protocols are 42 Mbit/s for HSDPA+ and 11.5 Mbit/s for HSUPA+ [26]. However, the majority of mobile phones do not support these protocols yet.

3.1.4.Wi-Fi

Wireless Fidelity (Wi-Fi) generally refers to any type of IEEE 802.11 Wireless Local Area Network (WLAN) and allows a Wi-Fi enabled device, such as a computer or a mobile phone, to connect to the internet through an Access Point (AP), that is, a wireless router [29].

The IEEE 802.11 standard was developed by the Institute of Electrical and Electronics Engineers (IEEE), operates in the 2.4 GHz or 5 GHz bands and provides maximum data rates from 11 Mbit/s to 54 Mbit/s, depending on the protocol in use. Its modulation is either Direct-Sequence Spread Spectrum (DSSS) or Orthogonal Frequency Division Multiplexing (OFDM) [30].

Compared to GPRS, EDGE and UMTS, accessing the internet from a mobile phone through Wi-Fi has the advantage of higher data rates at a lower cost. On the other hand, there are several disadvantages, such as the smaller coverage area, since Wi-Fi is only available in the vicinity of an AP while the remaining networks are available almost worldwide; moreover, the IEEE 802.11 standard is designed for still or slow moving users, while the other networks support fast moving users.

3.1.5.Data Rates Comparison

As discussed before, data rate is a key parameter in mobile communications, and specifically, in the multimedia sharing applications proposed, since it has a major influence on the performance of streaming applications and it determines the amount of time that a file takes to be uploaded.

Table 3.1 summarizes the theoretical maximum downlink and uplink data rates of the mobile networks described in the previous sections.

Table 3.1 - Comparison of mobile networks data rates [26].

Data Rates [Mbit/s] GPRS EDGE UMTS Wi-Fi

Downlink 0.171 0.474 2 54

Uplink 0.171 0.474 0.768 54


The values presented in Table 3.1 are the theoretical peak values, and therefore much higher than the typical data rates provided by real networks. However, they give an idea of the differences among the data rates of the addressed networks and can be used for comparison purposes. Hence, it is clear that Wi-Fi has a great advantage when compared to the other mobile networks, notably for upstream communications, given that it provides much higher data rates in the uplink direction. Note that the data rate of 54 Mbit/s is the highest data rate offered by Wi-Fi using the IEEE 802.11 protocols present in the majority of mobile phones; there are other IEEE 802.11 protocols offering higher data rates.
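
As an illustration of what these differences mean in practice for the proposed applications, the following Java sketch estimates the upload time of a file at the theoretical uplink rates of Table 3.1 (the file size is an example value; real upload times are longer than these estimates, since typical data rates are well below the peaks).

//Illustrative upload time estimation using the theoretical uplink data rates
//of Table 3.1 (example values only).
public class UploadTimeEstimate {
    public static void main(String[] args) {
        double fileSizeMbit = 5.0 * 8.0; //example: a 5 MB video clip, in Mbit

        String[] networks = {"GPRS", "EDGE", "UMTS", "Wi-Fi"};
        double[] uplinkMbps = {0.171, 0.474, 0.768, 54.0};

        for (int i = 0; i < networks.length; i++) {
            double seconds = fileSizeMbit / uplinkMbps[i];
            System.out.println(networks[i] + ": ~" + Math.round(seconds) + " s");
        }
    }
}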

3.2. Data Transfer Methods for Mobile

The data transfer methods available for mobile phone communications are addressed in this section. The available alternatives for transferring data between a mobile phone and a web server located within the network core are discussed, focusing on the uplink direction, in accordance with the dissertation's purposes.

There are several alternative solutions to transfer data between a mobile phone and a web server. The most common are listed below:

a. Streaming protocols:

i. Real-Time Transport Protocol (RTP).

ii. Real-Time Control Protocol (RTCP).

iii. Real-Time Streaming Protocol (RTSP).

iv. Session Description Protocol (SDP).

b. Hypertext Transfer Protocol (HTTP).

c. Socket connections.

d. Remote Procedure Calls (RPCs):

i. Remote Method Invocation (RMI), for Java based devices.

ii. Microsoft .NET Remoting, for devices running Microsoft’s OSs.

e. Web Services.

The streaming protocols enumerated belong to the application layer and rely on transport layer protocols such as the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP) [15]. Although these streaming protocols can be used in the downlink direction (some mobile phones are able to receive and play back multimedia files transmitted through these streaming protocols), most of them cannot be easily applied in the uplink direction, since there are no libraries implementing them on the mobile devices to act as streaming sources.

Socket connections [31] [32], which rely directly on transport layer protocols, as well as higher level RPCs such as Java RMI, for BlackBerry Java applications [33], can also be used to transfer data over a network. However, these methods require an open port to establish connections, which can lead to security issues, causing communication to be blocked when data is transferred over different networks, each one secured by its own firewall. There is also Microsoft .NET Remoting, which, however, cannot be used on Windows Mobile because it is not supported by the .NET Compact Framework, the .NET Framework version available for Windows Mobile devices [34].

The remaining network data transfer methods, notably HTTP, through its Post method, and Web Services (which are built upon HTTP, the Simple Object Access Protocol (SOAP) and RPC), establish connections typically over port 80, which is usually open, and have APIs available for implementation on mobile phones. Therefore, as these protocols do not have the implementation and security problems faced by the other alternatives, they are the methods chosen to transfer data in the uplink direction in the proposed applications, as illustrated by Figure 3.3.

Figure 3.3 - Network protocols selected to transfer data in the uplink direction.

3.2.1.HTTP Post

As defined in RFC 2616, “HTTP is an Application Level protocol for distributed, collaborative, hypermedia information systems” [35], developed by the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF).

An HTTP session is a sequence of network request-response transactions. An HTTP client initiates a request, establishing a TCP connection typically to port 80 on a host, that is, an HTTP server listening for a client request message. This message consists of a line describing the request method, such as Get or Post, several headers and the message body.

The Post method is used when the requested action changes data on the server, such as updating data in a database or storing an uploaded file. The data is included in the body of the request and any kind of data can be sent, from byte arrays to serialized objects.

Since HTTP is implemented over TCP it is not the ideal streaming method, as for real time applications like streaming, dropping packets is often preferable to waiting for delayed packets. TCP originates delay because it provides a reliable and ordered delivery of a stream of bytes by retransmitting packets when packet loss occurs. UDP, on the contrary, does not originate delay since it does not retransmit packets when packet loss occurs, being as a consequence a more suitable protocol to support streaming applications. One way to overcome the problem of using HTTP as a streaming method may be to upload packets as small as possible, that is, sending a small number of bytes, reducing the bit error probabilities and the delays originated by packet retransmission.

3.2.2.Web Services

Web Services are a new approach to object-oriented RPCs defined by the W3C [36]. An RPC is an inter-process communication technology that allows an application running in a computer to execute a procedure or method in another computer, without the programmer explicitly coding the details for this remote interaction, that is, the programmer calls a method writing the same code whether the method is local to the executing program or remote. The software components providing the replacement implementations which allow remote interaction are called stubs.

However, Web Services are more than an RPC implementation. The Web Services protocol innovation was to lay the RPC on top of the HTTP application layer protocol, providing a solution that simultaneously overcomes the security issues caused by firewall limitations and is easy to deploy, because Web Services become available to everyone by being registered in an independent registry or service broker defined in a Universal Description Discovery and Integration (UDDI) platform.

In particular, the RPC requests and responses are contained within SOAP packets that rely on the eXtensible Markup Language (XML) for data representation and are encapsulated by HTTP packets. The service registers itself within the UDDI platform using a descriptor called a Web Services Deployment Descriptor (WSDD), which provides a brief description of the service and is written using a special XML scheme. Also, the service provides the information needed by the compiler to generate the client and server stubs using a descriptor written in the Web Services Description Language (WSDL), which is also defined by using a special XML schema. Both WSDL and WSDD can be either composed manually or automatically generated. The Web Services architecture is shown in Figure 3.4.

Figure 3.4 - Web Services architecture [37].

When using Web Services, data is transmitted as an argument of the methods or procedures executed remotely, so any kind of data can be sent.


Like HTTP, Web Services are not the ideal solution to stream media. They have the HTTP related issues mentioned in section 3.2.1, as they are layered on HTTP, and have additional performance issues caused by the XML serialization/deserialization, which involves extra header information, string matching and data type casting. According to some studies, when used over a network, Web Services are ~8.8 times slower than RMI, being nevertheless the best alternative available, among those considered [38] [39]. Given their larger overhead, the measured differences are less marked when the data volume is larger. So, sending large data chunks instead of small data chunks may be a possible way to increase performance [40].

To sum up, there are several mobile networks and data transfer methods that may be used for the dissertation's purpose. Regarding the mobile networks discussed, GPRS, EDGE, UMTS and Wi-Fi, the last one distinguishes itself from the others by providing considerably higher data rates, though it has a much more limited coverage. HTTP Post and Web Services were the data transfer methods chosen to be implemented in the developed applications.


4. Mobile Application Development

The goal of this section is to present an overview of the BlackBerry and the Windows Mobile operating systems from the application developer perspective, as well as provide some useful guidelines regarding key topics of mobile application development. In particular, a set of best practices and guidelines for developing mobile applications are described, the object-oriented programming model is introduced and code snippets of common multimedia, threading and data transmission tasks and protocols are provided.

4.1. BlackBerry Application Development

The BlackBerry key features are security and push e-mail, that is, the ability to receive e-mail in near real time, wherever the device can access a wireless network, making it one of the most appropriate mobile solutions for professional use. The BlackBerry development framework is based on the Java programming language [41]. It has a native Integrated Development Environment (IDE), plug-ins for Eclipse IDE and for NetBeans IDE and it also has simulators and debuggers in which applications can be tested and debugged before being deployed on real devices.

From the application developer perspective, the main advantage of BlackBerry is the possibility to use the push service, which allows sending data to a client application proactively, reducing the processing requirements and the network traffic, as data is sent only when necessary, avoiding unnecessary information exchanges. For example, the network traffic and the processor load associated with an event notification application implemented using the push service are smaller when compared to a similar application that constantly connects to a remote server checking for new events to be notified. On the other hand, the disadvantages include the fact that some APIs are protected, as their usage requires the payment of a fee or code signing. Code signing means that some APIs need to be signed by RIM every time an application is compiled and deployed on a real phone. Another limitation is the impossibility of cross-platform deployment.

4.2. Windows Mobile Application Development

As a Microsoft product, Windows Mobile benefits from some well established Windows features and applications and from extensive support. It is used in a wide variety of third-party hardware platforms, such as Personal Digital Assistants (PDAs) and mobile phones.

There are several programming languages available for developing a Windows Mobile application, as both native code (C++) and managed code (C# and Visual Basic) can be used [42]. Microsoft typically releases Windows Mobile Software Development Kits (SDKs) that work in conjunction with their Visual Studio development environment, including several simulators and debuggers.

From the developer perspective, the main advantages of the Windows Mobile development environment are its low-level, flexible and widely available APIs, the possibility to use both managed and native code and its extensive developer community and support. Its main disadvantage is that the kernel of Windows Mobile, based on Windows CE 5.2, has not been updated since 2004, meaning that it is not optimized for modern processors and some new technologies.


4.3. Mobile Development Best Practices

Given the resource constrained nature of mobile phones, the development of mobile applications has to comply with some special rules that typically do not apply for non-mobile devices, such as laptops or desktop computers.

Therefore, the following characteristics of mobile phones must be taken into account when developing a mobile application [43] [44]:

A smaller screen size that cannot display high resolution images.

Wireless network connections with a longer latency and reduced data rate.

Slower processor speeds.

Less available memory.

Shorter battery life.

In this context, some of the best practices for developing a mobile application are included in the following list:

Using or extending the existing User Interface (UI) components so that an application can inherit the default component behavior.

Following the standard navigation model so that a particular user action produces a consistent result across applications. For example, allow users to open the main menu in all applications by clicking a specific button.

Making all actions available from the main menu.

Displaying only the information that users need at any moment, focusing on the user’s immediate task.

Designing the UI to allow users to change their mind and undo commands that might be selected accidentally. For example, use an alert dialog box to notify users of a critical action such as deleting data from their devices.

Implementing dialog boxes using short and concise sentences to clearly state the reason for displaying the dialog box and the actions that can dismiss it.

Displaying a progress bar when an operation takes more than a few seconds to complete.

Using clear and consistent labels throughout the application.

Applying these best practices can help in providing more efficient and more user friendly mobile applications.

4.4. Development Guide

This section provides an introduction to the Object-Oriented Programming (OOP) paradigm, used by both application development environments used in this dissertation, BlackBerry and Windows Mobile.


Moreover, some code snippets related to multimedia, threading and data transmission tasks and protocols used in the development of the proposed mobile applications are provided.

4.4.1.Object-Oriented Programming

The OOP paradigm is basically a programming model centered around "objects" rather than "actions". This paradigm distinguishes itself by focusing on the "objects", or data structures to be manipulated, rather than the logic required to manipulate them. An object is an entity often related to a particular real-world concept and it is characterized by a set of attributes or properties. Examples of objects range from human beings (described by name, age, gender and so forth), to vehicles (described for example by number of wheels) and to software application components.

Each object is an instance or a realization of a class. A class defines the abstract characteristics of an object, including the object’s attributes and the object’s behaviors or methods, that is, the actions it can perform.

Some fundamental concepts related to the OOP paradigm are:

Inheritance – Way to create new classes using classes that have already been defined. The new classes, or sub-classes, inherit the attributes and behavior of the pre-existing classes, or super-classes, reusing the existing code with few modifications. For example, the class vehicle might have subclasses called motorcycle, car or truck (see the sketch after this list).

Data Abstraction and Encapsulation – Abstraction is representing essential components without including the background details or explanations. Encapsulation is storing data and functions in a single unit (a class or an interface, in this case), that can be then called by a single name. The reason for data abstraction and encapsulation is to prevent clients of an interface from depending on those parts of the implementation that are likely to change in the future, thereby allowing those changes to be easily made, independently of clients. Keywords such as public, protected or private determine whether a class, a method or an object is available to all classes, sub-classes or only the defining class, respectively.

Polymorphism – Redefinition of a method with the same identifier in several classes, which may have different implementations.
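
The following minimal Java sketch, which reuses the vehicle example mentioned above (class and method names are illustrative only), ties inheritance, encapsulation and polymorphism together.

//Inheritance, encapsulation and polymorphism in a minimal vehicle example.
public class OopExample {
    //Super-class encapsulating a private attribute behind a public method.
    static class Vehicle {
        private int wheels; //encapsulated attribute

        Vehicle(int wheels) { this.wheels = wheels; }

        public String describe() { return "vehicle with " + wheels + " wheels"; }
    }

    //Sub-classes inherit from Vehicle and redefine describe() (polymorphism).
    static class Car extends Vehicle {
        Car() { super(4); }
        public String describe() { return "car: " + super.describe(); }
    }

    static class Motorcycle extends Vehicle {
        Motorcycle() { super(2); }
        public String describe() { return "motorcycle: " + super.describe(); }
    }

    public static void main(String[] args) {
        Vehicle[] fleet = {new Car(), new Motorcycle()};
        for (int i = 0; i < fleet.length; i++) {
            //The method executed depends on the actual object type.
            System.out.println(fleet[i].describe());
        }
    }
}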

OOP has the following advantages over conventional approaches:

More suitable to model the way the real world works.

Supports code re-usage and extensibility, being easier to modify or add new functionalities.

Clear modular structure that provides easy component replacement and allows defining abstract data types where implementation details are hidden.

The programming language used in BlackBerry application development (Java) and the programming languages available to develop applications for Windows Mobile (C++, C# and Visual Basic) are based on the OOP model, following its principles.


4.4.2.Multimedia Development Guidelines

When developing a multimedia application, and more specifically, when developing an application to record video, record audio or take pictures, it is necessary to use the device’s camera and microphone.

Multimedia Development Guidelines for BlackBerry

The available alternatives for using the device’s camera and microphone in BlackBerry are, depending on the OS version, invoking the embedded recorder application or using its specific video camera, photo camera or microphone APIs.

The following code snippet shows how to invoke the embedded video camera application, which is the solution implemented in the developed applications.

//Retrieve the handle number for the video recorder application
int moduleHandle = CodeModuleManager.getModuleHandle("net_rim_bb_videorecorder");

//Retrieve the descriptive information for the video recorder application
ApplicationDescriptor[] apDes = CodeModuleManager.getApplicationDescriptors(moduleHandle);

//Try-catch block for handling exceptions
try {
    //Start the video recorder application using the Application Manager
    ApplicationManager.getApplicationManager().runApplication(apDes[0]);
} catch (Exception e) {
    e.printStackTrace();
}

If the objective is to invoke the photo camera or the microphone, the getModuleHandle argument should be "net_rim_bb_camera" or "net_rim_bb_voicenotesrecorder" respectively.

This approach uses the video recorder, photo camera and voice recorder Java code modules installed on the device and their descriptors. The ApplicationManager class manages all Java applications on the device, allowing applications to be run immediately or at a specific time, to interact with other applications or to lock/unlock the handheld, using the corresponding application descriptors.
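
Since the three embedded recorder applications differ only in the module name, the invocation logic above can be wrapped in a single helper method, as sketched below (the invokeEmbeddedRecorder method name is an assumption used for illustration, not part of the BlackBerry API).

//Generic helper invoking an embedded recorder application by module name,
//e.g. "net_rim_bb_videorecorder", "net_rim_bb_camera" or
//"net_rim_bb_voicenotesrecorder" (illustrative helper, not an API method).
private void invokeEmbeddedRecorder(String moduleName) {
    //Retrieve the handle and the descriptors of the embedded application
    int moduleHandle = CodeModuleManager.getModuleHandle(moduleName);
    ApplicationDescriptor[] apDes = CodeModuleManager.getApplicationDescriptors(moduleHandle);

    try {
        //Start the embedded application using the Application Manager
        ApplicationManager.getApplicationManager().runApplication(apDes[0]);
    } catch (Exception e) {
        e.printStackTrace();
    }
}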

Multimedia Development Guidelines for Windows Mobile

For Windows Mobile, the most common solution to capture and play back video is to use the DirectShow API [45] and some DirectShow extended libraries, such as DirectShowNETCF [46]. These libraries provide low-level methods which allow capturing and previewing video frames or still images and applying effects, among several other video related features. Some methods from classes belonging to the DirectShowNETCF library are listed below [46]:

public bool init() //Builds a graph for still pictures or video

public void release() //Destroys graph (free all camera resources)

public void stop() //Stops graph


public bool run(IntPtr owner) //Runs graph and starts rendering captured video

public IntPtr grabFrame(ref long bufSize) //Returns raw frame and buffer size

public bool getRgb565(IntPtr scan0) //Grabs raw frame and builds RGB image

public bool getGrayScaleImage(IntPtr scan0, int stride) //Grabs raw frame and builds grayscale image

public void getParams(out int width, out int height, out RawFrameFormat format) //Returns width, height and format of raw frame

public bool zoomOut() //Zoom –

public bool zoomIn() //Zoom +

public bool autoFocusOff() //Turns off autofocus

public bool autoFocusOn() //Turns on autofocus

public bool flashOn() //Turns flash on

public bool flashOff() //Turns flash off

For Windows Mobile devices it is also possible to use multimedia encoders, such as JPG, PNG, TIFF and GIF for photos [47], MP3 and WMA for audio [48] and Windows Media Video (WMV) for video [49], in order to encode raw images, audio or video frames, respectively.

4.4.3.Multithreading Development Guidelines

Multithreading is the ability for two or more tasks to execute (or appear to execute) independently and concurrently, sharing resources such as memory and processor. It is typically implemented by time division multiplexing, that is, the processor switches among different threads over time.

Multithreading increases throughput and responsiveness, which are key factors in a resource constrained environment such as mobile phones [50]. Although on a single processor the actual performance of a task cannot be improved (because the Central Processing Unit (CPU) can do only one task at a time), introducing parallelism can be useful in some circumstances, such as activities that can be broken down into separate tasks that require different resources. For example, in an application in which the user has requested some data from the network, while bytes intermittently arrive, the CPU has moments of idleness which can be used to draw the user interface on the device screen.

Multithreading Development Guidelines for BlackBerry

Multithreading in BlackBerry follows some specific principles that must be applied when developing an application. The two major principles are listed below:


User Interface operations always need to run in the UI thread. For example, to display a dialog box within a different thread the UI thread must be invoked:

//Invoke the UI thread
UiApplication.getUiApplication().invokeLater(new Runnable(){
    public void run(){
        //Display a dialog box
        Dialog.alert("Message to display");
    }
});

Network operations should always run within their own thread. For example, to initiate a HTTP session a new thread must be started:

//Create and start a new thread
new Thread(){
    public void run(){
        //Open an HTTP Connection
        HttpConnection c = (HttpConnection)Connector.open("http://mobi2do.com");
    }
}.start();

Not applying these principles may cause an application to block when performing UI operations or network operations.

Multithreading Development Guidelines for Windows Mobile

Creating and starting a new thread in a Windows Mobile application can be done by adding the following code snippet:

//Create and start a new thread
Thread t = new Thread(new ThreadStart(this.SendImage));
t.Start();

The SendImage method must be defined within the application class.

If it is required to pass arguments to the thread’s method, the ThreadStart’s Delegate method may be used, as shown in the following code snippet.

//Create, start and pass arguments to a new thread

ThreadStart starter = delegate { SendImage(array); };
new Thread(starter).Start();

As before, the SendImage method must be defined within the application class.


4.4.4.Network Operations Development Guidelines

As discussed in section 3.2, the network protocols used in this dissertation to transfer data upstream are HTTP, through its Post method, and Web Services. This section provides simplified implementations of both protocols, for BlackBerry and for Windows Mobile.

Network Operations Development Guidelines for BlackBerry

The following code snippet shows how to implement a HTTP Post session in a BlackBerry application.

//Open an HTTP Connection defining the URL and the name of the file to upload
HttpConnection c = (HttpConnection)Connector.open(uploadURL+"?"+FileName);

//Set the request method and headers
c.setRequestMethod(HttpConnection.POST);

//Open an output stream
OutputStream os = c.openOutputStream();

//Write bytes to the output stream
os.write(fileBytes);

//Close the output stream
os.close();

//Close the HTTP Connection
c.close();

The HttpConnection interface defines the necessary methods and constants for establishing an HTTP connection. The connection exists in one of three states [51]:

Setup, in which the request parameters are set.

Connected, in which request parameters have been sent and the response is expected.

Closed, the final state, in which the HTTP session has terminated.

The transition from Setup to Connected is caused by any method that requires data to be sent to or received from the server. HttpConnection performs blocking input and output operations. In order to prevent an application from blocking, connections using this interface should be opened from within a new thread. The OutputStream class accepts output bytes and sends them to a sink, providing methods to write a single byte or a byte array to an output stream.

To call a Web Service in a BlackBerry application, the development environment has, first, to be set up properly [52]. After setting up the environment and knowing the Web Service’s URL, namespace and method name, it can be consumed using the following code snippet [53].

//Create a SOAP Object
SoapObject rpc = new SoapObject(serviceNamespace, methodName);

//Set its properties according to its method
rpc.addProperty("FileName", sFileName);
rpc.addProperty("Data", Base64.encode(fileBytes));

//Create and set up a SOAP envelope
SoapSerializationEnvelope envelope = new SoapSerializationEnvelope(SoapEnvelope.VER11);

//Set up the envelope according to the SOAP object
envelope.bodyOut = rpc;

//Create an HTTP Transport object
HttpTransport ht = new HttpTransport(serviceUrl);

try {
    //Call the HTTP Transport object
    ht.call(serviceNamespace + methodName, envelope);

    //Get the response
    result = (envelope.getResponse()).toString();
} catch(Exception e){
    result = e.toString();
}

Each SoapObject instance is defined by the Web Service's namespace and method name. Its properties must be set according to the method's arguments. The SoapSerializationEnvelope class provides the methods for the XML serialization. The HttpTransport class provides methods to facilitate SOAP calls over HTTP using the generic Java 2 Micro Edition (J2ME) connection framework, which is the Java platform designed for mobile phones. Instances of HttpTransport can be in one of two states: connected or not connected. When the call method is used to make an invocation, the instance is in a connected state until it returns or throws an exception.

Network Operations Development Guidelines for Windows Mobile

The procedures to implement a HTTP Post session in a Windows Mobile application are similar to the procedures described for BlackBerry. Besides the different programming languages, the main difference resides in the class that may be used to create the HTTP session: HttpConnection is used in BlackBerry applications while HttpWebRequest is used in Windows Mobile applications. In this case, the HTTP session implemented in Windows Mobile has to be closed by using the class WebResponse, pairing the HttpWebRequest used previously, as shown in the following code snippet.

//Create an HttpWebRequest object defining the URL and file name
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Uri + "?User" + FileName);

//Set the request method to Post
request.Method = "POST";

//Request an output stream
Stream os = request.GetRequestStream();

//Write bytes to the output stream
os.Write(array, 0, array.Length);

//Close the output stream
os.Close();

//Get the web response to close the HTTP request
WebResponse webResponse = request.GetResponse();

//Close the HTTP request
webResponse.Close();

To call a Web Service in a Windows Mobile application, the development environment has to be set up first as well. Hence, a web reference to the WSDL file of the desired Web Service must be added to the project references. The next step is to create an object of the WebProxy class and to implement a call to the referenced Web Service in association with the WebProxy object created, as shown in the following code snippet.

//Create an object of the Web Service class
WebReference.UploadService uploadObj = new ProjectClass.WebReference.UploadService();

//Create an object of the WebProxy class
WebProxy objProxy = new WebProxy();

//Set uploadObj proxy information
uploadObj.Proxy = objProxy;

try {
    //Call uploadObj's Upload method passing the file name and data as arguments
    string uploadStatus = uploadObj.Upload(sFileName, data);
}
catch (WebException ex) {
    string message = ex.Message;
}

Network Operations Development Guidelines for a Web Server

The previous descriptions refer only to the implementation of the network protocols on the mobile side. To establish a connection, that is, to transfer data from a mobile phone to a web server, the HTTP Post and Web Services data receiving components have to be implemented as well. This implementation can be done by developing .NET Framework web applications, such as a Web Form application for the HTTP Post and a Web Service application. These web applications can be developed and published using Microsoft Visual Studio in order to be accessed over the internet by using a URL.

One way to save to disk a file received through a HTTP Post session is using the SaveAs method [54] on a Web Form class, as the following code snippet illustrates.


namespace WebFormClass
{
    //Create a Web Form class
    public partial class WebFormClass : System.Web.UI.Page
    {
        //Define the upload directory
        String sUploadDirectory;

        //Perform actions to respond to client-side events
        protected void Page_Load(object sender, EventArgs e)
        {
            //Retrieve the file name
            String fileName = Request.QueryString.ToString();

            //Save the file
            Request.SaveAs(sUploadDirectory + fileName, false);
        }
    }
}

During the Page_Load event, a series of actions can be performed to either create an ASP.NET web page for the first time or respond to client-side events that result from an HTTP Post. The SaveAs method allows saving an HTTP request to disk, in the physical drive path defined by its first argument. The second argument specifies whether the HTTP headers should be saved to disk.

To save to disk a file received through a Web Service, it is necessary to create and set up a .NET Web Service application [55], modifying its default web method in order to receive and save a file. The following code snippet shows a simplified way to implement it.

namespace WebUploadService
{
    //Default summary description for the Web Service
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [System.ComponentModel.ToolboxItem(false)]
    public class UploadService : System.Web.Services.WebService
    {
        //Define the Web Service's method
        [WebMethod]
        public void Upload(String FileName, String Data)
        {
            //Open a file stream to save the file
            FileStream objFileStream = File.Open(uploadDirectory + FileName, FileMode.Create, FileAccess.Write);

            //Deserialize the data
            byte[] fileBytes = Convert.FromBase64String(Data);

            //Write the data to the file stream
            objFileStream.Write(fileBytes, 0, fileBytes.Length);

            //Close the file stream
            objFileStream.Close();
        }
    }
}


The Web Service's summary description, including its namespace, is automatically generated when it is created using Visual Studio. The FileStream class provides methods supporting both synchronous and asynchronous read and write operations.

The data receiving components described are mobile side independent, that is, they can be used whether connections are established from a BlackBerry application or a Windows Mobile application.


5. Sharing Multimedia from a Mobile Phone

The concepts presented in the previous chapters, such as the modality selection according to the application scenarios considered, the data transfer methods and the mobile development guidelines, provide the framework in which the applications composing the proposed mobile multimedia sharing system are developed.

This chapter presents a detailed description of the implemented system, focusing on its architecture and functional structure. The first subsection presents a general description of the developed applications, focusing on the multimedia modalities implemented and application scenarios supported, followed by a more detailed presentation of the proposed applications main components in the remaining subsections.

5.1. System Modalities and Application Scenarios

The developed system implements four different multimedia sharing solutions, based on the modality switching principles, described in chapter 2:

Video Uploading.

Photo Uploading.

Audio Uploading.

Photo Streaming.

According to Table 2.2, these four modalities support five application scenarios:

Scenario A, which corresponds to instantaneous events only with visual content, supported by photo uploading.

Scenario C, which corresponds to non-instantaneous events only with visual content and intended to be shared in real time, supported by photo streaming.

Scenario E, which corresponds to non-instantaneous events with visual and audio content and not intended to be shared in real time, supported by video uploading.

Scenario F, which corresponds to non-instantaneous events only with visual content and not intended to be shared in real time, supported by video uploading and by photo uploading.

Scenario G, which corresponds to non-instantaneous events only with audio content and not intended to be shared in real time, supported by video uploading and by audio uploading.

As mentioned in previous chapters, two development environments were used to develop the applications implementing the modalities and scenarios above: BlackBerry and Windows Mobile. BlackBerry was used to develop a multimedia uploading application – MobiShare – which implements video, photo and audio uploading, and Windows Mobile was used to develop a photo streaming application – MobiStream.


MobiShare's implementation offers the possibility to switch among the available uploading modalities, in order to choose the best set of modalities to support its application scenarios: A, E, F and G. MobiStream's implementation supports scenario C.

Figure 5.1 illustrates the relationship between the multimedia modalities implemented and the application scenarios supported in the developed applications.

Figure 5.1 – Relationship between the modalities and application scenarios in the developed applications (BlackBerry/MobiShare: video, photo and audio uploading, supporting scenarios A, E, F and G; Windows Mobile/MobiStream: photo streaming, supporting scenario C).

5.2. MobiShare: Multimedia Uploading Application

The application for capturing and sending multimedia content over the Internet, from a BlackBerry mobile phone, is described in detail in this section. The application captures photos, video clips and audio clips, uploading them to a web server as soon as recording is finished. In the web server, the photos, videos and audio clips are stored in a shared folder, whose content can be accessed by a client's web browser and played over the internet.

This section starts by presenting the application's architecture, introducing the main components developed, as well as the relationships among them. Then, a detailed description of those components is provided.

5.2.1.MobiShare Application Architecture

The architecture of MobiShare is composed by three main elements: the mobile phone, the web server and the client web browser. The main components of each element, as well as the relationships among them, are represented in Figure 5.2.


Figure 5.2 – MobiShare multimedia uploading application architecture.

Hence, in the mobile phone, the application main components are:

1. Modality Selection – Within the application home screen, the user is given the possibility to choose among the available multimedia modalities: video, photo or audio.

2. Start Camera/Microphone – According to the user’s choice, the device’s embedded video camera, photo camera or microphone is invoked and started.

3. Record and Encode – The multimedia file is recorded, encoded and locally stored using one of the following file formats: jpg, 3gp or amr, depending on whether it is a photo, a video or an audio clip, respectively. The encoding and file creation operations are done automatically when an embedded recording application is used.

4. File Access – The file is detected in the mobile device’s file system and accessed.

5. Upload – The file is uploaded to the web server, using one of the solutions described in section 3.2: HTTP Post or Web Services. Once the upload is complete, the user is notified of the successful upload and receives the URL where the file can be accessed. The application then returns to the home screen, to its initial state.

Between the mobile phone and the web server, data is transferred using one of the network protocols mentioned before, over three of the mobile networks described in section 3.1: GPRS, EDGE and Wi-Fi, since the BlackBerry device available to develop the application does not support UMTS [56]. If more than one network is available, the network selection process is made by the device's core system, with Wi-Fi, when available, being used by default for data services. Typically, the order of selection of the network used for data services is Wi-Fi as first choice, then EDGE and, as last choice, GPRS.
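
For illustration only, this preference order can be mirrored in application code using the coverage flags obtained with the CoverageInfo class (detailed in section 5.2.2); the detectDataNetwork helper below is an assumption of this sketch, not the device's actual selection mechanism.

//Illustrative helper mirroring the typical selection order for data services:
//Wi-Fi first, then EDGE and, as last choice, GPRS (hypothetical method).
private String detectDataNetwork(boolean wifi, boolean edge, boolean gprs) {
    if (wifi) {
        return "Wi-Fi"; //default network for data services when available
    } else if (edge) {
        return "EDGE";
    } else if (gprs) {
        return "GPRS";
    }
    return "none"; //no data network currently available
}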

In the web server, the file is stored in a shared folder, that is, in a folder set up to be accessed by a network process. The shared folder's content is publicly available and can be accessed, that is, read or downloaded, by a device connected to the Internet.


Finally, a client can access the desired file over the Internet and download it or play it on a compatible web browser, once he knows the file’s URL. There are several ways for a mobile user to share the file’s URL with a client, such as e-mail or SMS for example.

Details on each of the application mobile and client components are provided in the following subsections.

5.2.2.MobiShare: Mobile Phone Component

When MobiShare is started three tasks are executed:

the constructor method is executed;

a thread listening for changes in the file system is started;

a thread monitoring the network is started.

The constructor method creates the application home screen and adds to it the menu items associated with the implemented multimedia uploading modalities. The home screen object extends the MainScreen class and is composed of label field and text field objects displaying the application's name and a description of it. Figure 5.3 shows two snapshots of the application home screen.

Figure 5.3 - MobiShare home screen.
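
As an illustration of this construction, the simplified sketch below shows a home screen class extending MainScreen and adding a label field and a text field; the field contents, the use of RichTextField for the description and the omission of the menu items are assumptions of this sketch, not the exact implementation.

//Simplified home screen extending MainScreen (illustrative field contents).
class MobiShareHomeScreen extends MainScreen {
    public MobiShareHomeScreen() {
        //Display the application's name and a short description
        add(new LabelField("MobiShare"));
        add(new RichTextField("Capture and upload photos, video and audio clips."));
        //The menu items for the uploading modalities are added here (omitted).
    }
}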

The list of menu items is displayed once the user presses the menu key, in the home screen. Each menu item object belongs to the MenuItem class, which implements a Runnable interface, automatically executing its run method as a thread when selected. The implementation of a MenuItem object is shown in the following code snippet.

private MenuItem MenuItemObject = new MenuItem("Menu Item Text", 105, 10){

    public void run(){
        //Actions to execute when the menu item is selected
    }

};


In order to identify the file to be created, each menu item, when selected, displays a dialog box asking the user to input a file name, as illustrated in Figure 5.4. Each dialog box object belongs to a class extending the Dialog class.

Figure 5.4 - Dialog box asking the user to input the desired file name.

If the selected modality is video, Wi-Fi coverage is checked for, using the CoverageInfo class and its methods. This class will be detailed later, in the description of the thread executing the network monitoring.

As discussed in chapters 2 and 3, if the network available for transferring data typically has a low data rate, it might not be a suitable network to upload video files, since their large size might cause an upload time longer than acceptable.

Hence, as Wi-Fi is the only network providing acceptable data rates for uploading video, a dialog box warning the user that sharing video is not recommended pops up in case Wi-Fi is not available, as illustrated in Figure 5.5.

Figure 5.5 - Dialog box warning not to share video in areas without Wi-Fi coverage.

Finally, after the file name has been set, each menu item invokes and starts the corresponding embedded recorder application as detailed in section 4.4.2.

Snapshots of the embedded audio recorder application, photo camera application and video recorder application are shown in Figure 5.6, Figure 5.7 and Figure 5.8, respectively.


Figure 5.6 – Embedded audio recorder application.

Figure 5.7 – Embedded photo camera application.

Figure 5.8 – Embedded video camera application.

Once a recording is finished the corresponding file is automatically encoded and added to the file system, being saved using the file formats amr, jpg or 3gp, according to its multimedia type.

As mentioned before, a thread checking for new files added to the file system is started in the main method. This is accomplished by adding, in the thread's run method, an object of a new class, called MobiShareJournalListener, which extends the FileSystemJournalListener class. The FileSystemJournalListener class is used for detecting new entries in the file system journal. The file system journal is used to track changes within the file system and each change event corresponds to a new journal entry. The types of file system change events are file added, file changed or file deleted.

The MobiShareJournalListener class inherits the fileJournalChanged method, which detects when the file system journal has added a new entry. In this dissertation, the fileJournalChanged method is overridden in order to not only detect a new file system journal entry, as it does by default, but also to identify the type of the associated change event and to access its corresponding file. As a result, the following code snippet was added to the default fileJournalChanged method.

//FileSystemJournalEntry entry: new journal entry detected by the default fileJournalChanged method
//Detect the event type and the file's path
switch (entry.getEvent()) {
    case FileSystemJournalEntry.FILE_ADDED:
        //The file added is located at path
        path = entry.getPath();
        break;
    case FileSystemJournalEntry.FILE_CHANGED:
        //The file changed is located at path
        path = entry.getPath();
        break;
    case FileSystemJournalEntry.FILE_DELETED:
        //The file was deleted
        break;
}

The FileSystemJournalEntry.FILE_ADDED event is triggered when the file is first added to the file system. However, large files might not be completely stored in the file system when this happens, being not yet available. On the other hand, the FileSystemJournalEntry.FILE_CHANGED event is triggered only when the entire file is completely stored in the file system and available to be accessed and eventually uploaded, according to the dissertation purposes.

Hence, within the FILE_CHANGED case, a new thread is started to set the file access process apart from the file system journal listener process. Within this new thread, the file is read, using a FileConnection object to create a connection to the file and an InputStream object to read its bytes into a byte array variable, fileBytes. A simplified code version is shown below.

//Create a connection to the file
FileConnection fconn = (FileConnection)Connector.open("file://"+path);

//Determine the file size
int inStreamLength = (int) fconn.fileSize();

//Allocate the byte array receiving the file's bytes
byte[] fileBytes = new byte[inStreamLength];

//Open a stream for a connection to the file
InputStream inStream = fconn.openInputStream();

//Read the file's bytes into the byte array variable
inStream.read(fileBytes, 0, inStreamLength);

//Close the stream
inStream.close();

Given the fileBytes variable and the file name previously defined, the file can then be uploaded using either HTTP Post or Web Services, as described in section 4.4.4. Note that the network connection is opened within a new thread, in order to prevent the HttpConnection interface from blocking, as recommended in section 4.4.3.

While the upload is in progress, and after the embedded recorder application is closed, a new screen object containing a progress bar field is pushed onto the display stack and painted on the screen. Implementing a progress bar is recommended as a best practice in section 4.3, since the upload usually takes more than a few seconds to complete. The progress bar is an object of the GaugeField class and it is replaced by a new, updated progress bar every 10%. As discussed in section 4.4.3, the UI thread is invoked within the network connection thread to get the active screen and to replace the progress bars, as shown in the following code snippet.

//Invoke the UI thread every 10%
if (percent % 10 == 0){
    UiApplication.getUiApplication().invokeLater(new Runnable(){
        public void run(){
            //Get the active screen
            Screen msScreen2 = UiApplication.getUiApplication().getActiveScreen();

            //Instantiate a new GaugeField object to update the progress bar
            GaugeField gaugeField1 = new GaugeField("Progress: ", 0, 100, percent, GaugeField.PERCENT);

            //Replace the old progress bar
            msScreen2.replace(msScreen2.getField(1), gaugeField1);
        }
    });
}

The percent value is obtained from the number of times that the write method of OutputStream is called, since if HTTP Post is the upload method the file’s bytes are divided into smaller byte arrays and the write method is called several times. Note that this process occurs within the same HTTP session.
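
A minimal sketch of this chunked writing loop is shown below, reusing the fileBytes array and the output stream os from the previous snippets; the chunk size and the updateProgressBar helper are illustrative assumptions.

//Write the file in small chunks within the same HTTP session, updating the
//progress after each write (chunk size and helper name are assumptions).
int chunkSize = 4096;
int written = 0;
while (written < fileBytes.length) {
    int length = Math.min(chunkSize, fileBytes.length - written);
    os.write(fileBytes, written, length);
    written += length;

    int percent = (int) ((written * 100L) / fileBytes.length);
    updateProgressBar(percent); //hypothetical helper invoking the UI thread as shown above
}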

The uploading screen, with the progress bar, is illustrated in Figure 5.9.


Figure 5.9 - Uploading screen.

When the upload is complete and the network connection is closed, a new text field is added to the active screen, informing the user that the upload succeeded and sharing the file's URL, as shown in Figure 5.10.

Figure 5.10 - Upload succeeded screen.

Note that, in the last field of the URL, 2100000a is the BlackBerry PIN, an identifying hexadecimal number different for each BlackBerry device, and photo1 is the file name previously defined by the user. Adding the user's BlackBerry PIN to the file name ensures that a user's files are not replaced by files with the same name uploaded by different users.
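
For illustration, the device PIN can be obtained through the DeviceInfo class and prepended to the user-defined file name before the upload; the exact concatenation format used by the application is an assumption of this sketch.

//Prepend the device PIN (in hexadecimal) to the user-defined file name, so
//that files from different users do not overwrite each other (illustrative).
int pin = DeviceInfo.getDeviceId();
String uniqueFileName = Integer.toHexString(pin) + sFileName; //e.g. "2100000a" + "photo1.jpg"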

To share the URL easily, it can be selected and copied to be pasted in a web browser, e-mail, SMS or wherever the user wants to. Figure 5.11 shows the screen with the URL selected and the copy option.

Figure 5.11 - Screen with the URL selected.


Once the user closes the previous screen the application returns to its home screen, as shown in Figure 5.3.

Since the beginning of the application execution, a thread monitoring the available networks is running. This thread implements the isCoverageSufficient method of the CoverageInfo class, which determines whether or not the device currently has the type of coverage specified, over all supported Wireless Access Families (WAFs), as well as USB and Bluetooth. The following code snippet represents an implementation of the mentioned method, for the WAFs GPRS, EDGE and Wi-Fi:

//Determine Wi-Fi, EDGE and GPRS coverage

boolean wifi = CoverageInfo.isCoverageSufficient(CoverageInfo.COVERAGE_DIRECT, RadioInfo.WAF_WLAN, false);

boolean edge = CoverageInfo.isCoverageSufficient( CoverageInfo.COVERAGE_MDS, RadioInfo.NETWORK_SERVICE_EDGE, false);

boolean gprs = CoverageInfo.isCoverageSufficient( CoverageInfo.COVERAGE_MDS, RadioInfo.NETWORK_GPRS, false);

The network coverage is checked regularly, establishing a current state i, which is compared to the previous state i-1, that is, the network coverage state a few seconds earlier.
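
This periodic check can be sketched as a polling loop that stores the previous coverage state and compares it with the current one; the polling interval, the notifyUser helper and the reuse of the illustrative detectDataNetwork helper from section 5.2.1 are assumptions of this sketch.

//Illustrative polling loop comparing the current coverage state (state i)
//with the previous one (state i-1); interval and helper names are assumptions.
new Thread(){
    public void run(){
        String previousNetwork = null; //state i-1
        while (true) {
            boolean wifi = CoverageInfo.isCoverageSufficient(
                CoverageInfo.COVERAGE_DIRECT, RadioInfo.WAF_WLAN, false);
            boolean edge = CoverageInfo.isCoverageSufficient(
                CoverageInfo.COVERAGE_MDS, RadioInfo.NETWORK_SERVICE_EDGE, false);
            boolean gprs = CoverageInfo.isCoverageSufficient(
                CoverageInfo.COVERAGE_MDS, RadioInfo.NETWORK_GPRS, false);

            String currentNetwork = detectDataNetwork(wifi, edge, gprs); //state i

            if (previousNetwork != null && !currentNetwork.equals(previousNetwork)) {
                notifyUser(previousNetwork, currentNetwork); //hypothetical dialog helper
            }
            previousNetwork = currentNetwork;

            try {
                Thread.sleep(5000); //check again a few seconds later (illustrative interval)
            } catch (InterruptedException e) {
                return;
            }
        }
    }
}.start();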

According to an eventual coverage change for the network in use, a dialog box pops up notifying the user that there are modalities whose selection is either now supported or no longer recommended, depending on the network change verified and on the current modality in use. The modality in use influences the decision since the resource consumption and the upload times are different for each modality. For example, while audio, given its small file size, is supported in all networks, video is supported over Wi-Fi only.

Table 5.1, Table 5.2 and Table 5.3 present the content of the information displayed in the dialog boxes, according to the network change and to the modality in use: video, photo or audio, respectively.

Table 5.1 - Information displayed according to the network change, when video is the selected modality.

State i-1 → State i: information displayed
Wi-Fi → EDGE: Video sharing not recommended.
Wi-Fi → GPRS: Video and HQ photo sharing not recommended.
EDGE → GPRS: Video and HQ photo sharing not recommended.
All other transitions: no information displayed.


Table 5.2 - Information displayed according to the network change, when photo is the selected modality.

State i-1 → State i: information displayed
Wi-Fi → GPRS: HQ photo sharing not recommended.
EDGE → Wi-Fi: Video sharing is now supported.
EDGE → GPRS: HQ photo sharing not recommended.
GPRS → Wi-Fi: Video and HQ photo sharing are now supported.
GPRS → EDGE: HQ photo sharing is now supported.
All other transitions: no information displayed.

Table 5.3 - Information displayed according to the network change, when audio is the selected modality.

State i-1 → State i: information displayed
EDGE → Wi-Fi: Video sharing is now supported.
GPRS → Wi-Fi: Video and HQ photo sharing are now supported.
GPRS → EDGE: HQ photo sharing is now supported.
All other transitions: no information displayed.

The previous tables mention a new sub-modality, High-Quality (HQ) photo sharing, related to photos with maximum quality and resolution, which are characteristics that the user can set but that cannot be defined in the application development when using the embedded camera. This sub-modality is defined since HQ photos usually correspond to large files whose upload time may be excessive when GPRS is the current network in use.

As presented in Table 5.1, when video is the modality in use, the user is only notified when the network conditions decrease, that is, when the network in use in the previous state is replaced by a different network usually providing smaller data rates. Hence, when Wi-Fi is replaced by EDGE, the user is recommended not to continue sharing video and when Wi-Fi is replaced by GPRS, neither video sharing nor HQ photo sharing are recommended.

When photo is the modality in use, as shown in Table 5.2, a dialog box is launched in two cases. First, when the network in use changes to GPRS, the user is advised not to continue sharing HQ photos. Second, when the network conditions improve, that is, when the network in use in the previous state is replaced by a network typically providing higher data rates, the user is notified that HQ photo sharing is now supported and, if the new network in use is Wi-Fi, that video sharing is supported as well.


Finally, when audio is the selected modality, the user is only notified when the network conditions improve. Hence, the user is notified that video sharing is now supported if the current network in use is Wi-Fi, and is also notified that HQ photo sharing is now supported if the previous network in use was GPRS, as described in Table 5.3.

In the previous cases, the choice of selecting video sharing or HQ photo sharing is ultimately made by the user, according to the current application scenario.

5.2.3. MobiShare: Client Component

As mentioned before, a client can access the desired file being shared from the mobile device, using its URL. The client can use any device connected to the Internet to download the file, share it or play it on a compatible web browser.

Figure 5.12 shows a client accessing an audio file, audio1.amr, Figure 5.13 shows the access to a photo, photo1.jpg, and Figure 5.14 illustrates the access to a video clip, video1.3gp, captured by MobiShare, playing on a personal computer web browser.

Figure 5.12 - Audio clip captured by MobiShare playing on a computer web browser.


Figure 5.13 - Photo captured by MobiShare displayed on a computer web browser.

Figure 5.14 - Video captured by MobiShare playing on a computer web browser.


5.3. MobiStream: Photo Streaming Application

The application for streaming photos over the Internet from a Windows Mobile device is described in detail in this section. The application continuously captures and encodes frames as images, uploading them to a web server. In the web server, the images are saved in a shared folder and then accessed over the Internet by the client application, which displays them as a sequence of photos.

This section starts by presenting the application’s architecture, introducing the main components developed, as well as the relationships among them. Then, a detailed description of those components is provided.

5.3.1. MobiStream Application Architecture

The architecture of MobiStream is composed of three main elements: the mobile phone, the web server and the client. The main components of each element, as well as their relationships, are represented in Figure 5.15.

Figure 5.15 - MobiStream photo streaming application architecture.

Hence, in the mobile phone, the main software components are:

1. Create Home Screen – A panel control, a start button, an exit button and text boxes are created and a streaming ID is generated and displayed.

2. Start Camera – The camera is initialized and starts running once the user presses the start button.

3. Capture and Encode Frame – Frames are captured and encoded using the jpg file format, at a frame rate that varies according to the current mobile network in use.

4. Upload – Each image is sent to the web server, using one of the solutions described in section 3.2: HTTP Post or Web Services.

Between the mobile phone and the web server, data is transferred using one of the network protocols mentioned before, over one of the available mobile networks, as described in section 3.1. The network selection process is similar to what was described for the MobiShare application. Note that in this case UMTS is supported by the device available to develop this application [57], so typically the order of network selection for data services is Wi-Fi as first choice, then UMTS, EDGE and GPRS, as last choice.

Moreover, in this application data is also saved in a shared folder located in the web server. The image files belonging to a single streaming are all identified by the same file name, each new image replacing the previous one in the web server, which avoids overloading the web server in long streaming sessions or when the number of users is high. A minimal sketch of such a server-side handler is shown below.
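The following is only an illustrative sketch of how this overwrite behaviour could be obtained on the server side, assuming an ASP.NET generic handler, a query-string parameter named "id" carrying the streaming ID and the folder path C:\SharedFolder; it is not the exact implementation described in section 4.4.4.

using System.IO;
using System.Web;

// Minimal sketch of a server-side upload handler (illustrative only): each received
// photo is written under "<streamingID>.jpg", overwriting the previous one.
public class UploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The streaming ID sent by the mobile application becomes the file name
        string id = context.Request.QueryString["id"];
        string path = Path.Combine(@"C:\SharedFolder", id + ".jpg");

        // Read the posted JPEG bytes
        byte[] bytes = new byte[context.Request.ContentLength];
        int offset = 0, read;
        while ((read = context.Request.InputStream.Read(bytes, offset, bytes.Length - offset)) > 0)
            offset += read;

        // Overwrite any previous photo of this streaming session
        File.WriteAllBytes(path, bytes);
    }

    public bool IsReusable { get { return true; } }
}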

The client application main components are:

1. Ask for streaming ID – A form is created, containing labels describing the application and a text box asking the client to input the desired content’s streaming ID.

2. Open Web Browser – A WebBrowser object, contained within the form, is created to open the URL associated to the streaming ID.

3. Display Image – The jpg image contained in the web page associated to the URL is displayed on the web browser.

4. Refresh Web Browser – The URL’s web page is reloaded to retrieve the most recent image available in the web server.

Details on each of the mobile and client application components are provided in the following subsections.

5.3.2. MobiStream: Mobile Phone Component

When the MobiStream mobile application is started, four actions take place:

The constructor method is executed, initializing the form components.

The streaming ID, corresponding to the image’s file name, is generated.

The current network in use is detected and a SystemState object monitoring network changes is created.

A camera object is created.

The constructor method creates the following components: panel control, start button, exit button and two text boxes, defining their size and location within the application form.

The streaming ID is a four-digit random number converted to a string. As mentioned before, each image uses as file name the string identifying its corresponding streaming ID.
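As a minimal sketch of this step (the Random usage follows directly from the description above, while the textBox2 control name used to display the ID is an assumption):

// Generate a four-digit streaming ID and display it on the home screen
Random random = new Random();
string streamingID = random.Next(1000, 10000).ToString();   // e.g. "4821"
textBox2.Text = streamingID;                                 // control name is illustrative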

The SystemState object created, networkWatcher, monitors the number of network connections available. A ChangeEventHandler object associated to it is triggered whenever the number of connections available changes and the NetworkWatcher_Changed method is executed to identify the current network in use and to adapt the frame rate accordingly.

The frame rate is associated with the captureWaitPeriod variable, which sets the sleep time of the thread managing the frame capturing, encoding and sending. The following code snippet illustrates how both the network monitoring process and the setting of the captureWaitPeriod value are implemented.

// Determine the captureWaitPeriod value according to the network in use
captureWaitPeriod = NetworkCaptureWaitPeriod();   // in milliseconds

// Implement network watcher
networkWatcher = new SystemState(SystemProperty.ConnectionsCount);
networkWatcher.Changed += new ChangeEventHandler(NetworkWatcher_Changed);

// Set the captureWaitPeriod value whenever the number of connections available changes
void NetworkWatcher_Changed(object sender, ChangeEventArgs args)
{
    captureWaitPeriod = NetworkCaptureWaitPeriod();
}

// NetworkCaptureWaitPeriod adapts the frame rate, that is, captureWaitPeriod,
// to the network in use
private int NetworkCaptureWaitPeriod()
{
    // Default to the most conservative value so the variable is always assigned
    int captureWaitPeriod_ = gprsCaptureWaitPeriod;

    if (SystemState.WiFiStateConnected)
        captureWaitPeriod_ = wifiCaptureWaitPeriod;
    else if (SystemState.CellularSystemAvailableUmts)
        captureWaitPeriod_ = umtsCaptureWaitPeriod;
    else if (SystemState.CellularSystemAvailableEdge)
        captureWaitPeriod_ = edgeCaptureWaitPeriod;
    else if (SystemState.CellularSystemAvailableGprs)
        captureWaitPeriod_ = gprsCaptureWaitPeriod;

    return captureWaitPeriod_;
}

The captureWaitPeriod values for each network are defined according to the typical characteristics of those networks, such as the provided data rate. Hence, the wifiCaptureWaitPeriod value is the lowest, on the order of hundreds of milliseconds, while the gprsCaptureWaitPeriod value is the highest, on the order of thousands of milliseconds, as illustrated below.
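As an illustration only, the following declarations sketch one possible set of values, chosen within the ranges later used in the performance tests of chapter 6; the exact constants embedded in the application are not fixed here.

// Illustrative captureWaitPeriod values per network (milliseconds); these are
// assumptions for the sketch, not the exact values used in the application
private const int wifiCaptureWaitPeriod = 500;    // hundreds of milliseconds over Wi-Fi
private const int umtsCaptureWaitPeriod = 2000;   // around two seconds over UMTS
private const int edgeCaptureWaitPeriod = 3500;   // a few seconds over EDGE
private const int gprsCaptureWaitPeriod = 5000;   // thousands of milliseconds over GPRS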

The camera object created, camera, extends the AMCamera class, which belongs to the DirectShowNETCF library, whose main methods were described in section 4.4.2.


The application home screen created is presented in Figure 5.16.

Figure 5.16 - MobiStream home screen.

Once the start button is pressed, the camera is initialized and starts running, and a new thread managing the streaming is created, as shown in the following simplified code snippet.

private void Start_Click(object sender, EventArgs e)
{
    // Initialize and start the camera
    camera.init();
    // Run the camera in the panel control
    camera.run(panel1.Handle);
    // Create the thread managing the streaming
    Thread t = new Thread(new ThreadStart(this.Stream));
    // Start the thread
    t.Start();
}

The camera is initialized and started using the init and run methods of DirectShowNETCF's AMCamera class, described in section 4.4.2. The init method builds a graph for the video content captured by the device's camera and the run method runs the graph and starts rendering the captured video in the panel control passed as its argument.

Figure 5.17 shows the MobiStream home screen while the camera is running. Implementing the frame capturing, encoding and sending in a thread separate from the Start_Click method allows the Exit_Click method to be executed when the user presses the exit button.


Figure 5.17 - MobiStream home screen while camera is running.

The main actions executed in the Stream method are capturing and encoding frames using the jpg file format, at the frequency previously set in the captureWaitPeriod variable. Within the Stream method, a new method for sending the images, the SendImage method, is started as a thread, since as a network operation it should be executed in its own thread. The following code snippet presents the main actions executed in the Stream method.

private void Stream()
{
    // While loop runs until Exit_Click is executed
    while (!stop)
    {
        // Returns width, height and format of the raw frame
        cam_.getParams(out width, out height, out format_);
        // Locks the raw frame bits into system memory as a BitmapData object
        Bitmap bmp = new Bitmap(width, height);
        BitmapData data = bmp.LockBits(new Rectangle(0, 0, width, height),
            ImageLockMode.ReadWrite, PixelFormat.Format16bppRgb565);
        // Creates an RGB image from the raw frame bits
        bool RgbFlag = cam_.getRgb565(data.Scan0);
        bmp.UnlockBits(data);
        if (RgbFlag)
        {
            // Save the Bitmap to a new MemoryStream object as JPEG
            MemoryStream ms = new MemoryStream();
            bmp.Save(ms, ImageFormat.Jpeg);
            // Write the MemoryStream content to a byte array
            arrayImage = ms.ToArray();
            // Start the SendImage method as a thread, using a delegate to pass its argument
            ThreadStart starter = delegate { SendImage(arrayImage); };
            new Thread(starter).Start();
        }
        // Thread is suspended for captureWaitPeriod milliseconds
        Thread.Sleep(captureWaitPeriod);
    }
}

The Stream method continuously executes the previous actions until the exit button is pressed, which sets the stop variable to true, causing the while loop to end. The raw frame width and height are provided by the getParams method and then used as arguments of the LockBits method, which locks the raw frame bits into system memory as a BitmapData object, according to the specified pixel format. Then the getRgb565 method creates a Bitmap RGB image from the raw frame bits. Finally, the image is saved into a memory stream using the jpg file format and its bytes are passed to the SendImage method.

The main action executed in the SendImage method is sending the images to the web server, using either HTTP Post or Web Services, as described in section 4.4.4. A minimal sketch of the HTTP Post alternative is given below.
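As an illustration, the following sketches what the HTTP Post variant of SendImage could look like; the uploadUrl and streamingID fields and the "id" query-string parameter are assumptions, not the exact implementation of section 4.4.4.

// Requires: using System.IO; using System.Net;
private void SendImage(byte[] imageBytes)
{
    try
    {
        // Build an HTTP POST request carrying the raw JPEG bytes of one frame
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uploadUrl + "?id=" + streamingID);
        request.Method = "POST";
        request.ContentType = "image/jpeg";
        request.ContentLength = imageBytes.Length;

        // Write the image bytes to the request stream
        using (Stream requestStream = request.GetRequestStream())
        {
            requestStream.Write(imageBytes, 0, imageBytes.Length);
        }

        // Wait for the server response, confirming that the photo was stored
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        response.Close();
    }
    catch (WebException)
    {
        // A failed upload is simply dropped; the next captured frame replaces it on the server
    }
}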

The streaming ends when the exit button is pressed, that is, when the Exit_Click method is executed.

private void Exit_Click(object sender, EventArgs e)
{
    // Set the stop variable to true
    stop = true;
    // Stop the graph
    cam_.stop();
    // Free all camera resources
    cam_.release();
    // Close the application form
    Close();
}

This method stops the graph, releases all camera resources and exits the application.

5.3.3. MobiStream: Client Component

MobiStream’s client component is a Windows Forms application developed in Visual Studio, under the Microsoft .NET Framework. This application allows a client to watch the photo streaming on any personal computer connected to the Internet and running a Microsoft OS.


The first task executed by the application is to create a form asking the client to input the streaming ID identifying the desired streaming session. This form contains three label fields providing a description of the application, a text box to input the streaming ID and a button field to validate the input. The form is illustrated in Figure 5.18.

Figure 5.18 - Form asking for the streaming ID.

Once the button is pressed, the method WatchStreamingButton_Clicked is executed. This method rebuilds the form, replacing the previous components with an object extending the WebBrowser class. Additionally, the RefreshStream method is started as a thread within the WatchStreamingButton_Clicked method.

public void WatchStreamingButton_Clicked(object sender, EventArgs e)
{
    // Retrieve userID from the text box
    userID = textBox1.Text.ToString();
    // Replace label, button and text fields by a web browser object
    RebuildForm();
    // Start the RefreshStream method as a thread
    Thread t = new Thread(new ThreadStart(this.RefreshStream));
    t.Start();
}

Within the RefreshStream method, the web browser object loads the web page corresponding to the URL of the streaming ID using the Navigate method from the WebBrowser class.

Then the web browser object repeatedly executes its Refresh method, which reloads the currently displayed web page by checking the server for an updated version. The corresponding code snippet is shown below.


public void RefreshStream()
{
    // Load the web page corresponding to the streaming ID URL
    webBrowser1.Navigate(new Uri(URL + userID + ".jpg"));
    while (!stop)
    {
        // Suspend the thread for refreshCaptureWaitPeriod milliseconds
        Thread.Sleep(refreshCaptureWaitPeriod);
        // Reload the web page
        webBrowser1.Refresh();
    }
}

The refreshCaptureWaitPeriod is a fixed value set according to the lowest captureWaitPeriod value of the MobiStream mobile application, that is, to the captureWaitPeriod associated with the Wi-Fi network, thus having an order of magnitude of hundreds of milliseconds. Hence, the client application component has a negligible influence on the streaming frame rate, which is mainly defined by the Stream thread in the mobile application component.

Figure 5.19 shows the form containing the web browser where images are displayed.

Figure 5.19 - Form containing the web browser displaying the sequence of images.

The photo streaming stops when the stop button is pressed, that is, when the stopButton_Click method is executed and the web browser object's Stop method is called, as sketched below.
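A minimal sketch of this handler is shown next; the stopButton control name is an assumption, and the stop flag is the same variable tested by the RefreshStream loop.

// Stop the photo streaming on the client side
private void stopButton_Click(object sender, EventArgs e)
{
    stop = true;            // ends the while loop in RefreshStream
    webBrowser1.Stop();     // cancels any pending page load in the web browser
}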

An analysis of the performance of these applications, MobiShare and MobiStream, is presented in the next chapter, where key characteristics such as the upload time and the average frame period are evaluated in different scenarios.


6. Performance Analysis

This chapter presents a performance analysis of the applications described in the previous chapter. The set of tests performed and the results achieved differ between the two applications, given their different functional structures and supported scenarios.

The main characteristic to evaluate in the MobiShare application is the upload time, that is, the time interval from the beginning of sending a file until the moment when the file is completely stored in the web server and made available to be accessed by a client. In the MobiStream application, the main characteristic evaluated is the average frame period, expressed as the average time interval between two consecutive photos stored in the web server. Additionally, the streaming latency is also estimated.

6.1. MobiShare

The tests conducted to evaluate the upload time cover all the possible combinations of modalities, mobile networks and data transfer protocols considered in MobiShare. Hence, it is possible to analyze the obtained values and to compare the influence of the different alternatives on the resulting upload times.

The tests were performed using the mobile device which supported the application development: a BlackBerry Curve 8900 with the following specifications [56]:

OS version 4.6.1.133.

3.2 Megapixels camera.

Wi-Fi enabled, supporting the IEEE 802.11 b/g protocols.

GSM/GPRS/EDGE are the supported networks.

Locked to Vodafone PT.

The web server used to store the uploaded files has the following specifications:

Intel Core2 Quad CPU @ 2.84 GHz processor.

2 GB RAM memory.

Microsoft Windows Server 2008.

10 Gb/s LAN connection.

A script was developed to determine the upload time. This script, running in the web server, contains an object extending the FileSystemWatcher class, which monitors the folder in which the files are stored. Every time a new file is stored in the monitored folder, the web server current time is retrieved using the DateTime structure. Then, the time corresponding to the instant of the beginning of the upload, obtained in the mobile device and contained in the file's name, is subtracted from the web server current time, thus obtaining the file's upload time. This approach requires that the mobile device and the web server clocks are synchronized. A minimal sketch of this script is given below.
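The following sketch illustrates such a timing script, assuming that the upload start instant is encoded at the beginning of the file name as an "HHmmss" field followed by an underscore; the folder path and the file-name convention are assumptions made only to keep the example self-contained.

using System;
using System.IO;

class UploadTimeMonitor
{
    static void Main()
    {
        // Monitor the shared folder where uploaded files are stored
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\SharedFolder");
        watcher.Created += delegate(object sender, FileSystemEventArgs e)
        {
            // Instant at which the file became available in the web server
            DateTime arrival = DateTime.Now;

            // Upload start instant encoded by the mobile device in the file name (assumed format)
            string startField = Path.GetFileNameWithoutExtension(e.Name).Split('_')[0];
            DateTime start = DateTime.ParseExact(startField, "HHmmss", null);

            Console.WriteLine("{0}: upload time = {1:F0} s", e.Name, (arrival - start).TotalSeconds);
        };
        watcher.EnableRaisingEvents = true;
        Console.ReadLine();   // keep monitoring until Enter is pressed
    }
}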


As mentioned previously, the tests performed to evaluate the upload time cover all the possible combinations of modalities, mobile networks and data transfer protocols considered. Each combination was tested three times, at different time instants.

The lengths of the audio clips used in the tests represent a short audio event (30 seconds) and the typical length of a song (3 minutes and 30 seconds).

There are three possible resolutions and three possible quality degrees that can be set when taking a photo using a BlackBerry device. The resolution scale ranges from small (480 x 360) to medium (1024 x 768) and large (2048 x 1536), while the quality scale comprises normal, fine and superfine. Hence, the three photo sub-modalities used to perform the tests are defined as:

Low Quality: small resolution and normal quality.

Medium Quality: medium resolution and fine quality.

High Quality: large resolution and superfine quality.

The lengths of the video clips represent short (20 seconds) and medium (60 seconds) audiovisual events.

The subsequent tables present the results of the conducted tests, which were obtained sequentially and, presumably, under approximately stable wireless network conditions.

Table 6.1 presents the measurements of audio clip upload times performed over Wi-Fi, using both HTTP Post and Web Services, for two different audio clip lengths.

Table 6.1- Upload times of audio clips over Wi-Fi.

             | Audio clip length = 30 s          | Audio clip length = 3 min 30 s
             | File Size [kB] | Upload Time [s]  | File Size [kB] | Upload Time [s]
HTTP Post    | 47             | 4                | 327            | 18
             | 48             | 4                | 326            | 19
             | 47             | 4                | 330            | 18
Web Services | 46             | 3                | 328            | 12
             | 48             | 4                | 329            | 14
             | 48             | 4                | 329            | 14

Table 6.2 presents the measurements of audio clip upload times performed over EDGE, using both HTTP Post and Web Services, for two different audio clip lengths.


Table 6.2 - Upload times of audio clips over EDGE.

             | Audio clip length = 30 s          | Audio clip length = 3 min 30 s
             | File Size [kB] | Upload Time [s]  | File Size [kB] | Upload Time
HTTP Post    | 47             | 14               | 328            | 1 min 38 s
             | 48             | 18               | 325            | 1 min 43 s
             | 45             | 17               | 327            | 1 min 41 s
Web Services | 47             | 13               | 328            | 1 min 40 s
             | 48             | 12               | 328            | 1 min 38 s
             | 47             | 13               | 329            | 1 min 30 s

Table 6.3 presents the measurements of audio clip upload times performed over GPRS, using both HTTP Post and Web Services, for two different audio clip lengths.

Table 6.3 - Upload times of audio clips over GPRS.

             | Audio clip length = 30 s          | Audio clip length = 3 min 30 s
             | File Size [kB] | Upload Time [s]  | File Size [kB] | Upload Time
HTTP Post    | 46             | 30               | 328            | 2 min 41 s
             | 47             | 22               | 328            | 2 min 39 s
             | 47             | 29               | 328            | 2 min 39 s
Web Services | 49             | 32               | 329            | 2 min 56 s
             | 47             | 29               | 328            | 2 min 43 s
             | 48             | 33               | 325            | 2 min 49 s


Table 6.4 presents the measurements of photo upload times performed over Wi-Fi, using both HTTP Post and Web Services, for three different photo settings.

Table 6.4 - Upload times of photos over Wi-Fi.

             | Low Quality                      | Medium Quality                   | High Quality
             | File Size [kB] | Upload Time [s] | File Size [kB] | Upload Time [s] | File Size [kB] | Upload Time [s]
HTTP Post    | 15             | 2               | 56             | 4               | 392            | 21
             | 16             | 3               | 54             | 4               | 422            | 20
             | 15             | 2               | 54             | 4               | 388            | 20
Web Services | 17             | 2               | 50             | 4               | 415            | 19
             | 14             | 2               | 53             | 4               | 402            | 20
             | 15             | 2               | 55             | 4               | 407            | 20

Table 6.5 presents the measurements of photo upload times performed over EDGE, using both HTTP Post and Web Services, for three different photo settings.

Table 6.5 - Upload times of photos over EDGE.

             | Low Quality                      | Medium Quality                   | High Quality
             | File Size [kB] | Upload Time [s] | File Size [kB] | Upload Time [s] | File Size [kB] | Upload Time
HTTP Post    | 15             | 5               | 57             | 17              | 427            | 1 min 41 s
             | 15             | 5               | 50             | 17              | 413            | 1 min 13 s
             | 15             | 5               | 55             | 17              | 427            | 1 min 37 s
Web Services | 15             | 6               | 55             | 19              | 411            | 1 min 49 s
             | 14             | 5               | 58             | 18              | 380            | 1 min 10 s
             | 16             | 5               | 57             | 18              | 387            | 1 min 11 s

Table 6.6 presents the measurements of photo upload times performed over GPRS, using both HTTP Post and Web Services, for three different photo settings.


Table 6.6 - Upload times of photos over GPRS.

             | Low Quality                      | Medium Quality                   | High Quality
             | File Size [kB] | Upload Time [s] | File Size [kB] | Upload Time [s] | File Size [kB] | Upload Time
HTTP Post    | 14             | 9               | 54             | 27              | 375            | 2 min 44 s
             | 15             | 11              | 53             | 28              | 421            | 3 min 3 s
             | 17             | 12              | 60             | 25              | 391            | 2 min 50 s
Web Services | 15             | 11              | 56             | 30              | 401            | 3 min 33 s
             | 17             | 13              | 58             | 31              | 410            | 3 min 12 s
             | 15             | 11              | 55             | 30              | 412            | 3 min 29 s

Table 6.7 presents the measurements of video clip upload times performed over Wi-Fi, using both HTTP Post and Web Services, for two different video clip lengths.

Table 6.7 - Upload times of video clips over Wi-Fi.

             | Video clip length = 20 s        | Video clip length = 60 s
             | File Size [kB] | Upload Time    | File Size [kB] | Upload Time
HTTP Post    | 1634           | 1 min 29 s     | 3524           | 3 min 15 s
             | 1112           | 1 min 1 s      | 3932           | 3 min 58 s
             | 1502           | 1 min 18 s     | 3336           | 2 min 52 s
Web Services | 1203           | 1 min 20 s     | 3608           | 3 min 0 s
             | 1444           | 1 min 34 s     | 4332           | 4 min 44 s
             | 1193           | 1 min 5 s      | 3579           | 3 min 15 s

The main conclusions to be drawn from the previous tables are:

As expected, the upload time increases as the size of the file increases and as the available wireless conditions degrade, that is, when the network in use typically provides lower data rates. Note that the application performance over Wi-Fi is much better than over EDGE and GPRS. In fact, this difference in performance grows as the size of the file to be transmitted increases.

The performances of HTTP Post and Web Services are globally similar, regardless of the multimedia modalities, file sizes and mobile networks considered.

The attributes of the recorded content influence the size of the files and, consequently, the upload time, especially for high quality photos and for video. In fact, given the characteristics of the photo and video codecs used to compress the files, JPEG and MPEG-4 (the latter used in the 3gp file format), respectively, the size of the file increases as the level of detail in an image increases and as the amount of motion in a video increases.

The observed upload times are acceptable according to the requirements of the corresponding application scenarios, demonstrating the application's usefulness:

o In scenario A, supported by photo uploading, low quality photos are uploaded in approximately 11 seconds in the worst case, that is, over GPRS, while in the best case, over Wi-Fi, the average upload time is around 2 seconds, which are acceptable values for transmitting photos from a mobile device. High quality photos have upload times of around 20 seconds, over Wi-Fi, and around 3 minutes over GPRS, which may be excessive in some cases, hence the recommendation launched in MobiShare warning not to share high quality photos when GPRS is the only network available.

o In scenario E, supported by video uploading, short videos (20 seconds) are uploaded in little more than 1 minute, while longer videos (1 minute) take around 3 minutes to be uploaded, which are considered acceptable upload time values given the content is video. Note that this scenario is only recommended if Wi-Fi is available, otherwise the upload times would be excessive.

o For scenario F, supported by photo uploading or video uploading, the comments are the same as above. Video uploading may be used to support this scenario when Wi-Fi is available; if not, photo uploading, which, as seen before, allows uploading photos in approximately 10 seconds over GPRS and 5 seconds over EDGE, may be used as well.

o Considering scenario G, supported by video uploading or audio uploading: short audio events (30 seconds) are uploaded in approximately 30 seconds in the worst case (over GPRS) and in 4 seconds in the best case (over Wi-Fi), that is, in very short time intervals. Long audio events (3 minutes and 30 seconds) can be uploaded within no more than 20 seconds if Wi-Fi is available, which is very acceptable given the duration of the audio event. Over GPRS these longer audio events take a little more than 2 minutes and 30 seconds to be uploaded.

The network conditions may have a severe impact on the upload time. As mentioned in section 2.1, variations in the network traffic load, that is, in the number of users, for example, may considerably influence the wireless conditions, affecting the upload time. The tests were performed so as to minimize the influence of these factors, but due to the randomness associated with them, especially when using public communication networks, it cannot be ensured that network conditions have no influence on the results. Testing each combination three times helped to minimize this influence, allowing a more representative average value to be reached.


6.2. MobiStream

In the MobiStream application, the achieved average frame period is the main characteristic to evaluate, as mentioned previously. The average frame period is measured as the average time interval between two consecutive photos stored in the web server. The transmission average frame rate can be deduced from the average frame period, by computing its inverse. Moreover, the latency of the transmission is also estimated. The factors influencing the frame rate considered in the series of tests performed are the captureWaitPeriod variable of the mobile device, mentioned in section 5.3.2, the mobile network in use and the selected data transfer protocol.

The mobile device used to conduct the tests is an HTC Touch Pro with the following specifications [57]:

OS Windows Mobile 6.1 Professional.

3.2 Megapixels camera.

Wi-Fi enabled, supporting the IEEE 802.11 b/g protocols.

GSM/GPRS/EDGE/UMTS are the supported networks.

Locked to Vodafone PT.

The web server used to store the images is the same used to test the MobiShare application, and described in the previous subsection.

The script developed to determine the upload time, described previously for evaluating MobiShare's performance, was modified to create a text file containing the following information for each transmitted photo:

i. size, in Bytes;

ii. the upload start time instant measured in the mobile device, in minutes and seconds;

iii. the time instant when the file is stored in the web server, in minutes, seconds and milliseconds.

Based on the generated text file, another script was developed to calculate the time interval between each pair of consecutive photos stored in the web server, in milliseconds, and each photo's upload time, in seconds, as well as the respective average values. Regarding the calculation of the upload time, it is required that the mobile device and the web server clocks are synchronized. Again, the synchronization method applied cannot ensure an accuracy better than one second. A minimal sketch of this post-processing script is given below.
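The following sketch illustrates such a post-processing script; the log line layout "size;uploadStart;storedTime" and the time formats are assumptions made only to keep the example self-contained, not the exact format produced by the modified script.

using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;

class FramePeriodCalculator
{
    static void Main()
    {
        List<DateTime> stored = new List<DateTime>();
        List<double> uploadTimes = new List<double>();

        foreach (string line in File.ReadAllLines("streaming_log.txt"))
        {
            // Assumed log layout: size;uploadStart(mm:ss);storedTime(mm:ss.fff)
            string[] fields = line.Split(';');
            DateTime start = DateTime.ParseExact(fields[1], "mm:ss", CultureInfo.InvariantCulture);
            DateTime arrival = DateTime.ParseExact(fields[2], "mm:ss.fff", CultureInfo.InvariantCulture);
            stored.Add(arrival);
            uploadTimes.Add((arrival - start).TotalSeconds);
        }

        // Average frame period: mean interval between consecutive photos stored in the server
        double sumPeriods = 0;
        for (int i = 1; i < stored.Count; i++)
            sumPeriods += (stored[i] - stored[i - 1]).TotalMilliseconds;
        Console.WriteLine("Average frame period: {0:F0} ms", sumPeriods / (stored.Count - 1));

        // Average upload time, used as a latency estimate
        double sumUpload = 0;
        foreach (double t in uploadTimes) sumUpload += t;
        Console.WriteLine("Average upload time: {0:F1} s", sumUpload / uploadTimes.Count);
    }
}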

The subsequent tables present the results of the tests conducted to evaluate the performance of the MobiStream application. The results presented in each table were obtained sequentially to try to minimize the variations in the wireless network conditions.

Table 6.8 presents the measurements of average frame periods of a sequence of 200 photos, performed over Wi-Fi, using both HTTP Post and Web Services, for three different values of captureWaitPeriod: 300 ms, 400 ms and 500 ms. Figure 6.1 a), b) and c) illustrate the variation of the latency for each photo uploaded using HTTP Post and Web Services, for captureWaitPeriod = 300 ms, captureWaitPeriod = 400 ms and captureWaitPeriod = 500 ms, respectively, for the same sequence of photos.


Table 6.8 – Average frame periods of a sequence of 200 photos over Wi-Fi.

             | captureWaitPeriod = 300 ms                         | captureWaitPeriod = 400 ms                         | captureWaitPeriod = 500 ms
             | Average File Size [kB] | Average Frame Period [ms] | Average File Size [kB] | Average Frame Period [ms] | Average File Size [kB] | Average Frame Period [ms]
HTTP Post    | 7.67                   | 662                       | 7.51                   | 735                       | 7.49                   | 868
Web Services | 8.16                   | 665                       | 7.63                   | 794                       | 7.36                   | 762

Figure 6.1 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over Wi-Fi, for captureWaitPeriod values of: a) 300 ms, b) 400 ms and c) 500 ms, for a sequence of 200 photos.

[Plots a), b), c): Latency [s] (y-axis, 0-60 s) versus photo number (x-axis, 0-200), with one curve for HTTP Post and one for Web Services.]


Table 6.9 presents the measurements of average frame periods of a sequence of 100 photos, performed over UMTS, using both HTTP Post and Web Services, for two different values of captureWaitPeriod: 1500 ms and 2000 ms. Figure 6.2 a) and b) illustrate the variation of the latency for each photo uploaded using HTTP Post and Web Services, for captureWaitPeriod = 1500 ms and captureWaitPeriod = 2000 ms, respectively, for the same sequence of photos.

Table 6.9 – Average frame periods of a sequence of 100 photos over UMTS.

             | captureWaitPeriod = 1500 ms                        | captureWaitPeriod = 2000 ms
             | Average File Size [kB] | Average Frame Period [ms] | Average File Size [kB] | Average Frame Period [ms]
HTTP Post    | 7.72                   | 1703                      | 7.36                   | 2524
Web Services | 7.41                   | 1861                      | 7.28                   | 2543

Figure 6.2 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over UMTS, for captureWaitPeriod values of: a) 1500 ms and b) 2000 ms, for a sequence of 100 photos.

[Plots a), b): Latency [s] (y-axis, 0-25 s) versus Number of Frames (x-axis, 0-100), with one curve for HTTP Post and one for Web Services.]


Table 6.10 presents the measurements of average frame periods of a sequence of 75 photos, performed over EDGE, using both HTTP Post and Web Services, for two different values of captureWaitPeriod: 2500 ms and 3500 ms. Figure 6.3 a) and b) illustrate the variation of the latency for each photo uploaded using HTTP Post and Web Services, for captureWaitPeriod = 2500 ms and captureWaitPeriod = 3500 ms, respectively, for the same sequence of photos.

Table 6.10 – Average frame periods of a sequence of 75 photos over EDGE.

             | captureWaitPeriod = 2500 ms                        | captureWaitPeriod = 3500 ms
             | Average File Size [kB] | Average Frame Period [ms] | Average File Size [kB] | Average Frame Period [ms]
HTTP Post    | 7.28                   | 3144                      | 7.31                   | 3705
Web Services | 7.60                   | 3130                      | 7.30                   | 3802

Figure 6.3 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over EDGE, for captureWaitPeriod values of: a) 2500 ms and b) 3500 ms, for a sequence of 75 photos.

[Plots a), b): Latency [s] (y-axis, 0-60 s) versus Number of Frames (x-axis, 0-70), with one curve for HTTP Post and one for Web Services.]


Table 6.11 presents the measurements of average frame periods of a sequence of 50 photos, performed over GPRS, using both HTTP Post and Web Services, for two different values of captureWaitPeriod: 3500 ms and 5000 ms. Figure 6.4 a) and b) illustrate the variation of the latency for each photo uploaded using HTTP Post and Web Services, for captureWaitPeriod = 3500 ms and captureWaitPeriod = 5000 ms, respectively, for the same sequence of photos.

Table 6.11 – Average frame period of a sequence of 50 photos over GPRS.

             | captureWaitPeriod = 3500 ms                        | captureWaitPeriod = 5000 ms
             | Average File Size [kB] | Average Frame Period [ms] | Average File Size [kB] | Average Frame Period [ms]
HTTP Post    | 7.30                   | 4369                      | 7.16                   | 5224
Web Services | 7.03                   | 4037                      | 7.30                   | 5310

Figure 6.4 - Variation of the latency for each photo uploaded using both HTTP Post and Web Services, over GPRS, for captureWaitPeriod values of: a) 3500 ms and b) 5000 ms, for a sequence of 50 photos.

[Plots a), b): Latency [s] (y-axis, 0-20 s) versus Number of Frames (x-axis, 0-50), with one curve for HTTP Post and one for Web Services.]


Concerning the average frame period measured in the web server, an analysis of the results presented in the tables above shows that:

As expected, MobiStream performance over Wi-Fi is much better when compared to its performance over the remaining mobile networks, with the minimum achievable average frame period over Wi-Fi being substantially lower. The consequence is that when using Wi-Fi it is possible to work with higher frame rates, thus displaying more images per time unit.

In the best case scenario, for the tested captureWaitPeriod values, the frame rates provided are, approximately, 3 images every 2 seconds over Wi-Fi, 2 images every 3 seconds over UMTS, 1 image every 3 seconds over EDGE and 1 image every 4 seconds over GPRS. These values prove the application’s usefulness, given their suitability to the scenario C characteristics, that is, for surveillance systems for example, as described in chapter 2.

HTTP Post and Web Services provide similar average frame periods for the mobile networks and for the values of captureWaitPeriod considered.

The captureWaitPeriod values selected to perform the tests were chosen based on a series of preliminary trials, which allowed determining the minimum time interval for which the Stream thread mentioned in section 5.3.2 has to be suspended between successive frame captures. This suspension allows the network operations and the remaining tasks running in parallel to be performed correctly, that is, it prevents the application from blocking. In fact, one of the main tasks executed during the captureWaitPeriod time interval is sending a photo to the web server and getting the corresponding web server response. When this time interval is lower than a certain value, the method used to get the web server response returns null, leading to a NullReferenceException. This probably means that, given the network conditions (traffic load and data rates provided), the time interval between writing the photo bytes to the output stream and getting the response from the server is not long enough for a photo to be completely transmitted and stored in the web server. This problem is solved by increasing the captureWaitPeriod value, thus providing more time to execute the required network operations and avoiding network overload. This is the reason why captureWaitPeriod cannot be arbitrarily low and why its value increases as the conditions provided by the network in use degrade: if the available transmission data rate decreases, the time interval required to obtain the web server response increases.

The differences observed between captureWaitPeriod and the average frame period are mainly due to the time spent by the application executing the remaining tasks included in the Stream thread, such as frame capturing and encoding, which are not performed during the captureWaitPeriod interval in which the Stream thread is suspended, as shown in section 5.3.2; their execution time must therefore also be considered to obtain the average frame period measured in the web server.

As far as the latency is concerned, an analysis of the results presented in the figures above shows that:

The network conditions have a severe impact on the upload time. The wide variations observed in the latency in some of the tests conducted over UMTS, EDGE and GPRS are probably due to the fact that these are public networks, accessible to everyone; therefore, the data rates available to the MobiStream application are affected by the number of users simultaneously accessing the network and by the amount of traffic they produce, among other factors. As the Wi-Fi network used to perform the tests is private, providing a more controlled environment, the randomness associated with it is much smaller and, consequently, its results do not show the wide variations observed in the tests conducted over the remaining networks.


The measurements of the latency for each photo uploaded over EDGE for captureWaitPeriod = 2500 ms, presented in Figure 6.3 a), show an increase over time using both HTTP Post and Web Services. This phenomenon may indicate that the chosen captureWaitPeriod value is too low, raising the possibility of overloading the network, which may cause a null response from the method used to get the response from the web server, as described previously.

In general, HTTP Post presents better performance than Web Services. The reason for this difference between HTTP Post and Web Services is probably the small size of the photos uploaded, which benefits HTTP Post over Web Services. In fact, as discussed in section 3.2.2, the additional header information introduced by the XML serialization in Web Services has a major impact on smaller files, increasing the upload time and, therefore, the latency.

It is observed that the delay of the first photos of the sequence is higher for Web Services when compared to HTTP Post. As mentioned in a related Microsoft technical article regarding calling Web Services in a Windows Mobile application [58], this initial behavior occurs because the .NET Compact Framework runtime is setting up and caching the connection details for the Web Service. The referred article also mentions that subsequent calls to the Web Service show greatly improved performance, which is confirmed by the results obtained. As HTTP Post does not require this type of initial setup, its initial delays are lower and remain approximately constant over time. Three further conclusions can be drawn from here:

o HTTP Post presents lower jitter, on average, since its delay variations are smaller over time.

o HTTP Post provides a better real-time experience, since the latency of the first photos of the sequence has a major influence on the time delay observed between the beginning of the photo capturing and the moment the sequence of photos starts being displayed on the client application. Note that, to obtain the total latency of the system, the latency associated with the photo transmission between the web server and the client application must be added to the previous latency.

o Given the application scenario supported by MobiStream, scenario C, which aims for real-time transmission, that is, for an initial latency as low as possible, HTTP Post should be the data transfer protocol implemented in the application, since it is the alternative providing the better real-time experience.

In the figures presented, it is observed that the total number of photos depicted in the curves does not match the total number of photos in the sequence. For example, in Figure 6.4 b), the red curve only represents around 45 photos, while the sequence contains 50 photos. The reason is that some photos arrived with a considerable delay and were not shown to the user, being replaced by newly arrived photos instead.

Note that the characteristics of the receiving server, as a photo streaming receptor, may also have some influence on the application performance, notably if it is under heavy traffic load and therefore operating with constrained data rates.

The selection of several captureWaitPeriod values to perform the tests allows comparing and analyzing the application performance under different network conditions within the same mobile network. In order to provide a more reliable application implementation, able to run under a wide range of conditions, high captureWaitPeriod values should be implemented in the application.


7. Conclusions

The exciting world of mobile applications is flourishing and becoming increasingly important every day. Mobile phone manufacturers are struggling to launch more powerful and more feature-rich devices, creating a “war” which benefits, firstly, the application developers, who see an increased number of tools available and new possibilities arising, and ultimately, the consumer, that is, the final client, who can enjoy and take advantage of the new functionalities provided.

The work developed in this dissertation made it possible to explore the mobile applications world from the developer perspective, perceiving some of its current technological constraints and capabilities.

Moreover, the importance of the usage environment analysis for multimedia mobile applications has also been explored, by managing the application execution according to the usage conditions and by recommending that the user switch among modalities, whenever appropriate, in order to provide a better multimedia experience.

7.1. Conclusion

One of the main contributions of this dissertation is making available a set of guidelines and code samples for the development of mobile multimedia sharing applications, regarding topics such as network operations, multithreading and multimedia. The information available is quite dispersed and often incomplete, since the documentation provided by the manufacturers does not cover all the existing functionalities that can be implemented, such as Web Services, or has some details missing, so developer community forums and other web sites may have to be used as sources of information. Hence, the work done to gather, organize and cluster that information is made available in chapter 4, easing its access for future works.

This dissertation also proposes a classification of application scenarios for mobile multimedia sharing applications. Seven different scenarios are proposed, according to the duration of the event to share, its visual content, its audio content and the possibility of real-time transmission.

The developed mobile multimedia sharing system is composed of two applications, MobiShare, for BlackBerry OS mobile phones, and MobiStream, for Windows Mobile OS mobile phones, covering five of the identified application scenarios. Four multimedia sharing modalities, supporting the application scenarios, are implemented: video uploading, photo uploading and audio uploading, provided by MobiShare, and photo streaming, provided by MobiStream.

One of the application scenarios covered, scenario A, is related to sharing a single image with someone. The importance of supporting this scenario is unquestionable, given the wide range of real life situations where this type of photo sharing is suitable, such as sightseeing, recording a friends' gathering or just showing a certain situation or object to someone.

Another application scenario supported, scenario C, is related to the real-time transmission of visual-only events. An example application of this scenario is in surveillance systems: the proposed system can be used inside buildings as an alternative to surveillance cameras, benefiting from Wi-Fi networks that may exist. Additionally, given its ability to run over mobile networks such as GPRS, EDGE and UMTS, whose coverage is nearly global, it can also be used for monitoring forests, to prevent forest fires, for monitoring ski resorts, and for monitoring beaches, to provide information about the sea conditions for surfers and to help lifeguards in overseeing the safety of beach users, for example.

Scenario E, also supported by the developed system, is related to sharing short video clips, thereby covering a wide range of situations, like scenario A. This scenario is of great importance, given the possibility of posting the videos on web sites and social networks such as Facebook and YouTube as soon as they are captured and uploaded.

The remaining application scenarios supported are scenario F, which is suitable for sharing a slideshow, for example, and scenario G, which can be very useful for journalists since it is the proper scenario for sharing audio interviews, speeches, press conferences and so on. It can also be used to share music tracks or any kind of short audio events.

Moreover, the work developed in this dissertation allowed understanding the current capabilities and constraints of the existing mobile technology, especially regarding multimedia features.

As mentioned before, BlackBerry was the development environment chosen to develop the dissertation’s multimedia sharing system, but due to limitations in the available BlackBerry APIs, Windows Mobile has also been used.

In fact, for the OS version available to develop this work, OS 4.6.1, the access to the photo camera and to the video camera is very limited, since:

The BlackBerry Multimedia API set for OS 4.6.1 is very limited in supporting programmatic access to record still images. Several issues regarding methods to manage the photo camera have been reported [59] [60].

When using the device’s embedded camera, the frequency at which it is possible to record a still image is limited, since the screen is redrawn every time a snapshot is taken.

The BlackBerry Multimedia API set for OS 4.6.1 does not support programmatic access to record video [61]. This functionality only became available with the recently launched OS 5.0 [62].

The BlackBerry Multimedia API does not allow recording video to a stream, for any of the available OSs [63].

BlackBerry does not provide libraries to implement streaming protocols, such as RTSP for example.

According to some RIM Spain associates, the reason some APIs allowing low-level access to the device's camera are not publicly available is probably RIM's policy of releasing some of their APIs only to organizations belonging to their partnership alliance program, given RIM's special concerns about the security and stability of their phones.

Compared to BlackBerry, Windows Mobile proved to have lower-level, flexible and well-established APIs. However, a few constraints related to the video camera have also been found in this development environment, for the OS version available, OS 6.1:

The Windows Mobile Multimedia API supports video capture and encoding in the Windows Media Video format [64]. However, there are no available APIs to implement it directly: a library implementing a filter graph to control the camera within a DMO (DirectX Media Object) wrapper filter has to be developed [65].

Windows Mobile does not provide libraries to implement streaming protocols.

These types of constraints are common to other development environments, such as Google's Android or Apple's iPhone, showing that this is still a technology under development, with a lot to improve and explore, both for the manufacturers and for the application developers.

7.2. Future Work

Future work directions include the development of applications implementing the remaining modalities, especially video streaming, to support all of the application scenarios proposed. Developing video streaming applications for BlackBerry requires creating libraries that work around the previously identified restrictions or negotiating the availability of the protected APIs with RIM. A possible solution to develop a video streaming application for Windows Mobile may be creating the library for capturing and encoding video in the WMV format described previously and then using or creating a network protocol enabling the transfer of the encoded video in real time.

Regarding the photo streaming application, a system providing a constant frame rate to the final client can be developed. A possible solution is for the client application to provide a synchronous photo display, removing the jitter. This is typically done by combining the following three mechanisms [66] (a minimal sketch of such a playout buffer follows the list):

Prefacing each photo with a sequence number: the mobile application increments the sequence number by one for each of the photos generated.

Prefacing each photo with a timestamp: the mobile application stamps each photo with the time at which the photo is generated.

Delaying the display of photos at the client application: the display delay of the photos must be long enough so that most of the photos are received before their scheduled display time. This delay can be either fixed or may vary adaptively during the streaming session. Photos that do not arrive before their scheduled display times are considered lost and forgotten, and some form of image processing may be used to attempt to conceal that loss.
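As an illustration only, the following sketch shows how these three mechanisms could be combined in a simple fixed-delay playout buffer; the Photo type, the two-second display delay and the assumption of synchronized mobile and client clocks are illustrative choices, not part of the developed applications.

using System;
using System.Collections.Generic;

class Photo
{
    public int SequenceNumber;   // incremented by one by the mobile application for each photo
    public DateTime Timestamp;   // capture instant stamped by the mobile application
    public byte[] JpegBytes;     // encoded image data
}

class PlayoutBuffer
{
    private readonly SortedDictionary<int, Photo> buffer = new SortedDictionary<int, Photo>();
    private readonly TimeSpan displayDelay = TimeSpan.FromSeconds(2);   // fixed playout delay (assumption)
    private int nextSequence = 1;

    // Called whenever a photo arrives from the web server
    public void PhotoArrived(Photo photo)
    {
        // Photos arriving after their scheduled display time are considered lost and forgotten
        if (DateTime.UtcNow - photo.Timestamp > displayDelay) return;
        if (photo.SequenceNumber >= nextSequence)
            buffer[photo.SequenceNumber] = photo;
    }

    // Called periodically by a display timer; 'display' renders a photo on screen
    public void DisplayDue(Action<Photo> display)
    {
        while (buffer.Count > 0)
        {
            // Lowest buffered sequence number (SortedDictionary keys are kept in ascending order)
            int firstKey = -1;
            foreach (int key in buffer.Keys) { firstKey = key; break; }

            Photo photo = buffer[firstKey];
            if (DateTime.UtcNow - photo.Timestamp < displayDelay)
                break;                      // not due yet; keep buffering

            buffer.Remove(firstKey);
            display(photo);                 // sequence numbers skipped over are treated as lost
            nextSequence = firstKey + 1;
        }
    }
}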

Alternatively, a more sophisticated solution can be developed to broadcast the photo streaming to several clients, without running a special client application. A possible solution may be using the tools provided by the Windows Media Encoder framework [67] to convert the sequence of images into a video format that can be broadcast live over the Internet and watched using a compatible web browser or a common multimedia player such as Windows Media Player.

Given the social impact and the commercial potential shown by social networks, an interesting future extension of this work may also reside in using these websites' available APIs to integrate them within the developed applications. For example, Facebook APIs may be explored to enable uploading videos, photos or audio clips directly to it.

Developing mobile multimedia sharing applications for additional mobile operating systems may also be of interest.


References

[1] L. Correia, "Introduction," in Mobile Communications Systems - Presentations from Lectures, 2009, ch. 1.

[2] J. Akkanen, O. Karonen, and J. Porio, "Peer-to-Peer Video Streaming on Mobile Phones," in 5th IEEE Consumer Communications and Networking Conference, Las Vegas, USA, 2008, pp. 1253-1254.

[3] S. Davies, S. Gardner, and D. Jones, "Quantitative Experiences in Bidirectional Direct Mobile to Mobile Audio & Video Streaming," in International Conference on Information and Communication Technology, Cairo, Egypt, 2007.

[4] S. Davies and S. Gardner, "Empirical Experiences in Mobile-to-Mobile Video Streaming," in Consumer Communications and Networking Conference, Las Vegas, USA, 2008, pp. 442-446.

[5] J. Faichney and R. Gonzalez, "Video Coding for Mobile Handheld Conferencing," Multimedia Tools and Applications, vol. 13, no. 2, p. 165–176, Feb. 2001.

[6] K. Yu, J. Lv, J. Li, and S. Li, "Practical real-time video codec for mobile devices," Proceedings of the International Conference on Multimedia and Expo, vol. 3, pp. 509-512, Jul. 2003.

[7] F. Ciaramello and S. Hemami, "Real-Time Face and Hand Detection for Videoconferencing on a Mobile Device," in Fourth International Workshop on Video Processing and Quality Metrics for Consumer Electronics, Scottsdale, USA, 2009.

[8] M. Lundan and I. Curcio, "3GPP streaming over GPRS Rel '97," in International Conference on Computer Communications and Networks, Tampere, Finland, 2003, pp. 101-106.

[9] M. Lundan and I. Curcio, "Mobile Streaming Services in WCDMA Networks," in IEEE Symposium on Computers and Communications, Cartagena, Spain, 2005, pp. 231-236.

[10] A. Basso, B. J. Kim, and Z. Jiang, "Performance evaluation of MPEG-4 video over realistic EDGE wireless networks," in The 5th International Symposium on Wireless Personal Multimedia Communications, Honolulu, Hawaii, USA, 2002, pp. 1118-1122.

[11] A. Kyriakidou, N. Karelos, and A. Delis, "Video-streaming for Fast Moving Users in 3G Mobile Networks," in International Workshop on Data Engineering for Wireless and Mobile Access, Baltimore, USA, 2005, pp. 65-72.

[12] M. Walker, M. Nilsson, T. Jebb, and R. Turnbull, "Mobile Video-Streaming," BT Technology Journal, vol. 21, no. 3, pp. 192-202, Jul. 2003.

[13] R. Weber, M. Guerra, S. Sawhney, L. Golovanevsky, and M. Kang, "Measurement and Analysis of Video Streaming Performance in Live UMTS Networks," in 9th International Symposium on Wireless Personal Multimedia Communications, San Diego, USA, 2006.

[14] I. Haratcherev, J. Taal, K. Langendoen, R. Lagendijk, and H. Sips, "Fast 802.11 link adaptation for real-time video streaming by cross-layer signaling," in IEEE International Symposium on Circuits and Systems, ISCAS, Kobe, Japan, 2005, pp. 3523-3526.

[15] TeliaSonera MediaLab, "Streaming in Mobile Networks," TeliaSonera MediaLab WhitePaper, August 2004.

[16] BlackBerry Application Store. Photo & Video - Sharing. [Online]. http://appworld.blackberry.com/webstore/category/47?recordsPerPage=100

[17] R. Trindade, P. Correia, and M. Santos, "Multimedia Sharing over the Internet from a Mobile Phone," in 3rd International Workshop on Future Multimedia Networking, Krakow, Poland, 2010.

[18] F. Pereira, "Content and Context: Two Worlds to Bridge," in Fourth International Workshop on Content-Based Multimedia Indexing (CBMI), Riga, Latvia, 2005.

[19] F. Pereira, "Multimedia Content Adaptation: May one fit all?," Computational Imaging and Vision, vol. 32, pp. 337-342, Mar. 2006.

[20] T. C. Thang, Y. J. Jung, and Y. M. Ro, "Modality Conversion for QoS Management in Universal Multimedia Access," IEE Proceedings - Vision, Image and Signal Processing, vol. 152, no. 3, pp. 374-384, Jun. 2005.

[21] A. Vetro, "MPEG-21 Digital Item Adaptation: Enabling Universal Multimedia Access," IEEE Multimedia, vol. 11, no. 1, pp. 84-87, Jan. 2004.

[22] L. Correia, "Cellular Networks," in Mobile Communications Systems - Presentations from Lectures, 2009, ch. 6.


[23] L. Correia, "Mobility and Traffic," in Mobile Communications System - Presentations from Lectures, 2009, ch. 9.

[24] D. Molkdar, W. Featherstone, and S. Larnbotharan, "An overview of EGPRS: the packet data component of EDGE," Electronics & Communication Engineering Journal, vol. 14, no. 1, pp. 21-38, Feb. 2002.

[25] ITU. Evolution to IMT-SC (EDGE). [Online]. http://www.itu.int/ITU-D/imt-2000/DocumentsIMT2000/IMTSC_Rev1.pdf

[26] Rysavy Research, "EDGE, HSPA, LTE: The Mobile Broadband Advantage," 3G Americas, 2007.

[27] L. Correia, "Radio Interface," in Mobile Communications Systems - Presentations from Lectures, 2009, ch. 7.

[28] 3GPP. HSPA. [Online]. http://www.3gpp.org/HSPA

[29] Wi-Fi Alliance. (2010) Discover and Learn. [Online]. http://www.wi-fi.org/discover_and_learn.php

[30] J. Sobrinho, P. Correia, and J. Sanguino, "Data Link Layer," in Computer Networks - Presentations from Lectures, 2008.

[31] BlackBerry Developer Docs. Socket Connection Interface . [Online]. http://www.blackberry.com/developers/docs/4.6.0api/index.html

[32] Windows Mobile Development Center. (2010, Apr.) Socket Functions. [Online]. http://msdn.microsoft.com/en-us/library/aa925696.aspx

[33] BlackBerry Developer Docs. Java.rmi Package. [Online]. http://www.blackberry.com/developers/docs/4.6.0api/index.html

[34] Windows Mobile Development Center. (2003, Dec.) Fundamentals of Microsoft .NET Compact Framework Development for the Microsoft .NET Framework Developer. [Online]. http://msdn.microsoft.com/en-us/library/aa446549.aspx

[35] T. Berners-Lee. Hypertext Transfer Protocol -- HTTP/1.1. [Online]. http://tools.ietf.org/html/rfc2616

[36] M. A. Holliday, J. T. Houston, and E. M. Jones, "From Sockets and RMI to Web Services," ACM SIGCSE Bulletin, vol. 40, no. 1, pp. 236-240, Mar. 2008.

[37] H. Voorman. Webservices.png. [Online]. http://en.wikipedia.org/wiki/File:Webservices.png

[38] M. Juric, B. Kezmah, and M. Hericko, "Java RMI, RMI Tunneling and Web Services Comparison and Performance Analysis," ACM SIGPLAN Notices, vol. 39, no. 5, pp. 58-65, May 2004.

[39] M. Juric, I. Rozman, and B. Brumen, "Comparison of performance of Web Services, WS-Security, RMI and RMI-SSL," The Journal of Systems and Software, vol. 79, no. 5, pp. 689-700, May 2006.

[40] N. A. B. Gray, "Comparison of Web Services, Java-RMI and CORBA service implementations," in Fifth Australasian Workshop on Software and System Architectures, Melbourne, Australia, 2004, pp. 52-63.

[41] RIM. (2008, May) Introduction to BlackBerry Java Development. Video.

[42] Windows Mobile Development Center. (2009, Sep.) Windows Mobile. [Online]. http://msdn.microsoft.com/en-us/library/bb847935.aspx

[43] RIM. (2009, Nov.) UI Guidelines - BlackBerry Smartphones. [Online]. http://docs.blackberry.com/en/developers/deliverables/6622/BlackBerry_Smartphones-US.pdf

[44] Windows Mobile Development Center. (2008, Aug.) Best Practices for Windows Mobile Application Compatibility. [Online]. http://msdn.microsoft.com/en-us/library/bb677125.aspx

[45] Windows Mobile Developer Center. (2010, Apr.) DirectShow. [Online]. http://msdn.microsoft.com/en-us/library/aa930379.aspx

[46] A. Mogurenko. DirectshowNETCF. [Online]. http://alexmogurenko.com/blog/directshownetcf/

[47] Windows Mobile Development Center. Supported Image Formats. [Online]. http://msdn.microsoft.com/en-us/library/cc907931.aspx

[48] Windows Mobile Development Center. Supported Audio Formats. [Online]. http://msdn.microsoft.com/en-us/library/cc907934.aspx

[49] Windows Mobile Development Center. Supported Video Formats. [Online]. http://msdn.microsoft.com/en-us/library/cc907935.aspx

[50] A. Wigley, D. Moth, and P. Foot, "Threading," in Microsoft Mobile Development Handbook. Microsoft Press, 2007, ch. 11.

[51] RIM. Interface HttpConnection . [Online]. http://www.blackberry.com/developers/docs/4.6.0api/index.html

[52] C. A. Green. BlackBerry and .NET Web Service Tutorial - Part 1. [Online]. http://www.craigagreen.com/index.php?/Blog/blackberry-and-net-webservice-tutorial-part-1.html

[53] C. A. Green. BlackBerry and .NET Web Service Tutorial - Part 2. [Online]. http://craigagreen.com/index.php?/Blog/blackberry-and-net-web-service-tutorial-part-2.html

[54] .NET Framework Development Center. HttpRequest.SaveAs method. [Online]. http://msdn.microsoft.com/en-us/library/system.web.httprequest.saveas(VS.85).aspx

[55] Microsoft Developer Network. Walkthrough: Creating a Web Service Using Visual Basic or Visual C#. [Online]. http://msdn.microsoft.com/en-us/library/87h5xz7x.aspx

[56] RIM. BlackBerry Curve 8900 specifications. [Online]. http://na.blackberry.com/eng/devices/blackberrycurve8900/curve_specifications.jsp

[57] HTC. HTC Touch Pro Specification. [Online]. http://www.htc.com/www/product/touchpro/overview.html

[58] Microsoft. (2003, Mar.) Consuming Web Services with the Microsoft .NET Compact Framework. [Online]. http://msdn.microsoft.com/en-us/library/aa446547.aspx

[59] RIM. (2008, Dec.) BlackBerry Java Development Environment Version 4.7.0 - Known Issues. [Online]. http://docs.blackberry.com/en/developers/deliverables/5818/jde_4_7_0_known_issues_RTM.pdf

[60] RIM. (2010, Apr.) BlackBerry Java Application Version 5.0 - Release Notes - Fixed Issues. [Online]. http://docs.blackberry.com/en/developers/deliverables/15396/FI_sample_apps_1027378_11.jsp

[61] RIM. (2008, Dec.) BlackBerry Java Development Environment Version 4.6.1 - Multimedia Development Guide. [Online]. http://docs.blackberry.com/en/developers/deliverables/5784/Multimedia_guide.pdf

[62] RIM. (2010, Apr.) BlackBerry Java Application Version 5.0 - Multimedia Development Guide. [Online]. http://docs.blackberry.com/en/developers/deliverables/11942/Record_video_without_using_the_camera_app_734824_11.jsp

[63] RIM. (2009) BlackBerry Java Application Version 5.0 - Release Notes - Known Issues - SDR324117. [Online]. http://na.blackberry.com/eng/developers/devbetasoftware/Release_Notes_Beta.pdf

[64] Microsoft. Windows Mobile Development Center - Supported Video Formats. [Online]. http://msdn.microsoft.com/en-us/library/cc907935.aspx

[65] Microsoft. Windows Mobile Development Center - Supported DirectShow filters. [Online]. http://msdn.microsoft.com/en-us/library/aa930380.aspx

[66] J. F. Kurose and K. W. Ross, Computer Networking: A Top-Down Approach Featuring the Internet. USA: Addison-Wesley, 2004.

[67] Microsoft. Windows Media Encoder Features. [Online]. http://www.microsoft.com/windows/windowsmedia/forpros/encoder/features.aspx