Results from the User Survey
Tobias Hossfeld
WG2 TF „Crowdsourcing“
https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd


Summary

• Apps of interest (in decreasing order)
  – Adaptive streaming, 2D video, VoIP, images, web browsing

• Interests and contributions by VIPs
  – High interest: design of test, statistical analysis
  – Very few VIPs: implementation and execution

  → Time concerns by VIPs; only limited resources available for doing tests
  → Focus on existing (lab and crowdsourcing) data sets
  → Discussion in phone conference (see Doodle link)

• Crowdsourcing data available / VIPs available for all steps (test design, implementation, execution, analysis)
  – Web browsing: data available (Martin, Lea, Toni, Tobias)
  – VoIP and images: VIPs available for all steps

• Lab results available / VIPs available
  – Available: images, 2D video
  – VoIP: will be executed
  – Web browsing: only implementation missing


Which application? Your Contribution?


[Survey questions, shown alongside the crowdsourcing and laboratory result charts:]
• Application: Which application do you prefer for the JOC?
• Crowdsourcing: How will you contribute to the crowdsourcing experiment?
• Laboratory: How will you contribute to the lab experiment?


Detailed View: Contributions

• Of interest and contributions
  – Images, web browsing, VoIP, adaptive streaming, 2D video

• Out of scope, too many problems
  – File storage, radio streaming, other


Crowdsourcing         | 2D video | Adaptive streaming | VoIP | Web browsing | Images | Radio Streaming | File Storage | Other | Sum
Design of test        | 5        | 4                  | 3    | 2            | 2      | 2               | 2            | 1     | 21
Implementation        | 0        | 0                  | 1    | 0            | 1      | 0               | 0            | 1     | 3
Execution             | 1        | 2                  | 1    | 2            | 1      | 0               | 1            | 0     | 8
Statistical Analysis  | 2        | 2                  | 2    | 2            | 2      | 2               | 2            | 2     | 16
Sum per app           | 8        | 8                  | 7    | 6            | 6      | 4               | 5            | 4     |

Laboratory            | 2D video | Adaptive streaming | VoIP | Web browsing | Images | Radio Streaming | File Storage | Other | Sum
Design of test        | 5        | 4                  | 4    | 2            | 2      | 2               | 2            | 0     | 21
Implementation        | 0        | 0                  | 0    | 0            | 0      | 0               | 0            | 0     | 0
Execution             | 1        | 1                  | 0    | 1            | 1      | 0               | 0            | 0     | 4
Statistical Analysis  | 1        | 2                  | 2    | 2            | 2      | 2               | 2            | 2     | 15
Sum per app           | 7        | 7                  | 6    | 5            | 5      | 4               | 4            | 2     |

[Bar charts over the applications above (2D video, adaptive streaming, VoIP, web browsing, images, radio streaming, file storage, other): number of contributors (crowd vs. lab), weighted contributions (normalized to 0–1), and potential problems (normalized to 0–1).]
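As a cross-check, the row and column sums reported in the crowdsourcing contribution table can be recomputed from the per-step counts. This is a minimal sketch with hypothetical variable names; only the numbers are taken from the slide:

```python
# Contributions per task step, transcribed from the crowdsourcing table.
# Column order: 2D video, adaptive streaming, VoIP, web browsing,
# images, radio streaming, file storage, other.
apps = ["2D video", "Adaptive streaming", "VoIP", "Web browsing",
        "Images", "Radio Streaming", "File Storage", "Other"]
crowd = {
    "Design of test":       [5, 4, 3, 2, 2, 2, 2, 1],
    "Implementation":       [0, 0, 1, 0, 1, 0, 0, 1],
    "Execution":            [1, 2, 1, 2, 1, 0, 1, 0],
    "Statistical Analysis": [2, 2, 2, 2, 2, 2, 2, 2],
}

# Row sums (total contributors per step) and column sums (per application).
row_sums = {step: sum(counts) for step, counts in crowd.items()}
col_sums = [sum(col) for col in zip(*crowd.values())]

print(row_sums)                   # Design of test -> 21, ..., Statistical Analysis -> 16
print(dict(zip(apps, col_sums)))  # 2D video -> 8, Adaptive streaming -> 8, ...
```

The recomputed "Sum" column and "Sum per app" row match the slide, so the table is internally consistent; the same check works for the laboratory table.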


Research Questions

• Develop and apply methodology

• Derive QoE model for selected app

• Analyze impact of crowdsourcing environment

• Provide a database with crowdsourcing results

• Do results obtained via crowdsourcing platforms differ from results of a test using a dedicated panel, and in which sense? What does this imply for QoE assessment and the tools we (can) use?

• Do results using crowdsourcing differ from results from controlled lab experiments (and in a next step possibly even more realistic home environments)?


Questions                                      | 2D video | Adaptive streaming | VoIP | Web browsing | Images | Radio Streaming | File Storage | Sum
Develop and apply methodology                  | 2        | 2                  | 1    | 0            | 0      | 0               | 0            | 5
Derive new QoE model for selected app          | 2        | 3                  | 0    | 1            | 2      | 2               | 2            | 12
Analyze impact of crowdsourcing environment    | 3        | 2                  | 2    | 2            | 2      | 1               | 1            | 13
Providing database with crowdsourcing results  | 0        | 0                  | 0    | 1            | 0      | 0               | 0            | 1
Sum per app                                    | 7        | 7                  | 3    | 4            | 4      | 3               | 3            |


Individual comments

• Contributions
  – We are currently developing 2 applications of possible interest: one is a VoIP client within WebRTC, and the other is an inter-media synchronization application similar to HbbTV (broadcast/broadband TV), which we also hope to deploy on a WebRTC platform. Both are still at the development stage, so perhaps I am being a bit optimistic!
  – I can do data analysis for the first two options as well.
  – The chosen app, and the link to ongoing activities, will determine how much I can be involved. Also, depending on the app, I could link up to the iMinds panel.
• Problems
  – Heterogeneous, possibly time-variant users' connections
  – I am a complete novice with everything related to the implementation, but I see some methodological challenges related to the cross-device use (and how this links up to QoE) of, e.g., personal cloud storage apps and adaptive video streaming.
  – No time


Next Steps

1. Summary via mailing list / wiki
  – Your interests
  – Your contributions
2. Collective decision within TF
  – Collect info from all TF participants
  – Google survey form
3. Online meeting
  – Decision on concrete application, platform, research questions
  – Allocation of work for VIPs
  – Rough time schedule
4. Time plan
  – 15/03/2013: summary
  – 22/03/2013: Google survey sent around
  – 31/03/2013: TF fills in survey
  – Mid-April: online meeting


Summary from Breakout Session


Contributions by Participants

• Design of user test
  – Source contents for tests (video, images): Marcus Barkowsky
  – Test design: Lucjan Janowski, Katrien de Moor, Miguel Rios-Quintero
• Implementation of test
  – Lab test for image quality: Judith Redi, Filippo Mazza
  – Lab test for VoIP: Christian Hoene
  – Online test for VoIP: Christian Hoene
  – Crowdsourcing test for images/video: Christian Keimel
  – Crowdsourcing test for HTTP video streaming: Andreas Sackl, Michael Seufert, Tobias Hossfeld
  – Crowdsourcing platform with screen quality measurements: Bruno Gardlo
  – Crowdsourcing micro-task platform: Babak Naderi, Tim Polzehl
• Execution of test
  – Crowdsourcing: Tobias Hossfeld
  – Online panel: Katrien de Moor
  – Lab test for image quality: Judith Redi, Filippo Mazza
  – Lab test for VoIP: Christian Hoene
  – Crowdsourcing test for images/video: Christian Keimel
  – Crowdsourcing test for HTTP video streaming: Andreas Sackl, Michael Seufert, Tobias Hossfeld
• Data analysis
  – Identification of key influence factors and modeling: Tobias Hossfeld, Judith Redi
  – Comparison between crowdsourcing and lab: Tobias Hossfeld, Marcus Barkowsky, Katrien de Moor, Martin Varela, Lea Skorin-Kapov
  – Model validation: Marcus Barkowsky


Summary of Interests


Application / Topic      | VIPs                                                                                                                  | Methodology                                         | QoE model                                                        | Crowd impact
Web browsing             | Martin Varela, Lea Skorin-Kapov, Tobias Hossfeld                                                                      |                                                     | Visual appeal, loading times; mobile web                         | Payments, demographics on reliability / model
VoIP                     | Christian Hoene                                                                                                       | MUSHRA                                              | OPUS                                                             | User at home vs. lab vs. crowd
Image                    | Filippo Mazza, Ann Dooms, Judith Redi                                                                                 |                                                     |                                                                  | Comparison with lab; gender issue
Video streaming          | Christian Keimel, Ulrich Reiter, Christian Timmerer, Andreas Sackl, Michael Seufert, Tobias Hossfeld, Marcus Barkowsky | Profiling and characterization of (source) contents | DASH; adaptive playout; HTTP streaming; long-duration videos     | Impact of demographics
HDTV                     | Hugh Melvin                                                                                                           |                                                     | HDTV                                                             |
Application independent  | Marcus Barkowsky, Tobias Hossfeld, Katrien de Moor, Lucjan Janowski                                                   | Profiling user                                      | Merging different user studies; influencing factors              | Quantify influence of environment on reliability and data quality; reliability metrics
Crowdsourcing platform   | Bruno Gardlo, Babak Naderi                                                                                            | Development of own platform                         |                                                                  | Motivation and incentives on reliability and data quality


Summary of Contributions


Application      | Design of test   | Implementation                    | Execution                            | Analysis
Web browsing     | MV, LSK          | Lab/online: MV, LSK               | Crowd: TH, KM                        | MB, TH, KdM, LJ, MV, LSK
VoIP             | CH               | Lab: CH; Online: CH               | Crowd: TH, KM; Lab: CH               | MB, TH, KdM, LJ
Image            | KdM, MV, LSK     | Contents: FM; Lab: FM; Crowd: BG, CK | Crowd: TH, KM, BG, CK; Lab: FM, JR | MB, TH, KdM, LJ
Video streaming  | KdM, MV, LSK, MB | Contents: MB; Lab:; Crowd: BG, CK | Crowd: TH, KM, BG, CK; Lab:          | MB, TH, KdM, LJ, UR
HDTV             | HM               |                                   |                                      |


Input collected before Novi Sad meeting


Interest in Joint Qualinet Experiment

• Filippo Mazza, Patrick le Callet, Marcus Barkowsky: comparison of lab and crowdsourcing experiments considering model validation; directly related to “Validation TF”

• Martin Varela, Lea Skorin-Kapov: impact of crowdsourcing environment on user results and QoE models, e.g. incentives and payments on the example of Web QoE; directly related to “Web/Cloud TF”

• Christian Keimel: Impact of crowdsourcing environment on user results and QoE models, e.g. demographics

• Andreas Sackl, Michael Seufert: Impact of content/consistency questions on QoE ratings, e.g. for HTTP video streaming; directly related to “Web/Cloud TF”

• Bruno Gardlo: currently working on improved crowdsourcing platform with screen quality measurement etc.; interest in incentive design, gamification; platform may be used for experiment, e.g. for videos or images

• Katrien de Moor: contribution in the questionnaire development/refinement and/or by setting up a comparative lab test

• Babak Naderi: development of crowdsourcing micro-task platform which may be used for joint experiment; incentives, data quality control, effects of platform-dependent and user-dependent factors on motivation and data quality
