AIMS’99 Workshop Heidelberg, 11-12 May 1999 Linking User Acceptance and Network Performance Miles Wilkins (BT) P807 (JUPITER2)

Page 1:

AIMS’99 Workshop

Heidelberg, 11-12 May 1999

Linking User Acceptance and Network Performance

Miles Wilkins (BT)

P807 (JUPITER2)

Page 2:

IP-based Services

• And many more …

• The world has gone IP mad (not ATM)

Page 3:

But….

• ATM can guarantee bandwidth / delay

• IP services only “Best Effort”

• QoS support being added by IETF and others

• Real-time requirements (multimedia)

• Not enough reliable bandwidth (or bounded delay) on the Internet

• Early use in corporate intranets

• How do you support these applications?

– and protect other data flows?

Page 4:

What is ‘QoS’?

• Quality of Service

– variously defined by ITU-T (E.800) and others

• Objective

– Network measurements

• Subjective

– User’s perception and expectations

• “Constantly meeting customers’ expectations in a service”

Page 5:

What is a realistic range of loss / delay values?

[Figure: a video server and multimedia collaboration users connected over an IP network; packet loss and delay are introduced, then users are asked “How was it for you?” (“I can’t hear him very well.”)]

What is the relationship between user-perceived QoS and the actual network QoS?
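The impairment step in the diagram can be sketched as a toy simulator: apply random (Bernoulli) loss and a fixed added delay to a timestamped packet stream. The function name and parameters below are illustrative, not part of the project’s actual tooling.

```python
import random

def impair(packets, loss_prob=0.01, added_delay_ms=50.0, seed=None):
    """Apply random packet loss and a fixed added delay to a stream of
    (seq, send_time_ms) tuples; a stand-in for a network impairment box."""
    rng = random.Random(seed)
    survived = []
    for seq, send_time_ms in packets:
        if rng.random() < loss_prob:
            continue  # packet dropped by the "network"
        survived.append((seq, send_time_ms + added_delay_ms))
    return survived

stream = [(i, i * 20.0) for i in range(100)]  # one packet every 20 ms
out = impair(stream, loss_prob=0.1, added_delay_ms=80.0, seed=1)
loss_rate = 1 - len(out) / len(stream)
```

Sweeping `loss_prob` and `added_delay_ms` over a grid while collecting user ratings is one way to map network QoS onto perceived QoS.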

Page 6:

Determining ‘typical’ parameters

• Measure traffic characteristics of applications

– NetMeeting, NetShow, Cisco IP/TV

• Generate similar test traffic

– with sequence numbers and timestamps

[Figure: test traffic sent across an intranet]
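A minimal sketch of such a test-traffic format: each packet carries a sequence number and a send timestamp, so the receiver can detect loss (gaps in sequence numbers) and compute delay. The field layout here is an assumption for illustration, not the project’s actual format.

```python
import struct, time

HEADER = struct.Struct("!Id")  # 4-byte sequence number, 8-byte send timestamp

def make_packet(seq):
    """Build a test packet stamped with its sequence number and send time."""
    return HEADER.pack(seq, time.time())

def analyse(packets):
    """Given received packets in arrival order, count lost packets
    (sequence-number gaps) and compute per-packet delays."""
    delays, expected, lost = [], 0, 0
    for raw in packets:
        seq, sent = HEADER.unpack(raw[:HEADER.size])
        lost += max(0, seq - expected)  # gap in sequence numbers = loss
        expected = seq + 1
        delays.append(time.time() - sent)
    return lost, delays
```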

Page 7:

Laboratory Subjective Tests

Network Impairment
- Packet Loss
- Packet Burst Loss
- Packet Delay
- Packet Jitter (perceived as loss/delay by user)

[Figure: a user connected through the impaired network to a video server or a second user]

Applications
- NetMeeting
- NetShow
- IP/TV

Data Collection
- Questionnaire
- Interview
- video and screen capture analysis

Tasks
- video clips
- editing/discussion

Page 8:

Main Interests & Results

• Audio quality

• Video quality

• Overall quality

• Acceptability

– (would you use this system again with this quality?)

• Quality and acceptability judgements were affected by the amount of loss exhibited by the network and by packet burst size

Page 9:

Video Streaming Results

• Effect of packet loss & burst size on IP/TV

[Figure: MOS (1-5) vs. packet loss (0.5%, 1%, 4%, 7%) for burst sizes of 1-2, 6-7 and 9-10 packets]

Page 10:

Video Streaming Results

• Effect of packet loss & burst size on NetShow

[Figure: MOS (1-5) for audio, video and overall quality under four network conditions: no loss; 0.5% loss, burst 1-2; 0.5% loss, burst 6-7; 1% loss, burst 1-2]

Page 11:

Conferencing Results

• Audio quality most important and most disturbing

– (for some tasks)

• Rating drops between 50 ms and 120 ms jitter

– caused by receiver buffer overflow (loss)?

[Figure: video ratings and audio ratings charts]
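The buffer-overflow hypothesis can be illustrated with a toy playout buffer: a packet whose jitter exceeds the buffer depth misses its playout deadline and is discarded, so the user hears it as loss. The buffer depth, packet period and arrival times below are illustrative values, not measurements from the trials.

```python
def late_packets(arrivals_ms, period_ms=20.0, buffer_ms=50.0):
    """Count packets that miss their playout deadline.
    Packet i is scheduled to play buffer_ms + i*period_ms after the
    first arrival; anything arriving later is discarded (heard as loss)."""
    start = arrivals_ms[0]
    late = 0
    for i, t in enumerate(arrivals_ms):
        deadline = start + i * period_ms + buffer_ms
        if t > deadline:
            late += 1
    return late

# A 20 ms audio stream where one packet suffers 120 ms of jitter:
# it overshoots a 50 ms playout buffer and is dropped.
arrivals = [0.0, 20.0, 160.0, 60.0, 80.0]
```

With a deeper buffer the same packet would arrive in time, trading the perceived loss for extra end-to-end delay.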

Page 12:

The Next Experiments

• Field Trials

– video streaming and conferencing applications

– network impairment on source

• Video Streaming

– validate laboratory tests

– live video source (BBC News 24)

– NetShow & IP/TV

• Conferencing

– packet loss burst effect?

– NetMeeting

Page 13:

Performability

• Taking the results of the user subjective tests, determine how to use QoS network building blocks to provide the required end-end QoS

• Looking at:

– RSVP (Resource ReSerVation Protocol)

– RSVP over ATM

– IP Differential Services

– Winsock2 (support for native ATM and RSVP)

– Queuing technologies (Weighted Fair Queuing, etc)

– H.323 Gatekeeper

– Multimedia Conference Manager

– Sub-net Bandwidth Manager
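As an illustration of the queuing building block above, a textbook sketch of weighted fair queuing: each packet is stamped with a virtual finish time (previous finish time for its flow plus size/weight) and packets are served in increasing finish-time order. This simplified version omits the system virtual clock, so it is only accurate while all queues stay backlogged; it is not any router’s actual implementation.

```python
import heapq

class WFQ:
    """Simplified weighted fair queuing scheduler (no system virtual
    clock; assumes continuously backlogged flows)."""
    def __init__(self, weights):
        self.weights = weights                       # flow -> weight
        self.finish = {f: 0.0 for f in weights}      # last finish time per flow
        self.heap = []
        self.count = 0                               # tie-breaker for equal times

    def enqueue(self, flow, size):
        # Virtual finish time: previous finish for this flow + size/weight.
        f = self.finish[flow] + size / self.weights[flow]
        self.finish[flow] = f
        self.count += 1
        heapq.heappush(self.heap, (f, self.count, flow, size))

    def dequeue(self):
        _, _, flow, size = heapq.heappop(self.heap)
        return flow, size
```

With weights 3:1, an audio flow is served roughly three times as often as a video flow of equal packet size, which is the behaviour the building block is meant to provide.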

Page 14:

Performability Approach

• Characterise applications traffic / QoS features

– two-party and multi-party

– multicast & broadcast

– include end-system performance

• Measure operation of QoS techniques

– e.g. RSVP implementation in routers

• Match

– network performance required for user acceptance

– network performance achievable with QoS methods

Page 15:

Example Performability Result

• Compare NetMeeting using Best Effort & RSVP

[Figure: two NetMeeting hosts connected via two routers over a serial link, with a SmartBits load generator and a Radcom analyser attached]

Investigate
- BE (FIFO queue)
- Fair Queueing
- Reservation
  - Controlled Load (Video)
  - Guaranteed Service (Audio)

Page 16:

Experiment & (Early) Results

• Examine end-end delay between hosts for audio (G.723.1) & video (H.263) traffic

– 1) no background load on serial link

– 2) background load (1250-byte packets), fair queuing in router

– 3) background load (1250-byte packets), RSVP reservations

• Audio delay

– 1) 1.5 ms to 28.4 ms, mean 8.8 ms

– 2) 1.5 ms to 39.5 ms, mean 9.0 ms

– 3) 90% between 1.2 ms and 30 ms
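Summary statistics of the kind quoted above (min, max, mean, and a central 90% interval, assuming that is the intended reading of the “90% between” figure) can be computed from raw one-way delay samples along these lines; the sample data is hypothetical.

```python
def delay_summary(samples_ms):
    """Min, max, mean and the central 90% interval of delay samples."""
    s = sorted(samples_ms)
    n = len(s)
    mean = sum(s) / n
    lo = s[int(0.05 * (n - 1))]   # 5th percentile
    hi = s[int(0.95 * (n - 1))]   # 95th percentile
    return {"min": s[0], "max": s[-1], "mean": mean, "p90_interval": (lo, hi)}
```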

Page 17:

Experiment & (Early) Results

• Video delay

– 1) mean 33.1 ms

– 2) 30 ms to 800 ms

• Not acceptable to users

– 3) 3.7 ms to 30 ms

• Acceptable to users

• RSVP used to meet users’ requirements

Page 18:

Any Questions?