DESCRIPTION
Dan Burnett, editor of the WebRTC specification and author of The WebRTC Book, provides an excellent tutorial on WebRTC at TADS, 21-22 Nov 2013 in Bangkok.
Introduction to WebRTC
Dan Burnett, Chief Scientist, Tropo; Director of Standards, Voxeo
Alan Johnston, Distinguished Engineer, Avaya
WebRTC
WebRTC Tutorial Topics
• What is WebRTC?
• How to Use WebRTC
• WebRTC Peer-to-Peer Media
• WebRTC Protocols and IETF Standards
• WebRTC W3C API Overview
• Pseudo Code Walkthrough
• Practical bits
TAD Summit Bangkok 2013 2
What is WebRTC?
WebRTC is "Voice & Video in the browser"
• Access to camera and microphone without a plugin – no proprietary plugin required!
• Audio/video direct from browser to browser
• Why does it matter?
– Media can stay local
– Mobile devices eventually dropping the voice channel anyway
– Games
The Browser RTC Function
• WebRTC adds a new Real-Time Communication (RTC) function built into browsers
– No download
– No Flash or other plugins
• Contains
– Audio and video codecs
– Ability to negotiate peer-to-peer connections
– Echo cancellation, packet loss concealment
• In Chrome & Firefox today, Internet Explorer sometime and Safari eventually
[Diagram: the Web Browser runs JavaScript/HTML/CSS on top of the RTC APIs and other APIs; the Browser RTC Function sits above native OS services. Signaling goes to the Web Server / Signaling Server over HTTP or WebSockets; media or data uses the on-the-wire protocols.]
Benefits of WebRTC
For the Developer
• Streamlined development – one platform
• Simple APIs – detailed knowledge of RTC protocols not needed
• NAT traversal only uses expensive relays when there is no other choice
• Advanced voice and video codecs without licensing
For the User
• No download or install – easy to use
• All communication encrypted – private
• Reliable session establishment – "just works"
• Excellent voice and video quality
• Many more choices for real-time communication
WebRTC Support of Multiple Media
• Multiple sources of audio and video are assumed and supported
• All media, voice and video, and feedback messages are multiplexed over the same transport address
[Diagram: Browser M on Mobile offers Front Camera Video, Rear Camera Video, Microphone Audio, and Application Sharing Video; Browser L on Laptop offers WebCam Video and Stereo Audio.]
WebRTC Triangle
• Both browsers run the same web application, downloaded from the web server
• A Peer Connection is established between them with the help of the web server
[Diagram: Web Server (Application) at the apex; Browser M and Browser L (each running the HTML5 application from the web server) at the base, joined by a Peer Connection (audio, video, and/or data).]
WebRTC Trapezoid
• Similar to the SIP Trapezoid
• Web servers communicate using SIP, Jingle, or a proprietary protocol
• Could become important in the future
[Diagram: Web Server A (Application A) and Web Server B (Application B) linked by SIP or Jingle; Browser M runs the HTML5 application from Web Server A, Browser T from Web Server B; a Peer Connection (audio and/or video) joins the browsers.]
WebRTC and SIP
• SIP (Session Initiation Protocol) is a signaling protocol used by service providers and enterprises for real-time communication
• The Peer Connection appears as a standard RTP session, described by SDP
• The SIP endpoint must support the RTCWEB media extensions
[Diagram: Browser M talks to its Web Server, which connects via SIP through a SIP Server to a SIP Client; the Peer Connection (audio and/or video) runs directly between Browser M and the SIP Client.]
WebRTC and Jingle
• Jingle is a signaling extension to XMPP (Extensible Messaging and Presence Protocol, aka Jabber)
• Peer Connection SDP can be mapped to Jingle
• The Jingle endpoint must support the RTCWEB media extensions
[Diagram: Browser M talks to its Web Server, which connects via Jingle through an XMPP Server to a Jingle Client; the Peer Connection (audio and/or video) runs directly between Browser M and the Jingle Client.]
WebRTC and PSTN
• The Peer Connection terminates on a PSTN Gateway
• Audio only
• Encryption ends at the gateway
[Diagram: Browser M talks to the Web Server; the Peer Connection (audio) terminates on a PSTN Gateway, which connects to a phone.]
WebRTC with SIP
• Browser runs a SIP User Agent by running JavaScript from the Web Server
• The SRTP media connection uses the WebRTC APIs
• Details are in [draft-ietf-sipcore-websocket], which defines SIP transport over WebSockets
[Diagram: Browser M and Browser T each run a JavaScript SIP UA delivered over HTTP (HTML5/CSS/JavaScript) from the Web Server; each connects to a SIP Proxy/Registrar Server over WebSocket (SIP); SRTP media flows directly between the browsers.]
WebRTC Signaling Approaches
• Signaling is required to exchange candidate transport addresses, codec information, and media keying information
• Many options – the choice is up to the web developer
How to Use WebRTC
WebRTC usage in brief
[Flow diagram: Obtain Local Media (loop "Get more media" until all media added) → Set Up Peer Connection → Attach Media or Data (loop "Attach more media or data") → Exchange Offer/Answer → Peer Connection established, ready for call.]
WebRTC usage in brief
• getUserMedia()
– Audio and/or video
– Constraints
– User permissions
• Browser must ask before allowing a page to access the microphone or camera
• MediaStream
• MediaStreamTrack
– Capabilities
– States (settings)
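The constraints object passed to getUserMedia() can be sketched as below. The helper name and the 2013-era constraint shape follow the pseudo code later in this deck; in a browser, the resulting object would be the first argument to navigator.getUserMedia().

```javascript
// Build a 2013-draft-style constraints object for getUserMedia().
// buildConstraints is a hypothetical helper; the constraint names
// ("mandatory", "videoFacingModeEnum") follow this deck's pseudo code
// and may differ from what current browsers accept.
function buildConstraints(wantAudio, facingMode) {
  var c = {};
  if (wantAudio) c.audio = true;
  if (facingMode) {
    c.video = { mandatory: { videoFacingModeEnum: facingMode } };
  }
  return c;
}

// In a browser this would be used as:
//   navigator.getUserMedia(buildConstraints(true, "front"),
//                          function (stream) { /* use stream */ },
//                          function (err) { /* denied or failed */ });
```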
WebRTC usage in brief
• RTCPeerConnection
– Direct media
– Between two peers
– ICE processing
– SDP processing
– DTMF support
– Data channels
– Identity verification
– Statistics reporting
WebRTC usage in brief
• addStream()
– Doesn't change media state!
• removeStream()
– Ditto!
• createDataChannel()
– Depends on transport
WebRTC usage in brief
• createOffer(), createAnswer()
• setLocalDescription(), setRemoteDescription()
• Applying SDP answer makes the magic happen
WebRTC usage – a bit more detail
[Flow diagram: as above, preceded by Set Up Signaling Channel, with Exchange Session Descriptions in place of Exchange Offer/Answer.]
SDP offer/answer
• Session Descriptions
– The Session Description Protocol was created for use by SIP in setting up voice (and video) calls
– Describes real-time media at a low level of detail
• Which IP addresses and ports to use
• Which codecs to use
• Offer/answer model (JSEP)
– One side sends an SDP offer listing what it wants to send and what it can receive
– The other side replies with an SDP answer listing what it will receive and send
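The codec part of an offer lives on SDP m= lines. A minimal sketch of reading one, with a hand-made SDP snippet for illustration (not real browser output):

```javascript
// Minimal sketch: pull the RTP payload types out of an SDP audio m= line.
// Format of the line: m=audio <port> <proto> <pt> <pt> ...
function audioPayloadTypes(sdp) {
  var line = sdp.split(/\r?\n/).find(function (l) {
    return l.indexOf("m=audio") === 0;
  });
  if (!line) return [];
  return line.split(" ").slice(3).map(Number);
}

// Hand-made illustrative offer: proposes Opus (111) and G.711 mu-law (0).
var offer =
  "v=0\n" +
  "m=audio 49170 RTP/SAVPF 111 0\n" +
  "a=rtpmap:111 opus/48000/2\n" +
  "a=rtpmap:0 PCMU/8000\n";
audioPayloadTypes(offer); // → [111, 0]
```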
WebRTC Peer-to-Peer Media
Media Flows in WebRTC
[Diagram: Browsers L, T, M, and D, behind home and coffee-shop WiFi routers, all connected across the Internet to the Web Server.]
Media without WebRTC
[Diagram: the same topology as above.]
Peer-to-Peer Media with WebRTC
[Diagram: the same topology as above.]
NAT Complicates Peer-to-Peer Media
Most browsers are behind NATs on the Internet, which complicates the establishment of peer-to-peer media sessions.
[Diagram: the same topology, now with NAT on each router.]
What is a NAT?
• Network Address Translator (NAT)
• Used to map an inside address (usually a private IP address) to an outside address (usually a public IP address) at Layer 3
• Network Address and Port Translation (NAPT) also changes the transport port number (Layer 4)
– These are often just called NATs as well
• One reason for NAT is the IP address shortage
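The NAPT idea from the bullets above can be sketched as a toy mapping table: each inside address:port gets an allocated port on the NAT's single public IP, and the mapping is remembered. The addresses follow the deck's own examples (192.168.x.x inside, 203.0.113.4 outside); the port-allocation scheme is invented for illustration.

```javascript
// Toy NAPT table: maps inside address:port to an outside port on the
// NAT's single public IP. Real NATs also apply timeouts and filtering.
function makeNapt(publicIp) {
  var next = 40000, map = {};
  return {
    translate: function (insideIp, insidePort) {
      var key = insideIp + ":" + insidePort;
      if (!(key in map)) map[key] = next++; // allocate a new outside port
      return { ip: publicIp, port: map[key] };
    }
  };
}

var nat = makeNapt("203.0.113.4");
var a = nat.translate("192.168.0.5", 5000); // → 203.0.113.4:40000
var b = nat.translate("192.168.0.6", 5000); // → 203.0.113.4:40001
var c = nat.translate("192.168.0.5", 5000); // existing mapping reused: 40000
```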
NAT Example
[Diagram: Browser T (192.168.0.6) and Browser M (192.168.0.5) use "inside" private IP addresses (192.168.x.x) behind a home WiFi NAT whose "outside" public IP address is 203.0.113.4, connected to the Internet.]
NATs and Applications
• NATs are compatible with client/server protocols such as web, email, etc.
• However, NATs generally block peer-to-peer communication
• Typical NAT traversal for VoIP and video services today uses a media relay whenever the client is behind a NAT
– Often done with an SBC – Session Border Controller
– This is a major expense and complication in existing VoIP and video systems
• WebRTC has a built-in NAT traversal strategy: Interactive Connectivity Establishment (ICE)
Peer-to-Peer Media Through NAT
ICE connectivity checks can often establish a direct peer-to-peer session between browsers behind different NATs.
[Diagram: the same topology as above.]
ICE Connectivity Checks
• Connectivity through NAT can be achieved using ICE connectivity checks
• Browsers exchange a list of candidates
– Local: read from network interfaces
– Reflexive: obtained using a STUN server
– Relayed: obtained from a TURN server (media relay)
• Browsers attempt to send STUN packets to the candidate list received from the other browser
• Checks are performed by both sides at the same time
• If one STUN packet gets through, a response is sent and this connection is used for communication
– A TURN relay is the last resort (lowest priority)
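The "TURN relay is the last resort" ordering can be sketched as below. The numeric preferences are illustrative only; real ICE computes candidate priorities with the formula in RFC 5245, not these values.

```javascript
// Order gathered candidates so local (host) candidates are tried first
// and a TURN relay last, as described above. Preference numbers are
// illustrative, not the RFC 5245 priority formula.
var TYPE_PREF = { host: 3, srflx: 2, relay: 1 };

function orderCandidates(cands) {
  return cands.slice().sort(function (a, b) {
    return TYPE_PREF[b.type] - TYPE_PREF[a.type];
  });
}

var ordered = orderCandidates([
  { type: "relay", addr: "198.51.100.2:50000" }, // from the TURN server
  { type: "host",  addr: "192.168.0.5:5000" },   // local interface
  { type: "srflx", addr: "203.0.113.4:40000" }   // learned via STUN
]);
// ordered: host first, then srflx, then relay
```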
P2P Media Can Stay Local to NAT
If both browsers are behind the same NAT, connectivity checks can often establish a connection that never leaves the NAT.
[Diagram: the same topology as above.]
ICE Servers
ICE uses STUN and TURN servers in the public Internet to help with NAT traversal.
[Diagram: the same topology, plus a STUN Server (198.51.100.9) and a TURN Server (198.51.100.2) in the public Internet.]
Browser Queries STUN Server
The browser sends a STUN test packet to the STUN server to learn its public IP address (the address of the NAT).
[Diagram: the same topology as above.]
TURN Server Can Relay Media
In some cases, connectivity checks fail, and a TURN media relay on the public Internet must be used.
[Diagram: the same topology as above.]
WebRTC Protocols and IETF Standards
WebRTC: A Joint Standards Effort
• The Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C) are working together on WebRTC
• IETF – protocols – "bits on the wire"
– The main protocols are already RFCs, but many extensions are in progress
– The RTCWEB (Real-Time Communications on the Web) Working Group is the main focus, but other WGs are involved as well
– http://www.ietf.org
• W3C – APIs – used by JavaScript code in HTML5
– http://www.w3.org
WebRTC Protocols
[Diagram: protocol stack spanning the Application, Transport, and Network layers – HTTP, WebSocket, and SDP at the top; SRTP, SCTP, DTLS, TLS, TCP, and UDP below; ICE, STUN, and TURN for traversal; everything over IP.]
SIP is not shown as it is optional.
IETF RTCWEB Documents
• Overview: "Overview: Real Time Protocols for Browser-based Applications" (draft-ietf-rtcweb-overview)
• Use Cases and Requirements: "Web Real-Time Communication Use-cases and Requirements" (draft-ietf-rtcweb-use-cases-and-requirements)
• RTP Usage: "Web Real-Time Communication (WebRTC): Media Transport and Use of RTP" (draft-ietf-rtcweb-rtp-usage)
• Security Architecture: "RTCWEB Security Architecture" (draft-ietf-rtcweb-security-arch)
• Threat Model: "Security Considerations for RTC-Web" (draft-ietf-rtcweb-security)
• Data Channel: "RTCWeb Data Channels" (draft-ietf-rtcweb-data-channel)
• JSEP: "JavaScript Session Establishment Protocol" (draft-ietf-rtcweb-jsep)
• Audio: "WebRTC Audio Codec and Processing Requirements" (draft-ietf-rtcweb-audio)
• Quality of Service: "DSCP and other packet markings for RTCWeb QoS" (draft-ietf-rtcweb-qos)
Codecs
• Mandatory to Implement (MTI) audio codecs are settled: Opus and G.711 (finally!)
• Opus is specified in RFC 6716
• Video is not yet decided!
WebRTC W3C API Overview
Two primary API sections
• Handling local media – the Media Capture and Streams (getUserMedia) specification
• Transmitting media – the WebRTC (Peer Connection) specification
Local Media Handling
• In this example
– Captured 4 local media streams
– Created 3 media streams from them
– Sent the streams over the Peer Connection
[Diagram: in Browser M, sources (Front Camera Video, Rear Camera Video, Microphone Audio, Application Sharing Video) feed four captured MediaStreams; their tracks ("Audio", "Presentation", "Presenter", "Demonstration") are recombined into three created MediaStreams: Presentation, Presenter, and Demonstration.]
Local Media Handling
• Sources
– Encoded together
– Can't be manipulated individually
[Diagram: same as above.]
Local Media Handling
• Tracks (MediaStreamTrack)
– Tied to a source
– Exist primarily as part of streams; a single media type
– Globally unique ids; optionally browser-labeled
[Diagram: same as above.]
Local Media Handling
• Captured MediaStream
– Returned from getUserMedia()
– Permission check required to obtain
[Diagram: same as above.]
Local Media Handling
• MediaStream
– All contained tracks are synchronized
– Can be created, transmitted, etc.
[Diagram: same as above.]
Local Media Handling
• Settings
– Current values of source properties (height, width, etc.)
– Exposed on MediaStreamTrack
• Capabilities
– Allowed values for source properties
– Exposed on MediaStreamTrack
• Constraints
– Requested ranges for track properties
– Used in getUserMedia(), applyConstraints()
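The settings/capabilities/constraints model above can be sketched as a range intersection: a requested range is checked against a source's capability range to produce a concrete setting. The property shape and the "prefer the highest allowed value" rule are illustrative assumptions, not the spec's exact algorithm.

```javascript
// Toy constraints model: intersect a requested range with a capability
// range; return a concrete setting, or null if it can't be satisfied.
function applyConstraint(capability, requested) {
  var min = Math.max(capability.min, requested.min || capability.min);
  var max = Math.min(capability.max, requested.max || capability.max);
  if (min > max) return null; // constraint cannot be satisfied
  return max; // illustrative policy: prefer the highest allowed value
}

var widthCapability = { min: 320, max: 1280 }; // what the camera can do
applyConstraint(widthCapability, { min: 640, max: 1920 }); // → 1280
applyConstraint(widthCapability, { max: 640 });            // → 640
applyConstraint(widthCapability, { min: 1920 });           // → null
```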
Transmitting media
• Signaling channel
– Non-standard
– Must exist to set up the Peer Connection
• Peer Connection
– Links together two peers
– Add/remove media streams
• addStream(), removeStream()
– Handlers for ICE or media change
– Data Channel support
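The signaling channel is deliberately left to the application. One common minimal choice (an assumption here, not something the slides prescribe) is a JSON envelope whose field name distinguishes SDP from ICE candidates, which is also the shape the deck's pseudo code uses:

```javascript
// Minimal signaling envelope matching the deck's pseudo code:
// { "sdp": ... } carries a session description, { "candidate": ... }
// carries an ICE candidate.
function encodeSignal(signal) {
  return JSON.stringify(signal);
}
function decodeSignal(text) {
  var signal = JSON.parse(text);
  if (signal.sdp) return { kind: "sdp", payload: signal.sdp };
  if (signal.candidate) return { kind: "candidate", payload: signal.candidate };
  return { kind: "unknown", payload: signal };
}

var wire = encodeSignal({ sdp: { type: "offer", sdp: "v=0..." } });
decodeSignal(wire).kind; // "sdp"
```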
Peer Connection
• "Links" together two peers
– Via new RTCPeerConnection()
– Generates session description offers/answers
• createOffer(), createAnswer()
– From SDP answers, initiates media
• setLocalDescription(), setRemoteDescription()
– Offers/answers MUST be relayed by application code!
– ICE candidates can also be relayed and added by the app
• addIceCandidate()
Peer Connection
• Handlers for signaling, ICE, or media change
– onsignalingstatechange
– onicecandidate, oniceconnectionstatechange
– onaddstream, onremovestream
– onnegotiationneeded
– A few others
Peer Connection
• "Extra" APIs
– Data
– DTMF
– Statistics
– Identity
• Grouped separately in the WebRTC spec – but really part of the RTCPeerConnection definition
– All are mandatory to implement
Data Channel API
• RTCDataChannel createDataChannel()
• Configurable with
– ordered
– maxRetransmits, maxRetransmitTime
– negotiated
– id
• Provides RTCDataChannel with
– send()
– onopen, onerror, onclose, onmessage
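A sketch of checking a data-channel init dictionary before use. The "at most one retransmit limit" rule follows the WebRTC spec (the two options select mutually exclusive partial-reliability modes); the helper itself is hypothetical.

```javascript
// Validate an RTCDataChannel init dictionary (hypothetical helper).
// Per the WebRTC spec, maxRetransmits and maxRetransmitTime are
// mutually exclusive partial-reliability settings.
function validateDataChannelInit(init) {
  if (init.maxRetransmits !== undefined &&
      init.maxRetransmitTime !== undefined) {
    throw new Error("set at most one of maxRetransmits/maxRetransmitTime");
  }
  return {
    ordered: init.ordered !== false,           // default: ordered delivery
    maxRetransmits: init.maxRetransmits,       // partial reliability (count)
    maxRetransmitTime: init.maxRetransmitTime, // partial reliability (ms)
    negotiated: !!init.negotiated,
    id: init.id
  };
}

// In a browser this would configure a real channel:
//   var dc = pc.createDataChannel("chat",
//                                 validateDataChannelInit({ ordered: true }));
//   dc.onopen = function () { dc.send("hello"); };
```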
DTMF API
• RTCDTMFSender createDTMFSender()
– Associates the track input parameter with this RTCPeerConnection
• RTCDTMFSender provides
– boolean canInsertDTMF()
– insertDTMF()
– ontonechange
– (other stuff)
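insertDTMF() takes a string of tone characters. A sketch of validating one, assuming the character set I believe the DTMF API uses (digits, A-D, *, #, and "," as a pause):

```javascript
// Check a tone string before handing it to insertDTMF().
// Assumed character set: 0-9, A-D, *, #, and "," (an inter-tone pause).
function isValidToneString(tones) {
  return /^[0-9A-D*#,]*$/.test(tones.toUpperCase());
}

isValidToneString("1234#");  // true
isValidToneString("12,34*"); // true
isValidToneString("hello");  // false
```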
Statistics API
• getStats()
– Callback returns statistics for a given track
• Statistics available (local/remote) are:
– Bytes/packets transmitted
– Bytes/packets received
• May be useful for congestion-based adjustments
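The congestion-adjustment idea can be sketched by deriving a loss ratio from two cumulative stat snapshots. The field names here are illustrative, not the exact stats dictionary returned by getStats().

```javascript
// Sketch: loss ratio over an interval, from two cumulative snapshots
// (packets sent locally vs. packets the remote side reports received).
// Field names are illustrative, not the exact RTCStats dictionary.
function lossBetween(prev, curr) {
  var sent = curr.packetsSent - prev.packetsSent;
  var recv = curr.packetsReceived - prev.packetsReceived;
  if (sent <= 0) return 0;
  return (sent - recv) / sent;
}

lossBetween({ packetsSent: 1000, packetsReceived: 990 },
            { packetsSent: 2000, packetsReceived: 1940 }); // → 0.05
```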
Identity API
• setIdentityProvider(), getIdentityAssertion()
• Used to verify identity via a third party, e.g., Facebook Connect
• Both methods are optional
• The onidentity handler is called after any verification attempt
• RTCPeerConnection.peerIdentity holds any verified identity assertion
Pseudo Code Walkthrough
Pseudo Code
• Close to real code, but . . .
• No HTML, no signaling channel, not asynchronous, and the API is still in flux
• Don't expect this to work anywhere
Back to the first diagram
• The mobile browser "calls" the laptop browser
• Each sends media to the other
[Diagram: same media sources as the earlier "Multiple Media" figure – Browser M on Mobile with front/rear camera video, microphone audio, and application sharing video; Browser L on Laptop with webcam video and stereo audio.]
Mobile browser code outline
• We will look next at each of these
• . . . except for creating the signaling channel
var pc;
var configuration =
  {"iceServers": [{"url": "stun:198.51.100.9"},
                  {"url": "turn:198.51.100.2",
                   "credential": "myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var av_stream, stereo, mono;
var display, left, right;
var constraint;

function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();
getMedia();
createPC();
attachMedia();
call();

function getMedia() {
  // get local audio (microphone)
  navigator.getUserMedia({"audio": true}, function (stream) {
    microphone = stream;
  }, e);
  // get local video (application sharing)
  ///// This is outside the scope of this specification.
  ///// Assume that 'application' has been set to this stream.
  constraint =
    {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
  navigator.getUserMedia(constraint, function (stream) {
    front = stream;
  }, e);
  constraint =
    {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
  navigator.getUserMedia(constraint, function (stream) {
    rear = stream;
  }, e);
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({"candidate": evt.candidate}));
  };
  pc.onaddstream =
    function (evt) { handleIncomingStream(evt.stream); };
}

function attachMedia() {
  presentation = new MediaStream(
    [microphone.getAudioTracks()[0],     // Audio
     application.getVideoTracks()[0]]);  // Presentation
  presenter = new MediaStream(
    [microphone.getAudioTracks()[0],     // Audio
     front.getVideoTracks()[0]]);        // Presenter
  demonstration = new MediaStream(
    [microphone.getAudioTracks()[0],     // Audio
     rear.getVideoTracks()[0]]);         // Demonstration
  pc.addStream(presentation);
  pc.addStream(presenter);
  pc.addStream(demonstration);
  signalingChannel.send(
    JSON.stringify({"presentation": presentation.id,
                    "presenter": presenter.id,
                    "demonstration": demonstration.id}));
}

function call() {
  pc.createOffer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({"sdp": desc}));
  }
}

function handleIncomingStream(st) {
  if (st.getVideoTracks().length == 1) {
    av_stream = st;
    show_av(av_stream);
  } else if (st.getAudioTracks().length == 2) {
    stereo = st;
  } else {
    mono = st;
  }
}

function show_av(st) {
  display.src = URL.createObjectURL(
    new MediaStream(st.getVideoTracks()[0]));
  left.src = URL.createObjectURL(
    new MediaStream(st.getAudioTracks()[0]));
  right.src = URL.createObjectURL(
    new MediaStream(st.getAudioTracks()[1]));
}

signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(signal.sdp), s, e);
  } else {
    pc.addIceCandidate(
      new RTCIceCandidate(signal.candidate));
  }
};
Mobile browser produces . . .
• At least 3 calls to getUserMedia()
• Three calls to new MediaStream()
• The app sends stream ids, then streams
[Diagram: same as the Local Media Handling figure.]
function getMedia() [1]
• Get audio
• (Get window video – out of scope)

navigator.getUserMedia({"audio": true}, function (stream) {
  microphone = stream;
}, e);
// get local video (application sharing)
///// This is outside the scope of this specification.
///// Assume that 'application' has been set to this stream.
. . .
function getMedia() [2]
• Get the front-facing camera
• Get the rear-facing camera

. . .
constraint =
  {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
navigator.getUserMedia(constraint, function (stream) {
  front = stream;
}, e);
constraint =
  {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
navigator.getUserMedia(constraint, function (stream) {
  rear = stream;
}, e);
function createPC()
• Create the RTCPeerConnection
• Set handlers

var configuration =
  {"iceServers": [{"url": "stun:198.51.100.9"},
                  {"url": "turn:198.51.100.2",
                   "credential": "myPassword"}]};
pc = new RTCPeerConnection(configuration);
pc.onicecandidate = function (evt) {
  signalingChannel.send(
    JSON.stringify({"candidate": evt.candidate}));
};
pc.onaddstream =
  function (evt) { handleIncomingStream(evt.stream); };
Mobile browser consumes . . .
• Receives three media streams
• Chooses one
• Sends tracks to output channels
[Diagram: in Browser M (Audio & Video Stream selected), the "Video" track goes to the display and the "Left" and "Right" tracks to the left and right headphones; the Stereo Stream and Mono Stream are the other choices.]
Function handleIncomingStream()
• If the incoming stream has a video track, save it as av_stream and display it
• If it has two audio tracks, it must be the stereo stream
• Otherwise, it must be the mono stream
if (st.getVideoTracks().length == 1) {
  av_stream = st;
show_av(av_stream);
} else if (st.getAudioTracks().length == 2) {
stereo = st;
} else {
mono = st;
}
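The branching above depends only on track counts, so it can be factored into a small pure function. Here is a sketch of that idea — `classifyStream` and `mockStream` are illustrative helpers, not part of the slide code — which can be exercised outside a browser:

```javascript
// Classify a remote stream by its track counts, mirroring the
// handleIncomingStream() logic above.
function classifyStream(stream) {
  if (stream.getVideoTracks().length === 1) {
    return "av";      // has video: the audio/video stream
  } else if (stream.getAudioTracks().length === 2) {
    return "stereo";  // two audio tracks: the stereo stream
  }
  return "mono";      // otherwise: the mono stream
}

// Minimal mock of a MediaStream's track accessors, so the
// classification logic can be tested without a browser.
function mockStream(audioCount, videoCount) {
  return {
    getAudioTracks: function () { return new Array(audioCount).fill({}); },
    getVideoTracks: function () { return new Array(videoCount).fill({}); }
  };
}
```

Keeping the classification separate from the display side effects makes the implicit protocol (video means A/V, two audio tracks mean stereo) explicit and testable.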
Function show_av(st)
• Turn each track into a single-track MediaStream
• Set as the source for the corresponding media element
display.srcObject = new MediaStream([st.getVideoTracks()[0]]);
left.srcObject = new MediaStream([st.getAudioTracks()[0]]);
right.srcObject = new MediaStream([st.getAudioTracks()[1]]);
Mobile browser code outline
• We will look next at each of these
• . . . except for creating the signaling channel
var signalingChannel = createSignalingChannel();
getMedia();
createPC();
attachMedia();
call();
function attachMedia() [1]
• Create 3 new streams, all with the same audio but different video
presentation = new MediaStream(
[microphone.getAudioTracks()[0], // Audio
application.getVideoTracks()[0]]); // Presentation
presenter =
new MediaStream(
[microphone.getAudioTracks()[0], // Audio
front.getVideoTracks()[0]]); // Presenter
demonstration = new MediaStream(
[microphone.getAudioTracks()[0], // Audio
rear.getVideoTracks()[0]]); // Demonstration
. . .
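The pattern above reuses one microphone track across three streams, pairing it with a different video track each time. A sketch of that composition step — `buildStreamTrackLists` is a hypothetical helper, and in browser code each returned pair would be passed to `new MediaStream([...])`:

```javascript
// Pair one shared audio track with each video track, producing the
// track lists for the three streams built in attachMedia() above.
function buildStreamTrackLists(audioTrack, videoTracks) {
  return videoTracks.map(function (v) { return [audioTrack, v]; });
}

// Stand-in track objects (real ones come from getUserMedia streams).
var mic = { kind: "audio", label: "microphone" };
var lists = buildStreamTrackLists(mic, [
  { kind: "video", label: "application" },  // presentation
  { kind: "video", label: "front" },        // presenter
  { kind: "video", label: "rear" }          // demonstration
]);
```

Because MediaStreams hold references to tracks rather than copies, sharing the same audio track this way does not duplicate the capture; all three streams carry the one microphone feed.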
function attachMedia() [2]
• Attach all 3 streams to the Peer Connection
• Send the stream ids to the peer (before the streams!)
pc.addStream(presentation);
pc.addStream(presenter);
pc.addStream(demonstration);
signalingChannel.send(
JSON.stringify({ "presentation": presentation.id,
"presenter": presenter.id,
"demonstration": demonstration.id
}));
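The id message above is what lets the callee tell the three incoming streams apart, since they arrive in no guaranteed order. A sketch of building and parsing it — the `"s1"`/`"s2"`/`"s3"` ids are stand-ins; real MediaStream ids are generated by the browser:

```javascript
// Build the stream-id announcement sent over the signaling channel
// above, and parse it as the callee would.
function makeIdMessage(presentation, presenter, demonstration) {
  return JSON.stringify({
    "presentation": presentation.id,
    "presenter": presenter.id,
    "demonstration": demonstration.id
  });
}

var msg = makeIdMessage({ id: "s1" }, { id: "s2" }, { id: "s3" });
// The callee saves this map and later keys incoming streams off it.
var incoming = JSON.parse(msg);
```

Sending the map before the streams themselves is what makes the callee's id lookup in handleIncomingStream() safe.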
Mobile browser code outline
• We will look next at each of these
• . . . except for creating the signaling channel
var signalingChannel = createSignalingChannel();
getMedia();
createPC();
attachMedia();
call();
function call()
• Ask the browser to create an SDP offer
• Set the offer as the local description
• Send the offer to the peer
pc.createOffer(gotDescription, e);
function gotDescription(desc) {
pc.setLocalDescription(desc, s, e);
signalingChannel.send(JSON.stringify({ "sdp": desc }));
}
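The sequencing here (create offer, set it as the local description, then send it) can be made explicit by running the same callback flow against a stub peer connection. This is a test sketch, not WebRTC API: `makeCall`, `StubPC`, and the fake description are all stand-ins.

```javascript
// Mirror of the call() flow above, parameterized so it can be
// exercised with a stub instead of a real RTCPeerConnection.
function makeCall(pc, send) {
  pc.createOffer(function gotDescription(desc) {
    pc.setLocalDescription(desc);                 // 1. adopt locally
    send(JSON.stringify({ "sdp": desc }));        // 2. then signal peer
  }, function (err) { /* error callback, like the slides' e() */ });
}

// Stub peer connection: invokes the success callback synchronously
// with a fake session description and records what was set.
var sent = [];
var stubPC = {
  local: null,
  createOffer: function (success, failure) {
    success({ type: "offer", sdp: "v=0 ..." });
  },
  setLocalDescription: function (desc) { this.local = desc; }
};
makeCall(stubPC, function (m) { sent.push(m); });
```

Setting the local description before signaling matters: it is what starts ICE candidate gathering on the offerer's side.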
How do we get the SDP answer?
• The signaling channel delivers a message
• If it is SDP, set it as the remote description
• If it is an ICE candidate, tell the browser
signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
if (signal.sdp) {
pc.setRemoteDescription(
new RTCSessionDescription(signal.sdp), s, e);
} else {
pc.addIceCandidate(
new RTCIceCandidate(signal.candidate)); }
};
And now the laptop browser . . .
• Watch for the following
– We set up media *after* receiving the offer
– but the signaling channel still must exist first!
– Also, we need to save the incoming stream ids
Signaling channel message is trigger
• Set up PC and media if not already done
var pc; var configuration =
{"iceServers":[{"url":"stun:198.51.100.9"},
{"url":"turn:198.51.100.2",
"credential":"myPassword"}]}; var webcam, left, right;
var av, stereo, mono;
var incoming;
var speaker, win1, win2, win3;
function s(sdp) {} // stub success callback
function e(error) {} // stub error callback
var signalingChannel = createSignalingChannel();
function prepareForIncomingCall() {
createPC();
getMedia();
attachMedia(); }
function createPC() {
pc = new RTCPeerConnection(configuration);
pc.onicecandidate = function (evt) {
signalingChannel.send(
JSON.stringify({ "candidate": evt.candidate })); };
pc.onaddstream =
function (evt) {handleIncomingStream(evt.stream);};
}
function getMedia() {
navigator.getUserMedia({"video": true }, function (stream) {
webcam = stream;
}, e);
constraint =
{"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
navigator.getUserMedia(constraint, function (stream) {
left = stream;
}, e);
constraint =
{"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
navigator.getUserMedia(constraint, function (stream) {
right = stream;
}, e);
}
function attachMedia() {
av = new MediaStream(
[webcam.getVideoTracks()[0], // Video
left.getAudioTracks()[0], // Left audio
right.getAudioTracks()[0]]); // Right audio
stereo = new MediaStream(
[left.getAudioTracks()[0], // Left audio
right.getAudioTracks()[0]]); // Right audio
mono = left; // Treat the left audio as the mono stream
pc.addStream(av);
pc.addStream(stereo);
pc.addStream(mono);
}
function answer() {
pc.createAnswer(gotDescription, e);
function gotDescription(desc) {
pc.setLocalDescription(desc, s, e);
signalingChannel.send(JSON.stringify({ "sdp": desc }));
}
}
function handleIncomingStream(st) {
  // The MediaStream constructor takes a sequence of tracks,
  // so each track is wrapped in an array.
  if (st.id === incoming.presentation) {
    speaker.src = URL.createObjectURL(
      new MediaStream([st.getAudioTracks()[0]]));
    win1.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  } else if (st.id === incoming.presenter) {
    win2.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  } else {
    win3.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  }
}
signalingChannel.onmessage = function (msg) {
if (!pc) {
prepareForIncomingCall();
}
var sgnl = JSON.parse(msg.data);
if (sgnl.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(sgnl.sdp), s, e);
answer();
} else if (sgnl.candidate) {
pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
} else {
incoming = sgnl;
}
};
signalingChannel.onmessage = function (msg) {
  if (!pc) { prepareForIncomingCall(); }
  var sgnl = JSON.parse(msg.data);
. . .
};
Signaling channel message is trigger
• If SDP, *also* answer
• But if neither SDP nor an ICE candidate, it must be the set of incoming stream ids, so save it
signalingChannel.onmessage = function (msg) {
. . .
if (sgnl.sdp) {
  pc.setRemoteDescription(
    new RTCSessionDescription(sgnl.sdp), s, e);
  answer();
} else if (sgnl.candidate) {
  pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
} else {
  incoming = sgnl;
}
};
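The callee's handler is a three-way dispatch: SDP triggers the answer, a candidate goes to ICE, and anything else must be the stream-id map. A sketch with the branching isolated — `routeSignal` and the handler object are stand-ins for the real `pc.*` calls and the `incoming` variable:

```javascript
// Route a callee-side signaling message, mirroring the slide's
// onmessage handler: offer SDP, ICE candidate, or stream-id map.
function routeSignal(raw, handlers) {
  var sgnl = JSON.parse(raw);
  if (sgnl.sdp) {
    handlers.onOffer(sgnl.sdp);            // set remote desc, then answer()
  } else if (sgnl.candidate) {
    handlers.onCandidate(sgnl.candidate);  // pc.addIceCandidate(...)
  } else {
    handlers.onStreamIds(sgnl);            // neither: save the id map
  }
}

// Recording handlers to observe the routing order.
var log = [];
var h = {
  onOffer: function (sdp) { log.push("offer"); },
  onCandidate: function (c) { log.push("candidate"); },
  onStreamIds: function (ids) { log.push("ids:" + ids.presenter); }
};
routeSignal(JSON.stringify({ sdp: { type: "offer" } }), h);
routeSignal(JSON.stringify({ candidate: "c1" }), h);
routeSignal(JSON.stringify(
  { presentation: "s1", presenter: "s2", demonstration: "s3" }), h);
```

The catch-all `else` branch works only because the peers agree that exactly three message shapes travel on this channel; adding a fourth message type would require an explicit tag.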
Function prepareForIncomingCall()
• No surprises here
• The media obtained is a little different
• But it is attached the same way
createPC();
getMedia();
attachMedia();
Function answer()
• createAnswer() automatically uses the value of remoteDescription when generating new SDP
pc.createAnswer(gotDescription, e);
function gotDescription(desc) {
  pc.setLocalDescription(desc, s, e);
  signalingChannel.send(JSON.stringify({ "sdp": desc }));
}
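The ordering here matters: the answer is applied locally before it is relayed. A callback-style sketch of the same flow, with the peer connection and send function injected so it can be exercised with stubs (answerWith and the parameter names are illustrative):

```javascript
// Sketch of the answer flow: create the answer, apply it as the local
// description, then relay it over the signaling channel.
function answerWith(pc, send, onError) {
  pc.createAnswer(function gotDescription(desc) {
    pc.setLocalDescription(desc, function ok() {
      send(JSON.stringify({ "sdp": desc })); // relay the answer to the caller
    }, onError);
  }, onError);
}
```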
Laptop browser consumes . . .

• Three input streams
• All have the same # of audio and video tracks
• Need stream ids to distinguish

[Diagram: three incoming MediaStreams (Presentation, Presenter, Demonstration), each carrying an "Audio" track and a video track. The Presentation stream's audio feeds the speaker; all three video tracks feed displays. All video streams selected. Browser L.]
Function handleIncomingStream()
• Use ids to distinguish streams
• Extract one audio and all video tracks
• Assign to element sources
if (st.id === incoming.presentation) {
  speaker.srcObject =
    new MediaStream([st.getAudioTracks()[0]]);
  win1.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
} else if (st.id === incoming.presenter) {
  win2.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
} else {
  win3.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
}
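The id comparison above can be factored into a small routing helper. A hypothetical sketch (pickWindow and the returned window names are illustrative, not slide code):

```javascript
// Given a stream id and the "incoming" metadata object delivered over the
// signaling channel, pick the display window, mirroring the branch above.
function pickWindow(streamId, incoming) {
  if (streamId === incoming.presentation) {
    return "win1";   // presentation video (its audio goes to the speaker)
  } else if (streamId === incoming.presenter) {
    return "win2";   // presenter video
  }
  return "win3";     // demonstration video (the fallback branch)
}
```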
Laptop browser produces . . .

• Three calls to getUserMedia()
• Three calls to new MediaStream()
• No stream ids needed

[Diagram: three sources (webcam, left microphone, right microphone) are captured as MediaStreams with "Video", "Left", and "Right" tracks, then regrouped into three created MediaStreams: Audio & Video, Stereo, and Mono. Browser L.]
Function getMedia() [1]
• Request webcam video
navigator.getUserMedia({"video": true}, function (stream) {
  webcam = stream;
}, e);
. . .
Function getMedia() [2]
• Request left and right audio streams
• Save them as left and right variables
. . .
constraint =
  {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
navigator.getUserMedia(constraint,
  function (stream) { left = stream; }, e);
constraint =
  {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
navigator.getUserMedia(constraint,
  function (stream) { right = stream; }, e);
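The two constraint objects above differ only in the direction value, so they can come from a small builder. A sketch, noting that "audioDirectionEnum" is the slides' hypothetical constraint name, not a standardized one, and audioDirectionConstraint is an illustrative helper:

```javascript
// Build a mandatory audio-direction constraint object of the shape the
// slides pass to getUserMedia().
function audioDirectionConstraint(direction) {
  return { "audio": { "mandatory": { "audioDirectionEnum": direction } } };
}
```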
Function attachMedia()
• Create new streams
• Add them to PC for transmission
av = new MediaStream(
  [webcam.getVideoTracks()[0],  // Video
   left.getAudioTracks()[0],    // Left audio
   right.getAudioTracks()[0]]); // Right audio
stereo = new MediaStream(
  [left.getAudioTracks()[0],    // Left audio
   right.getAudioTracks()[0]]); // Right audio
mono = left; // Treat the left audio as the mono stream
pc.addStream(av);
pc.addStream(stereo);
pc.addStream(mono);
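The regrouping above takes individual captured tracks and bundles them into three outgoing streams. A stub sketch of that grouping, with plain values standing in for MediaStreamTrack objects (composeStreams is an illustrative name, not slide code):

```javascript
// Regroup one video track and two audio tracks into the three track lists
// that attachMedia() wraps in MediaStream objects.
function composeStreams(webcamVideo, leftAudio, rightAudio) {
  return {
    av: [webcamVideo, leftAudio, rightAudio], // video plus both audio channels
    stereo: [leftAudio, rightAudio],          // audio-only stereo pair
    mono: [leftAudio]                         // left channel doubles as mono
  };
}
```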
In real code . . .
• The error callbacks must do something useful
• Methods with callbacks are asynchronous!
  – Need to wait for callbacks before continuing
• Signaling channel is very important
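One way to wait for the three getUserMedia() captures before attaching them is to wrap the callback API in promises. A sketch under that assumption (getAllMedia is an illustrative name; any success/error callback-style capture function works):

```javascript
// Run several callback-style capture requests and resolve once all of them
// have delivered a stream, so attachMedia() can safely run afterwards.
function getAllMedia(getUserMedia, constraintsList) {
  return Promise.all(constraintsList.map(function (c) {
    return new Promise(function (resolve, reject) {
      getUserMedia(c, resolve, reject);
    });
  }));
}
```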
Practical bits
• Signaling
• Audio and video
• Data channel
• More info online
Signaling
• Decision is the most challenging part of WebRTC
• Server-side code provides
  – Locating and identifying the peer browser
  – Message (signaling) relaying
  – Possibly media relaying/transcoding if a TURN server
• Options
  – Write your own server code
    • Anything "real-time" will do
  – Use existing services
    • PubNub, Firebase, Google App Engine
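The relaying role above can be sketched as an in-memory registry mapping peer ids to send functions (makeRelay and the message envelope are illustrative; a real server would sit on WebSockets or a similar real-time transport):

```javascript
// Minimal sketch of server-side signaling relay: locate the peer by id and
// forward the opaque message; the server never interprets the SDP.
function makeRelay() {
  var peers = {};   // peer id -> send function
  return {
    register: function (id, send) { peers[id] = send; },
    relay: function (from, to, message) {
      if (!peers[to]) return false;   // peer not found or not connected
      peers[to](JSON.stringify({ from: from, data: message }));
      return true;
    }
  };
}
```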
Audio and Video
• Very recent agreement on how to send and synchronize multiple flows of the same type
• For interop, keep it to one of each only
• Still no agreement on codecs
  – Chrome: VP8 and H.264
  – Firefox: VP8
Data Channel
• Minimum safe chunk size still under debate
  – A few kB should be okay
• Protocol field is for app use
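Sender-side chunking for large data channel messages can be sketched as a simple splitter (chunkMessage is an illustrative name; 16 KiB is a commonly cited safe chunk size, though the slides only commit to "a few kB"):

```javascript
// Split an outgoing message into fixed-size chunks so no single send()
// exceeds the agreed chunk size; the receiver concatenates them.
function chunkMessage(str, chunkSize) {
  var chunks = [];
  for (var i = 0; i < str.length; i += chunkSize) {
    chunks.push(str.slice(i, i + chunkSize));
  }
  return chunks;
}
```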
Great online resources
• Informational sites
  – webrtc.org
  – html5rocks.com/en/tutorials/webrtc/basics
  – webrtchacks.com
• Games/demos/apps
  – www.cubeslam.com
  – shinydemos.com/facekat
  – sharefest.me (github.com/Peer5/Sharefest)
Training and Certification

The WebRTC School is the web's premier location for WebRTC Integrator and Developer education. Associated with the training programs are the industry-supported certifications, the WebRTC School Qualified Integrator (WSQI™) and the WebRTC School Qualified Developer (WSQD™). For more information and online demos, please visit

http://www.webrtcschool.com
Questions?

http://webrtcbook.com