WEBRTC WG Meetings
TPAC
November 6-7, 2017
Burlingame, CA
1
Chairs: Stefan Hakansson
Bernard Aboba
Harald Alvestrand
Welcome to the WebRTC WG Meetings at TPAC!
2
W3C WG IPR Policy
3
About These Meetings
4
Monday TPAC Agenda
5
Tuesday TPAC Agenda
6
WebRTC Testing
W3C TPAC
November 6, 2017
Burlingame, CA
7
Presenters: Huib, Alexandre, Soares
About This Session
8
Path to WebRTC 1.0
Specs → Tests → Compliance → Reliability
Huib
Spec status
W3C spec                            | Status | IETF spec          | Status
WebRTC 1.0                          | CR     | JSEP               | → RFC
Media Capture and Streams           | CR     | Data Channel       | ~RFC
Identifiers for WebRTC's Statistics | WD     | RTP Usage          | ~RFC
                                    |        | Transports         | ~RFC
                                    |        | Audio/Video codecs | RFC
                                    |        | Requirements       | RFC
Test tracks: WPT and Interop testing
[Diagram: Web Platform Tests exercise the JavaScript API layer; KITE interop tests exercise the full stack (HTTP, SRTP/SCTP, STUN/TURN).]
Compliance
Webrtc.org status
ETA: mostly complete by end of Q1 2018
Reliability
Web Platform Tests Progress
Coverage Status - 2017
$ cd webrtc/tools
$ node scripts/overview.js
Overall Coverage
====================
todo       |    248
tested     |    315
trivial    |    173
untestable |     79
====================
total      |    815
coverage   | 69.57%
====================
4. Peer-to-peer connections | 67.83% |
5. RTP Media API | 67.01% |
6. Peer-to-peer Data API | 71.87% |
7. Peer-to-peer DTMF | 93.54% |
8. Statistics Model | 100.00% |
9. Identity | 86.04% |
10. Media Stream API Extensions for Network Use | 35.71% |
Challenges to reach 100% coverage
Path to 100% coverage
Next Steps
Interop testing
19
Thank you
Special thanks to:
W3C/MIT for WebEx
Huib, Dr. Alex, Soares
20
Test As You Commit Policy
W3C TPAC
November 6, 2017
Burlingame, CA
21
Presenters: Bernard
For Discussion in this session
22
23
rwaldron.github.io/webrtc-pc/
24
wpt.fyi (soon much less red)
25
Automation using WebDriver
26
Testing policy adoption (>50%!)
27
WebRTC-PC Session
W3C TPAC
November 6, 2017
Burlingame, CA
28
Presenters: Peter, Bernard
For Discussion in this session
29
Issue 1625/PR 1632: RTCPriorityType undesirably combines relative bitrate with QoS priority (Peter)
Problem: You want to send something with high priority for QoS but with fewer bits.
RTCRtpEncodingParameters.priority controls both, making higher QoS = more bits.
Examples:
Also: the ratios of 1:2:4:8 are not granular enough to be very useful.
30
Issue 1625/PR 1632: RTCPriorityType undesirably combines relative bitrate with QoS priority (Peter)
Proposal: break up "more bits" and "higher QoS" into two separate controls
relativeBitrate of type double
Indicates the relative amount of bitrate that this encoding should be allocated when congestion occurs, relative to other encodings being sent under the same congestion control regime. For example, if two encodings use values of 1.0 and 1.5, respectively, and the congestion controller determines that 5Mbps are available to allocate, the encodings should be allocated 2Mbps and 3Mbps, respectively. The encoding may also be further constrained by other limits (such as maxBitrate or per-transport or per-session bandwidth limits), resulting in it using less than its available share.
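The worked example above can be sketched as a small allocation function; this is a hypothetical illustration of the proposed semantics (the function name and cap handling are assumptions), not spec text:

```javascript
// Hypothetical sketch of the proposed relativeBitrate semantics (PR 1632):
// split the available bitrate across encodings in proportion to their
// relativeBitrate, then apply per-encoding caps such as maxBitrate.
function allocateBitrate(availableBps, encodings) {
  const total = encodings.reduce((sum, e) => sum + e.relativeBitrate, 0);
  return encodings.map(e => {
    const share = availableBps * e.relativeBitrate / total;
    // The encoding may be further constrained by other limits, so it can
    // end up using less than its available share.
    return Math.min(share, e.maxBitrate ?? Infinity);
  });
}

// The 5 Mbps example from the text: encodings at 1.0 and 1.5 get 2 and 3 Mbps.
console.log(allocateBitrate(5000000, [
  { relativeBitrate: 1.0 },
  { relativeBitrate: 1.5 },
])); // [ 2000000, 3000000 ]
```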
31
Issue 1625/PR 1632: RTCPriorityType undesirably combines relative bitrate with QoS priority (Peter)
Peter's recommendation: split "priority" into "QoS priority" and "bitrate priority". More flexible bitrate ratios via "relativeBitrate" seems like an elegant solution and nice bonus.
32
Issue 1635: Need for initial bitrate by the application/ RtpSender (Peter)
Should we add a way for the app to say "I think the bandwidth estimate is X. Start there"?
If so, where does it go and what does it mean? It's complicated.
What if you're not bundling?
What does it do to SCTP?
What happens if you switch ICE network routes?
33
Issue 1635: Need for initial bitrate by the application/ RtpSender (Peter)
Peter's recommendation: Leave it out (for now?). We can try something in WebRTC NV (perhaps we'll have a separate RtpTransport where this would make sense).
34
Issue 1194: AudioLevel of tracks, both send and receive (Peter)
Would RTCRtpReceiver.getSynchronizationSources() be useful for getting the audio level of the most recently received packet?
Or would it make more sense to get the audio level from RTCRtpReceiver.track (either using WebAudio or adding a new event to MediaStreamTrack)?
35
Issue 1194: AudioLevel of tracks, both send and receive (Peter)
Peter's recommendation: leave it off RtpSender. Put it on MediaStreamTrack, or just let WebAudio handle it. Even the initial bug reporter (Henrik Boström) said that's fine.
36
Issue 1644: Adding more values to RTCIceTransportPolicy Enum (Peter)
Two separate things going on:
37
Issue 1644: Adding more values to RTCIceTransportPolicy Enum (Peter)
Peter's recommendation: Leave it out. Maybe try to handle it with the permissions spec.
38
Issue 1586: behavior of OfferToReceive* set to false (Jan-Ivar)
History: offerToReceive was removed when Transceivers were introduced
Questions:
39
Issue 1646: Isolated Media Streams require modification to permission algorithms (soareschen)
40
WebRTC-Stats Session
W3C TPAC
November 6, 2017
Burlingame, CA
41
Presenters: Varun, Harald
We’ve made a lot of progress
Major changes
Path to CR
Open issues to discuss at this meeting
Issue 99/PR 251: Security and privacy considerations
Issue 99/PR 251: Security and privacy considerations
47
Issue 177: Caching and consistency of getStats()
Issue 131/PR 262: “objectDeleted” marker
This makes explicit the model that stats exist for objects that are gone.
Is this marker useful? Correctly defined?
CR blocker?
Editors think this is ready to merge.
Issue 231/PR 273: Sender & Receiver stats vs Track stats
Track stats were introduced before senders/receivers. Later we made track stats per-attachment to the PC as a compromise. A more accurate view is to have stats per sender, receiver and track objects.
This is an issue because ReplaceTrack will make the two stats different
Much simpler; removes the need for "detached" and the accumulation of stale objects.
Have to decide one way or the other where we want to end up
Might present all stats on both objects for a while as a transition strategy?
Issue 230/PR 272: RTCMediaStreamTrackStats to 4 dicts
Removes "Only valid for..." prose in nearly every RTCMediaStreamTrackStats member by using WebIDL instead (dictionary inheritance).
kind and remoteSource stay. API unaffected. Minor JS observable change (order).
(If we go for #231 remoteSource disappears naturally: remoteSource == "receiver").
Editors like it.
Issue 133/135: DSCP information
Not CR blocker
Gives information that can diagnose remarking and codepoint-blocking problems.
Belongs on RTCTransportStats or RTCIceCandidatePairStats?
Information is not accessible on all platforms.
Proposal: Add a record<USVString, long>, where the key is the DSCP codepoint in string-numeric form and the value is the number of packets sent/received with that marking.
Alternative: define an interface that is a maplike<short, long>, where the key is the DSCP codepoint and the value is the number of packets sent/received.
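A sketch of what consuming the proposed record-shaped stat might look like (the codepoints, counts, and the 90% threshold below are made up for illustration; neither variant is in the spec yet):

```javascript
// Hypothetical example of consuming the proposed record<USVString, long> stat:
// keys are DSCP codepoints in string-numeric form, values are packet counts.
const packetsSentPerDscp = { "0": 12000, "46": 340 }; // 46 = EF

// Total packets across all codepoints.
const total = Object.values(packetsSentPerDscp)
  .reduce((sum, count) => sum + count, 0);

// If nearly everything lands on codepoint 0, upstream remarking is likely.
const remarked = (packetsSentPerDscp["0"] ?? 0) / total > 0.9;
console.log(total, remarked); // 12340 true
```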
Issue 202: concealedAudibleSamples
Concealed samples are samples synthesized to conceal packet loss.
concealedAudibleSamples are such samples synthesized during “audible” portions of the stream, such as when someone is speaking.
Hard to find a definition of “audible” that’s comprehensible and supportable.
PR #215 is a proposal
Not a CR blocker
Issue 1613: Stats and Isolated Streams (varun)
54
SVC use cases
for an SFU developer
Sergio Garcia Murillo - CosMo Software
SVC Use cases
Simulcast and SVC use cases are very similar (if not identical): content adaptation without transcoding
Adaptation will happen as a combination and/or trade off of the following items:
Typically you would always prefer the best quality, with the largest image size and the most fps, but most of the time you will be restricted and have to adapt the video stream to match certain limits.
Another interesting use case is increasing reliability against packet loss by applying different protection levels to each layer, maximizing the probability that at least the base layer is received.
Content Adaptation (I)
Bitrate adaptation is the most common one:
Image size adaptation is also very common:
Content Adaptation (II)
Adapt on FPS and/or Quality
Adapt based on decoding complexity
What do SFU developers need?
Media Capture Session
W3C TPAC
November 6-7, 2017
Burlingame, CA
60
Presenters: Peter T, Jan-Ivar, Peter B
For Discussion In This Session
61
Scenario: Sender has a 16:9 track. Receiver wants to receive a 1:1 track.
Solution we don't want: something on RtpSender to change 16:9 into 1:1 when sending it.
Solution we do want: a way to get a 1:1 track from a 16:9 track.
// Or maybe height and width instead of aspectRatio
track.applyConstraints({aspectRatio: {exact: 1.0}})
Problem: does the app prefer adding pixels (pillar/letterbox) or removing pixels (crop)? �Related problem: is the browser allowed to scale to avoid overly adding or removing?
62
Proposal: let the app tell you what it's happy with, via a new constraint.
Let's call it resizeMode.
enum ResizeModeEnum {
  "crop-and-scale", // Removes pixels; allows scaling
  "box-and-scale",  // Adds pixels; allows scaling
  "none",           // No adding or removing; no scaling
};
track.applyConstraints({aspectRatio: {exact: 1.0},
                        resizeMode: "crop-and-scale"})

Last question: what's the default? I prefer "crop-and-scale".
63
Issue 472: No constraint defaults. When do we downscale? (Jan-Ivar)
Two models being explored today:
64
Issue 470: Does getSettings() reflect configured or actual settings? (Jan-Ivar)
65
Issue 466: Question about setting belong to source in Section 3 (Jan-Ivar)
66
Issue 470: Does getSettings() reflect configured or actual settings? (Jan-Ivar)
67
Issue 441: Interop for muted/track.enabled=false tracks (Jan-Ivar)
68
Issue 478: Content hints for MediaStreamTrack (Peter Boström)
69
Issue 478: Content hints for MediaStreamTrack (Peter Boström)
70
Issue 1533: Live RTCRtpContributingSource confusing (Jan-Ivar)
dictionary RTCRtpContributingSource {
  DOMHighResTimeStamp timestamp;
  unsigned long source;
  byte? audioLevel;
};
dictionary RTCRtpSynchronizationSource : RTCRtpContributingSource {
  boolean? voiceActivityFlag;
};
71
Splitting Priority Part 2
draft-ietf-rtcweb-transports-17 says "The priority settings affect two pieces of behavior: Packet send sequence decisions and packet markings. Each is described in its own section below". We can change that to say "there are two priority settings":
https://github.com/rtcweb-wg/rtcweb-transport/pull/50
Meanwhile, we can add this to RtpEncodingParameters. The existing priority sets both, except when overridden by either or both.
https://github.com/w3c/webrtc-pc/pull/1659
RTCPriorityType bitratePriority = "low";
RTCPriorityType markingPriority = "low";
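A minimal sketch of the proposed override semantics, assuming the PR 1659 field names and that the existing priority acts as the fallback for both new fields (this is an illustration of the proposal, not spec text):

```javascript
// Sketch of the proposed override semantics (assumed, not spec text): the
// existing "priority" sets both values unless bitratePriority or
// markingPriority is given explicitly; "low" is the dictionary default.
function resolvePriorities(encoding) {
  return {
    bitratePriority: encoding.bitratePriority ?? encoding.priority ?? "low",
    markingPriority: encoding.markingPriority ?? encoding.priority ?? "low",
  };
}

console.log(resolvePriorities({ priority: "high" }));
// { bitratePriority: 'high', markingPriority: 'high' }
console.log(resolvePriorities({ priority: "high", markingPriority: "very-low" }));
// { bitratePriority: 'high', markingPriority: 'very-low' }
```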
72
Media Capture, WebRTC-PC and WebRTC-Stats Next Steps
W3C TPAC
November 7, 2017
Burlingame, CA
73
Presenters: Bernard, Harald, Varun
For Discussion Today
74
Media Capture Next Steps
75
WebRTC-PC Status
76
WebRTC-PC Next Steps
77
Other specifications
78
WebRTC-Stats Status
79
WebRTC-Stats Next Steps
80
WebRTC-Stats Experimentation
81
WebRTC New Work
W3C TPAC
November 7, 2017
Burlingame, CA
82
Presenters: Peter, Sergio
For Discussion Today
83
Objectives for this session
84
Things definitely "post 1.0"
85
QUIC data channels
Idea: Send data with QUIC instead of SCTP + DTLS
86
Proposal: add QUIC data channels to WebRTC
87
Optional long-term work not proposed here
88
QuicTransport
Like DtlsTransport: does a crypto handshake based on local certificates and remote fingerprints.
Like SctpTransport: creates data channels/streams and has an event for when the remote side creates channels/streams
89
QuicTransport
[Constructor(IceTransport, sequence<Certificate>)]
interface QuicTransport {
  readonly attribute QuicTransportState state;
  QuicParameters getLocalParameters();
  sequence<ArrayBuffer> getRemoteCertificates();
  void start(QuicParameters remoteParameters);
  void stop();
  QuicStream createStream();
  attribute EventHandler onstream;
  attribute EventHandler onstatechange;
};
90
Example: setup
var ice = ...;
var certs = ...;
var quic = new QuicTransport(ice, certs);
var remoteParameters = signalExchange(quic.getLocalParameters());
quic.start(remoteParameters);
quic.onstatechange = (evt) => {
  if (evt.state == "connected") {
    // :)
  }
}
91
QuicStream
92
QuicStream
interface QuicStream {
  readonly attribute QuicStreamState state;
  readonly attribute unsigned long readBufferedAmount;
  readonly attribute unsigned long writeBufferedAmount;
  void setTargetReadBufferedAmount(unsigned long amount);
  unsigned long readInto(Uint8Array data);
  void write(Uint8Array data);
  void finish();
  void reset();
  attribute EventHandler onstatechange;
  Promise waitForReadable(unsigned long amount); // like waitForReadBufferedAmountBelow
  Promise waitForWritable(unsigned long amount, ...); // like waitForWriteBufferedAmountBelow
}
93
Example: small, unordered, unreliable
// Send side for each message
var qstream = quic.createStream();
qstream.write(data);
qstream.finish();
// Could be replaced with a max time or max rtx policy
setTimeout(() => qstream.reset(), 5000);
// Receive side for each message
quic.onstream = (qstream) => {
qstream.readInto(buffer);
};
94
Example: big, reliable, ordered
let qstream = quic.createStream();
for (const chunk of chunks) {
  await qstream.waitForWritable(chunk.byteLength, targetBuffer); // back pressure
  qstream.write(chunk);
}
qstream.finish();

quic.onstream = async (qstream) => {
  await qstream.waitForReadable();
  while (qstream.state == "open") {
    qstream.readInto(buffer); // back pressure
    await qstream.waitForReadable();
  }
}
95
Example: ordered data channels
Could be defined in IETF, but doesn't need to be. Can be implemented in JS.
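For example, a JS implementation could length-prefix messages on a single reliable, ordered QuicStream. This framing helper is a hypothetical sketch (the 4-byte big-endian prefix is an assumption), not a defined wire format:

```javascript
// Hypothetical framing for an ordered data channel over one reliable
// QuicStream: length-prefix each message so the receiver can split the
// byte stream back into messages.
function frameMessage(bytes) {
  const out = new Uint8Array(4 + bytes.length);
  new DataView(out.buffer).setUint32(0, bytes.length); // big-endian length
  out.set(bytes, 4);
  return out;
}

// Yields each complete message found in the received bytes so far.
function* parseMessages(buffer) {
  const view = new DataView(buffer.buffer, buffer.byteOffset, buffer.byteLength);
  let offset = 0;
  while (offset + 4 <= buffer.length) {
    const len = view.getUint32(offset);
    if (offset + 4 + len > buffer.length) break; // incomplete tail; wait for more
    yield buffer.subarray(offset + 4, offset + 4 + len);
    offset += 4 + len;
  }
}
```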
96
Example: unordered channels
Could be defined in IETF, but doesn't need to be. Can be implemented in JS.
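One hypothetical JS approach: send each message on its own QuicStream, prefixed with a channel id so the receiver can demultiplex. Again a sketch (the 16-bit id prefix is an assumption), not a defined wire format:

```javascript
// Hypothetical demux for unordered channels: each message travels on its
// own QuicStream, prefixed with a 16-bit channel id.
function encodeChannelMessage(channelId, payload) {
  const out = new Uint8Array(2 + payload.length);
  new DataView(out.buffer).setUint16(0, channelId); // big-endian channel id
  out.set(payload, 2);
  return out;
}

function decodeChannelMessage(bytes) {
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  return { channelId: view.getUint16(0), payload: bytes.subarray(2) };
}
```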
97
Roadblocks
98
QUIC transport protocol stabilization
Still stabilizing with regard to:
99
Demuxing with RTP/RTCP, DTLS, and ICE
Good news: it probably can be done with no change to QUIC:
draft-aboba-avtcore-quic-multiplexing-01
But it could be cleaner with changes to QUIC or an extra byte of framing. Read the draft if you're interested.
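For reference, first-byte demultiplexing in the style of RFC 7983 can be sketched as below; the ranges shown are the RFC 7983 ones, and the draft discusses how QUIC packets fit alongside them:

```javascript
// First-byte demux in the style of RFC 7983 (ZRTP and TURN channel ranges
// omitted for brevity). The draft above discusses where QUIC packets fit
// without changing QUIC itself.
function classifyPacket(firstByte) {
  if (firstByte <= 3) return "stun";
  if (firstByte >= 20 && firstByte <= 63) return "dtls";
  if (firstByte >= 128 && firstByte <= 191) return "rtp"; // RTP/RTCP
  return "unknown";
}
```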
100
IceTransport
A QuicTransport needs an IceTransport. Where does it get it?
The easiest thing to do is to allow constructing an IceTransport without a PeerConnection.
With something like the next slide.
101
IceTransport
partial interface IceTransport {
  void start(IceGatherer, IceParameters remote, IceRole);
  void stop();
  void addRemoteCandidate(IceCandidate);
}

interface RTCIceGatherer {
  IceParameters getLocalParameters();
  void gather(optional RTCIceGatherOptions);
  void close();
  attribute EventHandler onlocalcandidate;
}
102
IceTransport example
var gatherer = new IceGatherer();
gatherer.onlocalcandidate = signalLocalIceCandidate;
gatherer.gather();
signalLocalIceParameters(gatherer.getLocalParameters());

var ice = new IceTransport();
onSignaledRemoteIceParameters = (remoteParameters) => {
  ice.start(gatherer, remoteParameters, role);
};
onSignaledRemoteIceCandidate = (remoteCandidate) => {
  ice.addRemoteCandidate(remoteCandidate);
}
103
Summary of Proposal
Add QuicTransport
Add QuicStream
Add IceTransport constructor/methods
Add IceGatherer
Great! Now we'll have p2p QUIC data streams in WebRTC.
Question: Should this be added to the CR, or in a separate doc? (or hybrid: ICE in CR; QUIC in separate doc)?
104
Fine-grained media stack control (not SDP)
Goals:
105
One last attempt at content hint
Use case | Effect on PC (especially w/ balanced)
Has a capture card, sets "detailed" | Gets same behavior as screencast: high resolution/quality; fps degrades; denoising off; maybe special codec mode
Has a game screencapture, sets "motion" | Gets same behavior as camera: high fps; resolution/quality degrades; denoising off
Has a streaming music app, sets "music" | Gets intelligibility enhancement off (and other non-standard implicit speech processing); noise suppression; special codec mode; higher bitrate
106
One last attempt at content hint
Use case | Effect on MediaRecorder
Has a capture card, sets "detailed" | Gets same implicit encoding behavior as tab / screen capture (can’t be overridden otherwise).
Has a game screencapture, sets "motion" | Gets same implicit encoding preferences as UVC (can’t be overridden otherwise).
Has a music app, sets "music" | More bits allocated to the audio stream (w/o audioBitsPerSecond set). Default codec set as appropriate for music if not provided (user doesn’t need to know what Opus is or a good music target bitrate for specific codecs).
107
Wrap-up and Action Items
W3C TPAC
November 7, 2017
Burlingame, CA
108
Presenters: Bernard, Stefan, Harald
Wrap-up items
Mediacapture-main still references allowUserMedia:
<iframe allowUserMedia=true sandbox="allow-scripts" src="/foo">
Chrome 64 implements* a different syntax for iframes as part of Feature Policy:
<iframe allow="camera; microphone" sandbox="allow-scripts" src="/foo">
Firefox and Safari are looking to do the same
*) Chrome warns: “VideoCapture permission has been blocked because of a Feature Policy applied to the current document. See https://goo.gl/EuHzyv for more details.”
Action Items
* SVC + Simulcast new features: Not considering new features for WebRTC 1.0 (could be considered as a separate document)
* QUIC: QUIC Transport document, ICE transport extension and consider an ICE constructor in WebRTC
* Preference for Scary A over Scary B
* Project snowflake: put together an IETF document on a subset of ICE, and a W3C API document to access that subset.
More scary proposal: audio/video over QUIC
Two options:
Option A: low-level access to encoders/decoders
Option B: higher-level access to send/receive
112
Option A: low-level access to encoders/decoders
interface VideoEncoder {
  Promise<EncodedVideoFrame> encodeVideoFrame(track, VideoEncodeParameters);
}
interface VideoDecoder { // Also a jitter buffer
  void decodeVideoFrame(EncodedVideoFrame frame);
  readonly attribute MediaStreamTrack decodedVideo;
}
113
Option A: low-level access to encoders/decoders
// To send a video frame
var encoded_frame = await encoder.encodeVideoFrame(track, {bitrate: ...});
var serialized_frame = serialize_frame(encoded_frame);
var qstream = quicTransport.createStream();
qstream.write(serialized_frame); qstream.finish();

// To receive a video frame
var buffer = ...; // Anyone remember how to make a buffer?
quicTransport.onstream = (qstream) => {
  qstream.readInto(buffer);
  var encoded_frame = deserialize_frame(buffer);
  decoder.decodeVideoFrame(encoded_frame);
}
render_track(decoder.decodedVideo);
114
Option A: low-level access to encoders/decoders
dictionary VideoEncodeParameters {
  unsigned long bitrate;
  boolean generateKeyFrame;
  // TODO: resolutionScale, framerateScale, ...
}
dictionary EncodedVideoFrame {
  unsigned long id;
  unsigned short width;
  unsigned short height;
  unsigned short rotation;
  unsigned long capturedTimestampMs;
  ByteArray encodedData;
  // TODO: SVC, vp8, vp9, h264, and h265-specific parameters, ...
}
115
Option A: low-level access to encoders/decoders
interface AudioEncoder {
  Promise<EncodedAudioFrame> encodeAudioFrame(track, AudioEncodeParameters);
}
interface AudioDecoder { // Also a jitter buffer
  void decodeAudioFrame(EncodedAudioFrame);
  readonly attribute MediaStreamTrack decodedAudio;
}
116
Option A: low-level access to encoders/decoders
// Send an audio frame
var encoded_frame = await encoder.encodeAudioFrame(track, {frameSize: 20});
var serialized_frame = serialize_frame(encoded_frame);
var qstream = quicTransport.createStream();
qstream.write(serialized_frame); qstream.finish();

// Receive an audio frame
quicTransport.onstream = function(qstream) {
  var buffer = ...; // Anyone remember how?
  qstream.readInto(buffer);
  var encoded_frame = deserialize_frame(buffer);
  decoder.decodeAudioFrame(encoded_frame);
}
playout_track(decoder.decodedAudio);
117
Option A: low-level access to encoders/decoders
dictionary AudioEncodeParameters {
  unsigned long frameSize; // aka ptime, in ms
  unsigned long bitrate;
}
dictionary EncodedAudioSamples {
  // Effectively a start and end time stamp, but in terms of a "sample rate clock"
  unsigned long startSampleIndex;
  // Theoretical endSampleIndex == startSampleIndex + sampleCount
  unsigned long sampleCount;
  ByteArray encodedData;
  // TODO: opus-specific parameters?
}
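A quick worked example of the "sample rate clock": at 48 kHz with a 20 ms frame size, each frame covers 960 samples, so startSampleIndex advances by sampleCount per frame (the rate and frame size here are illustrative):

```javascript
// Worked example of the sample rate clock (illustrative numbers): at 48 kHz
// with a 20 ms frame size, each frame covers 960 samples, so consecutive
// frames advance startSampleIndex by sampleCount.
const sampleRate = 48000;
const frameSizeMs = 20;
const sampleCount = sampleRate * frameSizeMs / 1000; // 960

const frame0 = { startSampleIndex: 0, sampleCount };
const frame1 = { startSampleIndex: frame0.startSampleIndex + frame0.sampleCount,
                 sampleCount }; // startSampleIndex: 960
```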
118
Option B: high-level access to send/receive
interface VideoSender {
  void setTransport(QuicTransport);
  void setTrack(MediaStreamTrack);
  void sendVideo(VideoSendParameters);
}
dictionary VideoSendParameters {
  DOMString muxId;
  DOMString codecId; // Obtained from capabilities
  unsigned long maxBitrate;
  RTCDegradationPreference degradationPreference = "balanced";
  // TODO: fec, resolutionScale, framerateScale
}
119
Option B: high-level access to send/receive
interface VideoReceiver {
  readonly attribute MediaStreamTrack track;
  void setTransport(QuicTransport);
  Promise receiveVideo(); // Resolves when the remote side sends a stream
}
120
Scary ICE
API to:
Advantages:
121
For extra credit
124
Name that bird!
Thank you
Special thanks to:
W3C/MIT for WebEx
WG Participants, Editors & Chairs
The bird
125