W3C WebRTC
WG Meeting
September 25, 2018
1 PM - 2:30 PM Pacific Time
Chairs: Bernard Aboba
Harald Alvestrand
W3C WG IPR Policy
Welcome!
About this Virtual Meeting
Information on the meeting:
A Word of Thanks
Best wishes to Stefan Hakansson, who after 7 years as a WebRTC WG Chair has now stepped down.
Stefan’s contributions are much appreciated, and he will be greatly missed!
TPAC (Lyon, France)
We will meet Monday, October 22 and Tuesday, October 23 during TPAC week.
Reviews are pouring in for….
WebRTC Next Version Use Cases:
https://w3c.github.io/webrtc-nv-use-cases/
“Like Moby Dick, without the whale.”
“It would have been a pleasure to burn, if it was worth printing.”
File your Issues today!
https://github.com/w3c/webrtc-nv-use-cases/issues
For Discussion Today
For Discussion Today (cont’d)
enum PermissionName {
  "camera",
  "microphone",
  "display", // ← so we need this
  …
};

...getDisplayMedia already assumes this (search for "display" with quotes), and already says "The User Agent MUST NOT create a permission storage entry with a value of "granted"."
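For illustration only, a sketch of what this would enable, assuming "display" were added to PermissionName and queried the way "camera" and "microphone" are today:

const status = await navigator.permissions.query({name: "display"});
console.log(status.state); // expected "prompt"; per the text above, never persisted as "granted"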
Remote participants hearing themselves in the screen capture makes the stream unusable.
Security concerns
Full-screen/browser sharing is scary!
Not just passive threats.
If a web surface under site control is captured, that website has the keys to the car: it can effectively navigate iframes as the logged-in user.
Sidesteps cross-origin protections.
Firefox warns, but hard to explain 👉
Google “share screen trust” for more
That said, might it be OK to allow influencing away from “scary” sources?
await navigator.getDisplayMedia({video: {displaySurface: "window"}});      // OK
await navigator.getDisplayMedia({video: {displaySurface: "application"}}); // OK
await navigator.getDisplayMedia({video: {displaySurface: "monitor"}});     // TypeError
await navigator.getDisplayMedia({video: {displaySurface: "browser"}});     // TypeError
Any use case for it? (Sharing e.g. a Google Doc is sadly a browser surface.)
Is this a slippery slope?
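A sketch (not current spec) of how a page might steer capture toward a less "scary" surface under this proposal, assuming the constraint behavior shown above:

try {
  const stream = await navigator.getDisplayMedia({video: {displaySurface: "window"}});
  // attach stream to an RTCPeerConnection sender, a <video> element, etc.
} catch (e) {
  // NotAllowedError if the user declines; under the proposal, asking for
  // "monitor" or "browser" instead would reject with a TypeError.
  console.error(e.name);
}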
The life-cycle of MediaStreamTracks lends itself to getDisplayMedia(): mute/unmute is a temporary state, ended is permanent.
Minimized windows: “hidden” or special case of “covered”?
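A sketch of how an application could observe that life-cycle (the handler behavior is illustrative, not from the slide), given a stream from getDisplayMedia():

const [track] = stream.getVideoTracks();
track.onmute = () => console.log("source temporarily unavailable (covered or minimized?)");
track.onunmute = () => console.log("source visible again");
track.onended = () => console.log("capture ended permanently");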
For Discussion Today
Issue 1858: What happens when an answerer stops a transceiver that others are “bundled” on? (Bernard)
Issue 1858: What happens when an answerer stops a transceiver that others are “bundled” on? (cont’d)
Issue 1858: What happens when an answerer stops a transceiver that others are “bundled” on? (cont’d)
Issue 1888: RTCPriorityType is not documented for simulcast (Harald)
Group preference: ?
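For context, a hedged sketch of where the ambiguity shows up: priority already exists on RTCRtpEncodingParameters, but the spec does not say how it applies across multiple simulcast encodings (the rids below are made up):

const {sender} = pc.addTransceiver(track, {
  sendEncodings: [
    {rid: "low", scaleResolutionDownBy: 4, priority: "low"},
    {rid: "high", priority: "high"}
  ]
});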
Issue 1896: Order of RTCRtpSendParameters.encodings is not described (Harald)
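A sketch of the question (identifiers are illustrative): does getParameters().encodings come back in the order the sendEncodings were supplied, or in some other order?

const transceiver = pc.addTransceiver(track, {
  sendEncodings: [{rid: "q"}, {rid: "h"}, {rid: "f"}]
});
const {encodings} = transceiver.sender.getParameters();
console.log(encodings.map(({rid}) => rid)); // order currently undescribed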
Issue 1930: Rename sender.transport.transport to sender.transport.iceTransport? (Jan-Ivar)
Two nested attributes with the same name are unintuitive / hard to read:
pc.getTransceivers()[0].sender.transport.transport; // whah?
Can we rename it?
pc.getTransceivers()[0].sender.transport.iceTransport; // ah!
Edge already implements this, and would thus be affected.
But with WebRTC for ORTC already shimmed in adapter, is this fixable? Shim:
sender.transport.iceTransport || sender.transport.transport;
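For illustration, a minimal polyfill sketch along those lines (assuming the rename ships and an engine still exposes only the old .transport name):

if (window.RTCDtlsTransport && !("iceTransport" in RTCDtlsTransport.prototype)) {
  Object.defineProperty(RTCDtlsTransport.prototype, "iceTransport", {
    get() {
      return this.transport; // fall back to the old name
    }
  });
}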
Issue 1982: Missing normative steps for determining codecs (Jan-Ivar)
sender.getParameters today: "The codecs sequence is populated based on the codecs that have been negotiated for sending, and which the user agent is currently capable of sending"
receiver.getParameters today: "The codecs sequence is populated based on the codecs that the receiver is currently prepared to receive"
But these are synchronous methods. Proposal:
sender.getParameters: "codecs is set to the value of [[SendCodecs]]"
receiver.getParameters: "codecs is set to the value of [[ReceiveCodecs]]"
...and have SRD(Answer) and SLD(Answer):
"Set [[SendCodecs]] to the codecs that have been negotiated for sending, and which the user agent is currently capable of sending" and
"Set [[ReceiveCodecs]] to the codecs that have been negotiated for receiving, and which the user agent is currently prepared to receive" ?
E.g.:
  console.log(sender.getParameters().codecs.length); // 0
  await pc.setRemoteDescription(msg.offer);
  console.log(sender.getParameters().codecs.length); // 0
  await pc.setLocalDescription(await pc.createAnswer());
  console.log(sender.getParameters().codecs.length); // 3
Issue 1983: getSynchronizationSources and getContributingSources should work for video too (henbos)
getSynchronizationSources() and getContributingSources() return the timestamps of the most recent RTP packets. They are currently audio-only, as a means to surface audioLevel.
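A sketch of what the proposal would allow (video support is the ask here; member names assume the existing dictionary):

const receiver = pc.getReceivers().find(({track}) => track.kind === "video");
setInterval(() => {
  for (const {timestamp, source} of receiver.getSynchronizationSources()) {
    console.log(timestamp, source); // audioLevel would remain audio-only
  }
}, 1000);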
For extra credit
Name that bird!
Thank you
Special thanks to:
W3C/MIT for WebEx
WG Participants, Editors & Chairs
The bird