1 of 38

W3C WebRTC

WG Meeting

October 21, 2025

8 AM - 10 AM


Chairs: Jan-Ivar Bruaroey, Youenn Fablet, Guido Urdaneta

2 of 38

W3C WG IPR Policy


3 of 38

Welcome!

  • Welcome to the October 2025 interim meeting of the W3C WebRTC WG
    • Note: new meeting link! Check invite
  • Future meetings:
    • TPAC 2025 https://www.w3.org/2025/11/TPAC/schedule.html#tuesday
      • Tuesday November 11, 16:30-18:00 JST
      • Thursday November 13, 09:00-10:30 JST
      • Thursday November 13, 13:45-15:00 JST (Media WG Joint Meeting)
    • December 9


4 of 38

About this Virtual Meeting


5 of 38

W3C Code of Conduct

  • This meeting operates under the W3C Code of Ethics and Professional Conduct
  • We're all passionate about improving WebRTC and the Web, but let's all keep the conversations cordial and professional


6 of 38

Virtual Interim Meeting Tips

This session is (still) being recorded

  • Click to get into the speaker queue.
  • Click to get out of the speaker queue.
  • Please wait for microphone access to be granted before speaking.
  • If you jump the speaker queue, you will be muted.
  • Please use headphones when speaking to avoid echo.
  • Please state your full name before speaking.
  • Poll mechanism may be used to gauge the “sense of the room”.


7 of 38

Understanding Document Status

  • Hosting within the W3C repo does not imply adoption by the WG.
    • WG adoption requires a Call for Adoption (CfA) on the mailing list.
  • Editor’s drafts do not represent WG consensus.
    • WG drafts do imply consensus, once they’re confirmed by a Call for Consensus (CfC) on the mailing list.
    • PRs that may lack consensus can still be merged if a note indicating the controversy is attached.


8 of 38

Issues for Discussion Today

  • 08:10 - 08:20 AM WebRTC-pc remote track mute (Henrik)
  • 08:20 - 08:40 AM WebRTC-extensions receiver.on[c/s]srcchange (Henrik)
  • 08:40 - 09:00 AM WebRTC-extensions 5G network slicing (Youenn)
  • 09:00 - 09:30 AM SFrame processing model (Youenn)
  • 09:30 - 09:55 AM MediaCapture-main/extensions (Jan-Ivar)
  • 09:55 - 10:00 AM Wrapup and Next Steps (Chairs)

Time control:

  • A warning will be given 2 minutes before time is up.
  • Once time has elapsed we will move on to the next item.


9 of 38

WebRTC-pc remote track mute (Henrik)

Start Time: 08:10 AM

End Time: 08:20 AM


10 of 38

#3077: Should the remote track mute in response to replaceTrack(null)?

What is a muted track?

  • It’s a track that is not producing any frames.

🌈 QUIZ TIME 🌈

So, if my track isn’t producing frames, does that mean it’s muted?

  1. 👍 Well clearly, you just said so.

  2. 👎 Not necessarily, mute implies silence but silence does not imply mute.


11 of 38

#3077: Should the remote track mute in response to replaceTrack(null)?

What are UAs doing?

  • Chrome does #1: mute if frames stop flowing after “a while”.
    • How long counts as “a while” is based on FPS heuristics and is video-only.
  • Firefox/Safari do #2: mute is something the UA is doing, not FPS-based.

Problems with #1:

  • We don’t want to standardize arbitrary FPS heuristics.
  • Heuristics are bad for variable FPS sources (UA can’t know if app wants mute).
  • Doesn’t work for audio (where mute is “silence” as opposed to “0 fps”).

Spec language implies Quiz answer #2.

  • This appears to have WG support.
  • Note: implies sender.replaceTrack(null) does not cause remote mute.


12 of 38

#3077: Should the remote track mute in response to replaceTrack(null)?

Want to detect “frames not flowing”? Just poll getStats().

  • UA interoperable and app can decide on its own heuristic. 👍
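For example, a minimal polling sketch (not from the slides; the interval and the “stalled” heuristic are app choices, and the stat fields are the standard inbound-rtp ones):

async function watchFrameFlow(receiver, intervalMs = 1000) {
  let lastFramesDecoded = 0;
  setInterval(async () => {
    const report = await receiver.getStats();
    for (const stats of report.values()) {
      if (stats.type === "inbound-rtp" && stats.kind === "video") {
        const stalled = stats.framesDecoded === lastFramesDecoded;
        lastFramesDecoded = stats.framesDecoded;
        if (stalled) {
          // App-defined reaction, e.g. show an avatar instead of the video tile.
        }
      }
    }
  }, intervalMs);
}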

Proposal 1:

  • Don’t mute if frames stop flowing.
  • No spec change needed but Chrome needs a “don’t mute” bugfix.


13 of 38

#3077: Should the remote track mute in response to replaceTrack(null)?

Spec also says to mute when:

  • “[the SSRC] goes away due to [RTCP timeout].” = 2 RTCP intervals

Problems:

  • Nobody has implemented this and it lacks WPTs.
  • It lacks a clear intent from the remote side.
  • getStats() implies SSRCs do not go away due to a=inactive.

Proposal 2:

  • Remove “mute on timeout” from spec.
  • (We still mute on RTCP BYE which has intent.)


14 of 38

WebRTC-extensions receiver.on[c/s]srcchange event (Henrik)

Start Time: 08:20 AM

End Time: 08:40 AM


15 of 38

PR #243: Add onssrcchange/oncsrcchange events to RTCRtpReceiver (Henrik)

Recap:

  • Last decoded SSRCs/CSRCs are obtained via getSynchronizationSources() and getContributingSources().
  • Use cases are timing sensitive:
    • UX to show correct participant name on correct video tile.
    • UX to show which participant is speaking + volume bar.
  • Polling infrequently adds delay; polling 1000 times per second is inefficient and unergonomic (see the polling sketch below).
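For reference, a minimal sketch of the polling approach used today (updateSpeakerTiles is an illustrative app function, not part of any API):

function pollSpeakers(receiver) {
  setInterval(() => {
    // Each entry carries source (CSRC) and audioLevel, among other fields.
    const sources = receiver.getContributingSources();
    updateSpeakerTiles(sources); // e.g. highlight the active speaker + volume bar
  }, 100); // 10 Hz: cheap, but adds up to ~100 ms of UI delay
}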

At the November 2024 WG meeting we decided to add events that fire when the SSRC or CSRC value changes.

  • We now have a PR.
  • “Rough consensus [...], with concerns noted on related onunmute interop”.


16 of 38

PR #243: Add onssrcchange/oncsrcchange events to RTCRtpReceiver (Henrik)

The noted concern:

  • What if receiver.onssrcchange is a reason not to fix track.onunmute?

Reasons not to be concerned:

  1. Chrome work ongoing to fix track.onunmute.
  2. As previously discussed, onunmute represents “SSRC is live”, meaning packet reception (prior to jitter buffer and decode).
  3. SSRC is updated at decode time, which could be hundreds of milliseconds later.

Because the two APIs fire at different points in time and for different reasons, neither can replace the other.

  • SSRC/onunmute behaviors already have consensus.


17 of 38

PR #243: Add onssrcchange/oncsrcchange events to RTCRtpReceiver (Henrik)

Proposal: Land the PR.

  • No new information exposed, only improves UX timing.
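A hedged sketch of how the proposed events might be used (the exact event interface is defined by the PR; updateVideoTileOwner and updateSpeakerTiles are illustrative app functions):

receiver.onssrcchange = () => {
  // Fires when the decoded SSRC changes, i.e. at decode time.
  updateVideoTileOwner(receiver.getSynchronizationSources());
};
receiver.oncsrcchange = () => {
  updateSpeakerTiles(receiver.getContributingSources());
};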


18 of 38

Discussion (End Time: 08:40)


19 of 38

WebRTC-extensions 5G network slicing (Youenn)

Start Time: 08:40 AM

End Time: 09:00 AM


20 of 38

  • 5G network slicing
    • One device, several independent virtual networks
      • Managed between device and network operator
    • Adapt 5G parameters to improve latency, bandwidth…
      • Available to iOS/Android 5G devices

  • Native application use cases
    • Video call application
    • Cloud gaming

  • iOS API
    • Set app category: gaming, communication, streaming
    • Set traffic category on individual connections


21 of 38

  • 5G network slicing for web browsers
    • Regular 5G networking for fetching resources
      • JS, MSE
    • Low latency 5G networking for demanding web applications
      • Video calls, cloud gaming

  • Should web applications be able to influence UA decisions?
  • How to know when to use 5G network slicing?
    • In particular for WebRTC and WebTransport


22 of 38

  • Use cases for letting web applications influence UA choice
    • Most WebRTC applications want low latency
      • Not all WebRTC applications need low latency
        • P2P CDN, communication with sensors

    • Most WebTransport applications do not require low latency?
      • Some may want low latency in the future

    • A/B testing


23 of 38

  1. Let UA decide what to do
    • Note to implementors in webrtc-pc?

  2. A dedicated 5G network slicing hint API
    • Per RTCPeerConnection / per WebTransport

  3. A more general hint API
    • WebTransport congestionControl
    • A new RTCConfiguration member
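A rough sketch of option 3. The WebTransport congestionControl option already exists; the RTCConfiguration member shown below is purely hypothetical (name invented here for illustration):

// Existing WebTransport hint: latency-optimized congestion control.
const wt = new WebTransport("https://example.com/wt", {
  congestionControl: "low-latency"
});

// Hypothetical RTCConfiguration member (not in any spec): a UA could map such
// a hint onto a 5G network slice when one is available.
const pc = new RTCPeerConnection({
  networkUsageHint: "realtime" // hypothetical name
});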


24 of 38

Discussion (End Time: 09:00)


25 of 38

SFrame processing model (Youenn)

Start Time: 09:00 AM

End Time: 09:30 AM


26 of 38

  • SFrame RTP format spec under way
    • To be further discussed at the next IETF meeting

  • Fine-grained encryption support
    • From per-packet up to per-frame encryption
      • Up to the application to decide

  • Proposed negotiation rules
    • SFrame negotiation granularity is the m-section
      • SFrame negotiation is sticky
        • An m-section can only go from no-SFrame to SFrame
      • m-section rejected if one of the peers does not support SFrame
    • SFrame parameters are exchanged out-of-band
      • Crypto suite, per-packet/per-frame, keys


m=audio 50000 RTP/SAVPF 10 11
a=sframe
a=rtpmap:10 opus/48000/2
a=rtpmap:11 CN/8000

27 of 38

  • Sender-side

const transceiver = pc.addTransceiver("video");
const sframeOptions = { ... };

// Native SFrame transform
transceiver.sender.transform = new SFrameTransform(sframeOptions);
// Script SFrame transform
transceiver.sender.transform = new RTCRtpScriptTransform(worker, { type: "sframe" });

  • Receiver-side

pc.ontrack = e => e.receiver.transform = new SFrameTransform(...);
pc.onnegotiationneeded = () => {
  // Renegotiate with SFrame enabled, if needed
};
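For the script-transform case, a minimal sketch of the worker side (this uses the generic RTCRtpScriptTransform / encoded-transform API; the actual SFrame crypto is up to the application or a library):

// worker.js
onrtctransform = (event) => {
  if (event.transformer.options.type !== "sframe") return;
  const { readable, writable } = event.transformer;
  readable
    .pipeThrough(new TransformStream({
      transform(encodedFrame, controller) {
        // Encrypt (sender side) or decrypt (receiver side) encodedFrame.data here,
        // then forward the frame down the pipeline.
        controller.enqueue(encodedFrame);
      }
    }))
    .pipeTo(writable);
};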

28 of 38

  • If SFrame is negotiated for a m-section
    • RTP packets are dropped if SFrame transform is NOT set

  • If SFrame is NOT negotiated for a m-section
    • RTP packets are dropped if SFrame transform is set

  • Migration from no-SFrame to SFrame
    • Sender will stop sending media until negotiation cycle is done
    • Receiver will start processing as soon as receiving SFrame packets

  • No migration from SFrame to no-SFrame
    • Rejection of the whole description or just the m-section?

  • What about rollback when trying to enable SFrame on a no-SFrame m-section?
    • Web application will have to manually set `transform` back to null (see the snippet below)
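A one-line sketch of that manual reset after rollback (assuming the application still holds the transceiver):

// Undo the SFrame transform that was set before the rolled-back negotiation.
transceiver.sender.transform = null;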


29 of 38

  • RTCRtpScriptTransform and per-packet encryption
    • Receiver-side support
      • Transform needs to deal with codec-specific depacketization
    • Possible extension to current spec for sender-side support
      • SFrame packetizer needs partitioning info
    • Can be used for whatever encryption granularity is needed

  • No request for that feature so far


30 of 38

Discussion (End Time: 09:30)


31 of 38

MediaCapture-main/extensions (Jan-Ivar)

Start Time: 09:30 AM

End Time: 09:55 AM


32 of 38

MediaCapture-main/extensions (Jan-Ivar)

  • mediacapture-main
    • Issue 1058: Clarify what "system default" means

  • mediacapture-extensions
    • Issue 164: Detect speech on muted microphone


33 of 38

Issue 1058 - Clarify what "system default" means

Spec says: "User Agents are encouraged to default to using the user's primary or system default device for kind (when possible)."

This can’t mean the UA default, or the sentence above would be circular. It means the OS default: browsers expose the OS’s inherent (or changed) settings through enumerateDevices() ordering plus the devicechange event.
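A minimal sketch of that model (not from the slide): the first enumerated device of a kind reflects the OS default, and devicechange fires when it changes:

async function getSystemDefaultMic() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  return devices.find(d => d.kind === "audioinput"); // first entry = OS default
}

navigator.mediaDevices.ondevicechange = async () => {
  const mic = await getSystemDefaultMic();
  console.log("OS default microphone is now:", mic?.label); // label needs capture permission
};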

[Screenshots: picking “BRIO” in the Firefox device picker vs. the Chrome picker]


34 of 38

Issue 164: Detect speech on muted microphone

Privacy: the mic is still on during mute (!)

Users shouldn’t need to trust websites not to record them during “mute”.

Will work poorly with UA mute & PEPC: https://github.com/WICG/PEPC/issues/62

Proposal:

const stream = await navigator.mediaDevices.getUserMedia({
  audio: { detectSpeechActivity: true }
});
const [track] = stream.getAudioTracks();
if (track.getSettings().detectSpeechActivity) { // feature detection
  track.onspeechactivity = () =>
    track.enabled || alert("Are you talking? Your mic is off.");
}


35 of 38

Discussion (End Time: 09:55)


36 of 38

Wrapup and Next Steps

Start Time: 09:55 AM

End Time: 10:00 AM


37 of 38

Next Steps

  • Content goes here


38 of 38

Thank you

Special thanks to:

WG Participants, Editors & Chairs
