What is WebRTC currently missing?

Are there any features that you wish WebRTC had?
For example, features that exist in other specs, or are coming in future iterations but not here yet? If so, what are they?
Comments
-
uniform browser support 😅
3 -
This may not meet the exact specifications of the question, but hopefully it's in the right ballpark :) I wish there were an easier way to do file transfers through WebRTC. I've seen plenty of example code that essentially shows how to do your own chunking on both ends, and so on... but I wish it were easier to just say "send this file, or blob of data, or whatever". Like a MediaStreamTrack, but for files... a FileStream, I guess?
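Just to show what I mean, here's a rough sketch of the chunking boilerplate you end up writing today. It assumes pc is an already-connected RTCPeerConnection and that the channel has opened; the channel name, chunk size and thresholds are all arbitrary:

    // Rough sketch: send a File over an RTCDataChannel in fixed-size chunks.
    // Assumes `pc` is an already-connected RTCPeerConnection and that the
    // channel's 'open' event has fired before sendFile() is called.
    const CHUNK_SIZE = 16 * 1024;            // 16 KiB stays under common SCTP message limits
    const HIGH_WATER_MARK = 1024 * 1024;     // pause sending above ~1 MiB buffered

    const channel = pc.createDataChannel('file-transfer');
    channel.binaryType = 'arraybuffer';
    channel.bufferedAmountLowThreshold = HIGH_WATER_MARK / 2;

    async function sendFile(file: File): Promise<void> {
      // Tell the receiver what to expect before the raw bytes arrive.
      channel.send(JSON.stringify({ name: file.name, size: file.size, type: file.type }));

      for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
        // Crude backpressure: wait for the send buffer to drain before queuing more.
        if (channel.bufferedAmount > HIGH_WATER_MARK) {
          await new Promise<void>((resolve) => {
            channel.addEventListener('bufferedamountlow', () => resolve(), { once: true });
          });
        }
        const chunk = await file.slice(offset, offset + CHUNK_SIZE).arrayBuffer();
        channel.send(chunk);
      }
      channel.send(JSON.stringify({ done: true }));  // end-of-file marker for the receiver
    }

And the receiving side has to mirror all of it: reassemble the chunks, watch for the end-of-file marker, and rebuild the Blob itself.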
4 -
An actual definition. Right now WebRTC is just... "video stuff" :)
1 -
A proper (and embeddable) alternative implementation that isn't owned/maintained by Google.
4 -
@Vincent If only GStreamer's WebRTC implementation were the default standard... It's very clean and well written: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/tree/main/subprojects/gst-examples/webrtc
2 -
WebRTC is a high-level abstraction; I wish it were more low-level. The current API and architecture force any new feature people think about to be a much bigger lift than it ought to be.
For example, building a new codec does not only require you to build the codec itself, but also to figure out how it would be set in the APIs, negotiated, and packetised on the wire, which means you need a lot of muscle power to ship new features.
Another example: changing noise suppression, or making any other in-pipeline improvement, either cannot be done at all or can only be done inefficiently (at the cost of CPU cycles).
My main ask has been for a lower-level API, i.e., one that allows the web developer to build the pipeline in JavaScript. Developers will build abstractions, and if we are able to get this dream realised in v2, we would be able to build the WebRTC v1 APIs atop those v2 APIs.
@aconchillo's example of GStreamer and building the pipeline is a good one:
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! queue leaky=2 max-size-buffers=5 ! nvv4l2vp8enc maxperf-enable=1 ! rtpvp8pay ! webrtcwrapperbin ip_port_discovery_sendrecv ! rtpvp8depay ! decodebin ! autovideosink
It would be conceivable, with v2 and WebAssembly, that developers would override parts of the existing pipeline with their own code, which would give us creativity at two levels: optimised pipelines and optimised algorithms for specific use-cases.
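To be concrete, the closest thing we have today is probably the encoded-transform / insertable-streams work, which at least exposes encoded frames to JavaScript even though the encoder and packetiser themselves stay fixed. A rough sketch using Chrome's non-standard createEncodedStreams(), assuming pc was created with { encodedInsertableStreams: true } and track is a local video MediaStreamTrack, just to show the shape of per-frame access:

    // Rough sketch of per-frame access with Chrome's non-standard encoded
    // insertable streams (the standards-track equivalent is RTCRtpScriptTransform
    // running in a worker). Assumes `pc` was created with
    // new RTCPeerConnection({ encodedInsertableStreams: true }) and `track` is a
    // local video MediaStreamTrack.
    const sender = pc.addTrack(track);

    // Encoded frames are exposed after the encoder and before the packetiser,
    // as a ReadableStream / WritableStream pair.
    const { readable, writable } = (sender as any).createEncodedStreams();

    const transform = new TransformStream({
      transform(frame, controller) {
        // `frame.data` is the encoded payload; this is where your own code
        // (JavaScript or WebAssembly) could rewrite it, e.g. for custom
        // encryption or metadata tagging.
        const bytes = new Uint8Array(frame.data);
        // ... inspect or rewrite `bytes` here ...
        frame.data = bytes.buffer;
        controller.enqueue(frame);
      },
    });

    readable.pipeThrough(transform).pipeTo(writable);

It's still a long way from being able to replace the encoder or the packetiser themselves, which is really the ask here.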
4 -
@chad There was a fair amount of debate on FileStream, TimedText, etc.; eventually DataChannels were built as the underlying API and protocol, the assumption being that they would allow web developers to build the higher-level abstractions themselves.
Alas, most abstractions intertwine these with their own flavour of signalling protocol, which means they end up more prescriptive than they need to be...
2 -
@jameshush That's interesting; I'd like to understand your perspective a bit more. It feels pretty well-defined to me. What do you think the gaps are?
0 -
@vr000m It's funny you mentioned "intertwining with signaling protocol"; that was one of the weirdest things for me to understand as I was getting into all this stuff. It's like the spec defines specific ways to do so many things, but then just kind of hand-waves about "oh yeah, you probably need to do something something signaling too" :)
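For anyone else hitting the same wall, this is roughly the minimum that "something something signaling" turns out to be. The WebSocket relay and the JSON message shapes below are entirely made up, because the spec really does leave that part to you:

    // Minimal signaling sketch. The WebSocket relay and the JSON message shapes
    // are hypothetical; WebRTC itself says nothing about how offers, answers and
    // ICE candidates travel between peers.
    const pc = new RTCPeerConnection();
    const signaling = new WebSocket('wss://example.com/signaling');  // made-up relay server

    // Trickle our ICE candidates to the other peer as they are gathered.
    pc.onicecandidate = ({ candidate }) => {
      if (candidate) signaling.send(JSON.stringify({ candidate }));
    };

    // Caller side: create an offer and ship it over the relay.
    async function call(track: MediaStreamTrack, stream: MediaStream): Promise<void> {
      pc.addTrack(track, stream);
      await pc.setLocalDescription(await pc.createOffer());
      signaling.send(JSON.stringify({ description: pc.localDescription }));
    }

    // Both sides: apply whatever the other peer relays, answering offers.
    signaling.onmessage = async ({ data }) => {
      const { description, candidate } = JSON.parse(data);
      if (description) {
        await pc.setRemoteDescription(description);
        if (description.type === 'offer') {
          await pc.setLocalDescription(await pc.createAnswer());
          signaling.send(JSON.stringify({ description: pc.localDescription }));
        }
      } else if (candidate) {
        await pc.addIceCandidate(candidate);
      }
    };

Whether that JSON goes over a WebSocket, HTTP polling, or carrier pigeon is, as far as the spec is concerned, your problem :)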
2