Talk:MediaStreamAPI
Some basic feedback:
I don't understand what the 'audiostream' attribute is for
In the examples it seems to be used to tell the original video element not to play its own audio; the audio is instead streamed (and optionally filtered) to a separate audio element while the original video continues to play. If that's the design, isn't a similar attribute needed to hide the video when it's the video stream that's being processed?
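For concreteness, here is the pattern I'm reading out of the examples. This is only my reconstruction: the 'audiostream' attribute is from the draft, but the stream-capture and processor calls below are names I'm assuming, not confirmed API.

 <!-- The 'audiostream' attribute (as I read it) suppresses the video
      element's own audio output; the picture still plays. -->
 <video id="v" src="clip.webm" audiostream></video>
 <audio id="a" autoplay></audio>
 <script>
   var v = document.getElementById("v");
   var a = document.getElementById("a");
   // Hypothetical wiring: capture the video's stream, run it through a
   // worker-based filter, and play the result in the separate <audio>.
   var filtered = v.captureStream().createProcessor(new Worker("filter.js"));
   a.src = filtered;
 </script>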
Would the attributes' use be more obvious if they were called 'muteaudio' and 'mutevideo'? Is there not already CSS for that?
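For comparison, both halves can already be had without new attributes: the picture can be hidden with plain CSS, and the audio muted from script via the element's existing 'muted' IDL attribute.

 <!-- Existing mechanisms: CSS hides the picture; script mutes the audio. -->
 <video id="v" src="clip.webm" style="visibility: hidden"></video>
 <script>
   document.getElementById("v").muted = true;
 </script>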
It seems like a complicated way to do things. Maybe this is an argument for the mediaresource element, as a non-displaying source. What if a filter graph could be fed back to the original element for display? Or better, what if that were the default?
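Something like this sketch is what I have in mind for the default (applyProcessor is an invented name, purely for illustration):

 <!-- Hypothetical default: the element feeds its own output through the
      filter graph and displays/plays the result, so no second element or
      mute attribute is needed. -->
 <video id="v" src="clip.webm"></video>
 <script>
   // Invented call: route the element's media through the worker filter
   // and back into the same element for display.
   document.getElementById("v").applyProcessor(new Worker("filter.js"));
 </script>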
There's no way for workers to signal latency to the scheduler
If the worker just writes out timestamped buffers, it can dynamically signal its latency, and the scheduler can make its own decision about what constitutes an underrun. However, in realtime contexts (conferencing and games) it is helpful to optimize for the lowest possible latency. To assist with this, it helps if the workers (and internal elements like codecs and playback sinks) can advertise their expected latency. The scheduler can then sum these over the pipeline to determine a more aggressive minimal time separation to maintain between the sources and sinks.
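Concretely, something along these lines (the message shape and field names are invented for illustration): each worker advertises its expected latency once, up front, and the scheduler sums the advertised values over the pipeline stages to pick its minimum source-to-sink separation.

 // filter.js -- worker side. Advertise the expected processing latency
 // up front so the scheduler can budget for it (hypothetical message shape).
 postMessage({ type: "latency", seconds: 0.02 });
 onmessage = function (e) {
   // ... process e.data and write out timestamped buffers as usual ...
 };

 // Scheduler side: sum advertised latencies over all pipeline stages
 // (workers, codecs, playback sink) to get the minimal time separation
 // to maintain between sources and sinks.
 function pipelineLatency(stages) {
   return stages.reduce(function (sum, stage) {
     return sum + (stage.latency || 0);
   }, 0);
 }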
One possible resolution is just to use an aggressive schedule for all realtime streams, and leave it to developers to discover in testing what they can get away with on current systems.