Most filter graph APIs have a notion of data types on each stream connection, so encoders, decoders and muxers are possible worker types, as well as filters which work on compressed data; anything that doesn't need to talk directly to hardware could be written in JavaScript. As it stands, the API seems to disallow implementations of: compressed stream copying and editing for things like efficient frame dup/drop to maintain sync, keyframe detection, codecs written in JavaScript, and feeding compressed data obtained elsewhere into the graph.
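For illustration, a hypothetical worker node operating on compressed packets, in the spirit of what the paragraph above says the current API rules out; all names here are invented for the sketch and are not taken from the API:
<pre>// Hypothetical compressed-data filter node running in a worker; none of
// these names come from the API under discussion.
onmessage = function (e) {
  var packet = e.data;                 // assumed shape: { keyframe, pts, data: ArrayBuffer }
  if (packet.keyframe) {
    // e.g. keyframe detection, frame dup/drop for sync, stream copying...
  }
  postMessage(packet, [packet.data]);  // pass the (possibly edited) packet downstream
};</pre>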
== <mediaresource> as an HTML element ==
There is no reason to have a media resource object in the DOM as an element. It is not a presentation element, and it is used only from JavaScript. It is recommended to implement a media resource as a regular object:
<pre>interface MediaResource {
  ...
}</pre>
Its usage will be similar to XMLHttpRequest: call the open method with async and url parameters. The object can have an additional canPlayType method to avoid requests for unsupported media types.
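A minimal sketch of how that could look; only open() and canPlayType() are named above, so the constructor form, argument order and load handler are assumptions:
<pre>// Hypothetical usage sketch: only open() and canPlayType() come from the
// text above; the constructor, argument order and onload handler are assumed.
var resource = new MediaResource();
if (resource.canPlayType('video/webm; codecs="vp8, vorbis"')) {
  resource.open('clip.webm', true);   // url and async flag, as with XMLHttpRequest
  resource.onload = function () {
    // the loaded media can now be processed or handed to a player
  };
}</pre>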
It makes sense to have the MediaResource (and Audio) objects available in the Worker's global scope. That would let a worker load the media, process it, and then play it.
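A sketch of that worker scenario, assuming MediaResource and Audio were exposed on the worker global scope as proposed; the event and property names are assumptions:
<pre>// worker.js - hypothetical; assumes MediaResource and Audio exist in the
// worker's global scope, as proposed above. Event/property names are assumed.
onmessage = function (e) {
  var resource = new MediaResource();
  resource.open(e.data.url, true);    // url and async flag
  resource.onload = function () {
    // ...process the media here...
    var audio = new Audio();
    audio.src = resource;             // assumed: a resource can be used as a source
    audio.play();
    postMessage('playing');
  };
};</pre>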