Media/WebRTC/WebRTCE10S



=== Hardware Acceleration ===
In this design, we make no attempt to combine HW acceleration with capture
or rendering. I.e., if we have a standalone HW encoder, we simply insert it
into the pipeline in place of the SW encoder and then direct the
encoded media out the network interface. The same goes for decoding.
No attempt is made to shortcut the rest of the stack. This design
promotes modularity, since we can make the HW encoder look
like just another module inside of GIPS. In the longer term, we may want
to revisit this, but I think it's the best design for now.
Note that if we have an integrated encoder (e.g., in a camera), then
we *can* accommodate that by having gUM return encoded frames
instead of I420 and passing those directly to the network without
re-encoding them. (Though this is somewhat complicated by the need
to render them locally in a video tag.)
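The "HW encoder as just another module" idea can be sketched with a common
encoder interface that the rest of the pipeline codes against. This is a
minimal illustration, not the actual GIPS/WebRTC API; the type and function
names (`VideoEncoder`, `MakeEncoder`, etc.) are hypothetical:

```cpp
#include <cstdint>
#include <memory>
#include <vector>

// Illustrative frame/packet types -- not taken from the real code base.
struct RawFrame { std::vector<uint8_t> i420; };       // raw I420 pixels
struct EncodedFrame { std::vector<uint8_t> bitstream; };

// A shared interface lets a standalone HW encoder drop into the pipeline
// in place of the SW encoder without the rest of the stack noticing.
class VideoEncoder {
public:
    virtual ~VideoEncoder() = default;
    virtual EncodedFrame Encode(const RawFrame& frame) = 0;
};

class SoftwareEncoder : public VideoEncoder {
public:
    EncodedFrame Encode(const RawFrame& frame) override {
        // Stand-in for a real SW codec; here we just copy the payload.
        return EncodedFrame{frame.i420};
    }
};

class HardwareEncoder : public VideoEncoder {
public:
    EncodedFrame Encode(const RawFrame& frame) override {
        // Stand-in for a vendor HW codec behind the same interface.
        return EncodedFrame{frame.i420};
    }
};

// Swapping implementations becomes a construction-time decision; the
// pipeline downstream only ever sees the VideoEncoder interface.
std::unique_ptr<VideoEncoder> MakeEncoder(bool hw_available) {
    if (hw_available) return std::make_unique<HardwareEncoder>();
    return std::make_unique<SoftwareEncoder>();
}
```

The integrated-camera case is the degenerate version of the same shape:
gUM hands back `EncodedFrame`s directly, and the pipeline skips the
`Encode` step entirely for the network path.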


=== Network Access ===