My idea is to provide video as asynchronous streams of a native MCOP data type that contains images. This data type is yet to be created. That way, plugins that deal with video images could be connected the same way audio plugins can be connected.
There are a few things that are important not to leave out, namely:
- There are RGB and YUV colorspaces.
- The format should somehow be tagged on the stream.
- Synchronization is important.
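A minimal sketch of what such a data type could carry, covering the three points above. The names (`VideoFrame`, `Colorspace`) and the plain heap storage are my assumptions; a real MCOP type would additionally need reference counting and marshalling support:

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Hypothetical colorspace tag carried with every frame,
// so a plugin can tell RGB data from YUV data.
enum class Colorspace { RGB24, YUV420 };

// Sketch of a native video frame type. This only shows the
// information a frame has to carry; MCOP streaming glue is omitted.
struct VideoFrame {
    Colorspace colorspace;      // format tag for the stream
    int width, height;
    int64_t timestamp_us;       // presentation time, for synchronization
    std::vector<uint8_t> data;  // pixel data (layout depends on colorspace)

    // number of bytes a frame of this format occupies
    size_t expectedSize() const {
        return colorspace == Colorspace::RGB24
            ? size_t(width) * height * 3
            : size_t(width) * height * 3 / 2;  // YUV 4:2:0
    }
};
```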
My idea is to leave open the possibility of reimplementing the VideoFrame class so that it can store its contents in a shared memory segment. That way, even video streaming between different processes would be possible without too much pain.
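A sketch of that shared-memory idea using plain SysV shared memory (`shmget`/`shmat`). The class name is hypothetical, and a real implementation would sit behind the same VideoFrame interface as the in-process variant:

```cpp
#include <sys/ipc.h>
#include <sys/shm.h>
#include <cstddef>

// Sketch: frame storage backed by a SysV shared memory segment, so
// another process can attach the same segment by id and read the
// pixels without copying them over a socket. Error handling omitted.
class ShmFrameBuffer {
public:
    explicit ShmFrameBuffer(size_t bytes)
        : size_(bytes),
          id_(shmget(IPC_PRIVATE, bytes, IPC_CREAT | 0600)),
          data_(static_cast<unsigned char*>(shmat(id_, nullptr, 0))) {}

    ~ShmFrameBuffer() {
        shmdt(data_);                    // detach from this process
        shmctl(id_, IPC_RMID, nullptr);  // destroy once everyone detaches
    }

    int id() const { return id_; }       // a peer attaches via this id
    unsigned char* data() { return data_; }
    size_t size() const { return size_; }

private:
    size_t size_;
    int id_;
    unsigned char* data_;
};
```

A receiving process would call `shmat()` with the transmitted segment id to map the same pixels into its own address space.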
However, the standard situation for video is that everything, from decoding to rendering, happens in the same process.
I have written a prototype video streaming implementation, which you can download here. It would need to be integrated into MCOP after some experiments.
A rendering component should be provided that supports the X11 MIT-SHM extension (with both RGB and YUV); Martin Vogt told me he is working on such a thing.
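Supporting both colorspaces matters because a renderer may have to fall back to RGB output when no YUV-capable display path is available. A minimal fixed-point YUV-to-RGB pixel conversion (BT.601 video-range coefficients; the function names are my own) could look like this:

```cpp
#include <cstdint>
#include <algorithm>

// One converted RGB pixel.
struct Rgb { uint8_t r, g, b; };

// Clamp an intermediate value into the 8-bit range.
inline uint8_t clamp8(int v) {
    return static_cast<uint8_t>(std::min(255, std::max(0, v)));
}

// Convert one BT.601 "video range" YUV pixel (Y 16..235, U/V 16..240)
// to 8-bit RGB using the common fixed-point coefficients.
Rgb yuvToRgb(int y, int u, int v) {
    int c = y - 16, d = u - 128, e = v - 128;
    return {
        clamp8((298 * c + 409 * e + 128) >> 8),            // R
        clamp8((298 * c - 100 * d - 208 * e + 128) >> 8),  // G
        clamp8((298 * c + 516 * d + 128) >> 8),            // B
    };
}
```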