The QVideoFilterRunnable class represents the implementation of a filter that owns all graphics and computational resources, and performs the actual filtering or calculations.
Header: #include <QVideoFilterRunnable>
qmake: QT += multimedia
Since: Qt 5.5
virtual QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, QVideoFilterRunnable::RunFlags flags) = 0
The QVideoFilterRunnable class represents the implementation of a filter that owns all graphics and computational resources, and performs the actual filtering or calculations.
Video filters are split into QAbstractVideoFilter and corresponding QVideoFilterRunnable instances, similar to QQuickItem and QSGNode. This is necessary to support threaded rendering scenarios. When using the threaded render loop of the Qt Quick scene graph, all rendering happens on a dedicated thread. QVideoFilterRunnable instances always live on this thread and all its functions, run(), the constructor, and the destructor, are guaranteed to be invoked on that thread with the OpenGL context bound. QAbstractVideoFilter instances live on the main (GUI) thread, like any other QObject and QQuickItem instances created from QML.
Once created, QVideoFilterRunnable instances are managed by Qt Multimedia and will be automatically destroyed and recreated when necessary, for example when the scene graph is invalidated or the QQuickWindow changes or is closed. Creation happens via the QAbstractVideoFilter::createFilterRunnable() factory function.
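As a minimal sketch of how the two classes fit together (the names MyFilter and MyFilterRunnable are illustrative, not part of the Qt API; only createFilterRunnable() and run() come from Qt):

```cpp
#include <QAbstractVideoFilter>
#include <QVideoFilterRunnable>
#include <QVideoFrame>

// Illustrative sketch; the real filtering work is omitted.
class MyFilterRunnable : public QVideoFilterRunnable
{
public:
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat,
                    RunFlags flags) override
    {
        Q_UNUSED(surfaceFormat);
        Q_UNUSED(flags);
        // Filtering or computation goes here; this code runs on the render thread.
        return *input;
    }
};

class MyFilter : public QAbstractVideoFilter
{
    Q_OBJECT
public:
    // Invoked by Qt Multimedia whenever a fresh runnable is needed, for example
    // after the scene graph has been invalidated or the window has changed.
    QVideoFilterRunnable *createFilterRunnable() override { return new MyFilterRunnable; }
};
```

Such a filter type would typically be registered with qmlRegisterType() and attached to a VideoOutput element through its filters list in QML.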
See also QAbstractVideoFilter.
Constant | Value | Description
---|---|---
QVideoFilterRunnable::LastInChain | 0x01 | Indicates that the filter runnable's associated QAbstractVideoFilter is the last in the corresponding VideoOutput type's filters list, meaning that the returned frame is the one that is going to be presented to the scene graph without invoking any further filters.
The RunFlags type is a typedef for QFlags<RunFlag>. It stores an OR combination of RunFlag values.
[pure virtual] QVideoFrame QVideoFilterRunnable::run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, QVideoFilterRunnable::RunFlags flags)

Reimplement this function to perform filtering or computation on the input video frame. Like the constructor and destructor, this function is always called on the render thread with the OpenGL context bound.
Implementations that do not modify the video frame can simply return *input.
It is safe to access properties of the associated QAbstractVideoFilter instance from this function.
input will not be mapped; it is up to this function to call QVideoFrame::map() and QVideoFrame::unmap() as necessary.
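For instance, a CPU-based filter can map the frame, modify the pixel data in place, and unmap it again. The following is only a sketch, replacing the trivial run() body from the earlier MyFilterRunnable example, and assumes the frame arrives as plain, mappable pixel data (see the note about handleType() below); the byte-wise inversion stands in for real per-pixel work:

```cpp
QVideoFrame MyFilterRunnable::run(QVideoFrame *input,
                                  const QVideoSurfaceFormat &surfaceFormat,
                                  RunFlags flags)
{
    Q_UNUSED(surfaceFormat);
    Q_UNUSED(flags);

    if (!input->isValid())
        return *input;

    // map() gives access to the pixel data; always unmap() when done.
    if (!input->map(QAbstractVideoBuffer::ReadWrite))
        return *input; // mapping failed, pass the frame through unchanged

    uchar *bits = input->bits();
    for (int i = 0; i < input->mappedBytes(); ++i)
        bits[i] = 255 - bits[i]; // naive inversion; only meaningful for 8-bit RGB-style formats

    input->unmap();
    return *input;
}
```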
surfaceFormat provides additional information; for example, it can be used to determine which way is up in the input image, which is important for filters that need to work across multiple platforms and cameras.
flags contains additional information about the filter's invocation. For example, the LastInChain flag indicates that the filter is the last in a VideoOutput's associated filter list. This can be very useful in cases where multiple filters are chained together and the work is performed on image data in some custom format (for example a format specific to some computer vision framework). To avoid conversion in every filter in the chain, all intermediate filters can return a QVideoFrame hosting data in the custom format. Only the last filter, for which the flag is set, needs to return a QVideoFrame in a format compatible with Qt.
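A sketch of that arrangement, where wrapInCustomFormat(), convertToRgbFrame(), and MyVisionFilterRunnable are hypothetical, framework-specific names introduced only for illustration:

```cpp
// Hypothetical helpers specific to the imagined computer vision framework.
QVideoFrame wrapInCustomFormat(QVideoFrame *input);    // wraps the framework's data in a QVideoFrame
QVideoFrame convertToRgbFrame(const QVideoFrame &in);  // converts back to a Qt-presentable format

QVideoFrame MyVisionFilterRunnable::run(QVideoFrame *input,
                                        const QVideoSurfaceFormat &surfaceFormat,
                                        RunFlags flags)
{
    Q_UNUSED(surfaceFormat);

    QVideoFrame result = wrapInCustomFormat(input);

    // Only the last filter in the VideoOutput's filters list needs to hand
    // back a frame the scene graph can actually present.
    if (flags.testFlag(QVideoFilterRunnable::LastInChain))
        return convertToRgbFrame(result);

    return result; // the next filter in the chain receives the custom data
}
```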
Filters that want to expose the results of their computation to JavaScript code in QML can declare their own custom signals in the QAbstractVideoFilter subclass to indicate the completion of the operation. For filters that only calculate some results and do not modify the video frame, it is also possible to operate asynchronously: they can queue the necessary operations using the compute API and return from this function without emitting any signals. The signal indicating completion is then emitted only when the compute API indicates that the operations have finished and the results are available.

It is strongly recommended to represent the filter's output data as a separate instance of QJSValue or a QObject-derived class, which is passed as a parameter to the signal and thereby exposed to the JavaScript engine. In the QObject case, ownership of this object follows the standard QML rules: if it has no parent, ownership is transferred to the JavaScript engine, otherwise it stays with the emitter. Note that the signal connection may be queued, for example when using the threaded render loop of Qt Quick, so the object must stay valid for some time; destroying it right after calling this function is not safe. Using a dedicated results object is guaranteed to be safe even with threaded rendering. The same is not necessarily true for properties on the QAbstractVideoFilter instance itself: properties can safely be read in run() because the GUI thread is blocked during that time, but writing to them may be problematic.
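One possible shape of this pattern, with illustrative names throughout (MyResult, MyAnalysisFilter, MyAnalysisRunnable, and the finished() signal are not Qt API): the runnable emits the filter's signal with a freshly allocated, parentless results object, so the JavaScript engine takes ownership and the object remains valid even across a queued connection:

```cpp
#include <QAbstractVideoFilter>
#include <QVideoFilterRunnable>
#include <QVideoFrame>

// Dedicated results object exposed to QML via the signal parameter.
class MyResult : public QObject
{
    Q_OBJECT
    Q_PROPERTY(qreal brightness READ brightness CONSTANT)
public:
    explicit MyResult(qreal b) : m_brightness(b) { }
    qreal brightness() const { return m_brightness; }
private:
    qreal m_brightness;
};

class MyAnalysisFilter : public QAbstractVideoFilter
{
    Q_OBJECT
public:
    QVideoFilterRunnable *createFilterRunnable() override;
signals:
    // The MyResult instance has no parent, so ownership transfers to the
    // JavaScript engine under the standard QML rules.
    void finished(QObject *result);
};

class MyAnalysisRunnable : public QVideoFilterRunnable
{
public:
    explicit MyAnalysisRunnable(MyAnalysisFilter *filter) : m_filter(filter) { }
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &, RunFlags) override
    {
        qreal brightness = 0.0; // ... computed from the frame in real code ...
        emit m_filter->finished(new MyResult(brightness));
        return *input; // the frame itself is left untouched
    }
private:
    MyAnalysisFilter *m_filter;
};

QVideoFilterRunnable *MyAnalysisFilter::createFilterRunnable()
{
    return new MyAnalysisRunnable(this);
}
```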
Note: Avoid time-consuming operations in this function as they block the entire rendering of the application.
Note: The handleType() and pixelFormat() of input are completely up to the video decoding backend on the platform in use. On some platforms, different forms of input are used depending on the graphics stack. For example, when playing back videos on Windows with the WMF backend, QVideoFrame contains OpenGL-wrapped Direct3D textures when ANGLE is in use, but regular pixel data when using desktop OpenGL (opengl32.dll). Similarly, the video file format often decides whether the data is RGB or YUV, but this may also depend on the decoder and the configuration in use. The returned video frame does not have to be in the same format as the input; for example, a filter whose input is a QVideoFrame backed by system memory can output a QVideoFrame with an OpenGL texture handle.
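A fragment, assumed to live inside a run() implementation, that branches on how the backend delivered the frame (OpenGL texture versus mappable system memory):

```cpp
if (input->handleType() == QAbstractVideoBuffer::GLTextureHandle) {
    // GPU path: the frame is an OpenGL texture and the GL context is current
    // here, so a shader pass can be run directly on this texture.
    uint textureId = input->handle().toUInt();
    // ... render with textureId ...
} else if (input->map(QAbstractVideoBuffer::ReadOnly)) {
    // CPU path: wrap the mapped data in a QImage without copying. Note that
    // imageFormatFromPixelFormat() returns QImage::Format_Invalid for YUV formats.
    QImage wrapper(input->bits(), input->width(), input->height(),
                   input->bytesPerLine(),
                   QVideoFrame::imageFormatFromPixelFormat(input->pixelFormat()));
    // ... inspect or copy 'wrapper' here ...
    input->unmap();
}
```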
See also QVideoFrame and QVideoSurfaceFormat.