SurfaceTextureHelper
Helper class for using a SurfaceTexture to create WebRTC VideoFrames. To create WebRTC VideoFrames, render onto the SurfaceTexture; the frames will be delivered to the listener. Only one texture frame can be in flight at a time, so each frame must be released before a new one can be received. Call stopListening() to stop receiving new frames, and call dispose() to release all resources once the texture frame is released.
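As a minimal sketch of the lifecycle described above, assuming an existing `EglBase` instance and a `VideoSink` called `frameSink` (both hypothetical names from the surrounding capturer code, not part of this class):

```kotlin
// Sketch only: requires the org.webrtc Android library at runtime.
// eglBase and frameSink are assumed to exist in the surrounding code.
val helper = SurfaceTextureHelper.create("CaptureThread", eglBase.eglBaseContext)

// Tell the helper the size of frames rendered onto its texture,
// then start delivering frames to the listener.
helper.setTextureSize(1280, 720)
helper.startListening { frame ->
    frameSink.onFrame(frame)
    // Only one texture frame can be in flight: the frame must be
    // released (by the sink or here) before the next one is delivered.
}

// ... render onto the helper's SurfaceTexture from a camera or other producer ...

// Shut down: stop frame delivery first, then free the GL resources.
helper.stopListening()
helper.dispose()
```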
Types
interface FrameRefMonitor
Interface for monitoring texture buffers created from this SurfaceTexture.
Functions
open fun create(threadName: String, sharedContext: EglBase.Context, alignTimestamps: Boolean, yuvConverter: YuvConverter, frameRefMonitor: SurfaceTextureHelper.FrameRefMonitor): SurfaceTextureHelper
Construct a new SurfaceTextureHelper sharing OpenGL resources with `sharedContext`.
open fun create(threadName: String, sharedContext: EglBase.Context, alignTimestamps: Boolean, yuvConverter: YuvConverter): SurfaceTextureHelper
Same as above without a frame ref monitor.
open fun create(threadName: String, sharedContext: EglBase.Context, alignTimestamps: Boolean): SurfaceTextureHelper
Same as above with yuvConverter set to a new YuvConverter.
open fun create(threadName: String, sharedContext: EglBase.Context): SurfaceTextureHelper
Same as above with alignTimestamps set to false and yuvConverter set to a new YuvConverter.
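As a sketch of how the overload defaults chain together, calling create with only a thread name and shared context is equivalent to spelling out the documented defaults explicitly (`eglBase` is an assumed existing `EglBase` instance):

```kotlin
// Sketch only: requires the org.webrtc Android library at runtime.
// These two calls produce equivalently configured helpers.
val short = SurfaceTextureHelper.create("CaptureThread", eglBase.eglBaseContext)
val explicit = SurfaceTextureHelper.create(
    "CaptureThread",
    eglBase.eglBaseContext,
    /* alignTimestamps = */ false,
    YuvConverter()
)
```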
Forces a frame to be produced.
Set the rotation of the delivered frames.
Set the size of the texture that frames are rendered onto.
Start to stream textures to the given `listener`.
Stop listening.
Posts to the correct thread to convert `textureBuffer` to I420.
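A hedged sketch of converting a delivered texture frame to I420 inside the listener; `VideoFrame.Buffer.toI420()` is the general-purpose conversion path, while the function above applies to texture buffers produced by this helper:

```kotlin
// Sketch only: requires the org.webrtc Android library at runtime.
helper.startListening { frame ->
    // frame.buffer is a texture buffer created by this helper.
    // toI420() yields a CPU-accessible I420 copy of the frame.
    val i420 = frame.buffer.toI420()

    // ... encode or inspect the I420 planes here ...

    // Release both buffers so the next texture frame can be delivered.
    i420?.release()
    frame.release()
}
```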