Raw Data


The Video SDK provides you with an option to access real-time raw audio and video data of a session. After accessing the raw data, you can process it and apply additional effects to it to enhance the session experience.

This section describes how to enable and access a session's raw data.

Specify Raw Data Memory Mode

To use raw data of any type, you must first select the memory mode you wish to use. The SDK supports heap-based and stack-based memory modes.

Stack-Based Memory

  • Variables are allocated automatically and deallocated automatically when they leave scope.
  • Variables are not accessible from or transferable to other threads.
  • Typically has faster access than heap-based memory allocation.
  • Memory space is managed by the CPU and will not become fragmented.
  • Variables cannot be resized.

Heap-Based Memory

  • Variables are allocated and deallocated manually. You must allocate and free variables to avoid memory leaks.
  • Variables can be accessed globally.
  • Has relatively slower access than stack-based memory allocation.
  • Has no guarantee on the efficiency of memory space and can become fragmented.
  • Variables can be resized.

For more information on these types of memory allocation, refer to the Stack-based memory allocation and Heap-based dynamic memory allocation guides.

After determining which memory mode is right for you, you must specify it when the SDK is initialized. Note that this must be set individually for audio, video, and screen share. The two types of memory allocation are defined with the following enum values:

Stack-based Memory:

ZoomVideoSDKRawDataMemoryMode.ZoomVideoSDKRawDataMemoryModeStack

Heap-based Memory:

ZoomVideoSDKRawDataMemoryMode.ZoomVideoSDKRawDataMemoryModeHeap

To specify a raw data memory mode, provide one of these enum cases to ZoomVideoSDKInitParams during SDK initialization.


ZoomVideoSDKInitParams *initParams = [[ZoomVideoSDKInitParams alloc] init];
// Set audio memory mode to heap.
initParams.audioRawdataMemoryMode = ZoomVideoSDKRawDataMemoryModeHeap;
// Set video memory mode to heap.
initParams.videoRawdataMemoryMode = ZoomVideoSDKRawDataMemoryModeHeap;
// Set share memory mode to heap.
initParams.shareRawdataMemoryMode = ZoomVideoSDKRawDataMemoryModeHeap;

let initParams = ZoomVideoSDKInitParams()
// Set audio memory mode to heap.
initParams.audioRawdataMemoryMode = .heap
// Set video memory mode to heap.
initParams.videoRawdataMemoryMode = .heap
// Set share memory mode to heap.
initParams.shareRawdataMemoryMode = .heap

Receive Raw Video Data

Raw video data is offered in the following types:

  • YUV420: A data object commonly used by renderers based on OpenGL ES.
  • CVPixelBuffer in NV12 format: A data object defined by Apple that can be used with a Metal renderer.

To access and modify the raw video data, you will need to do the following:

  1. Implement the ZoomVideoSDKRawDataPipeDelegate protocol.
  2. Use the callbacks provided by ZoomVideoSDKRawDataPipeDelegate to receive each frame of raw video data.

// Used to receive video's NV12 data (CVPixelBufferRef).
- (void)onPixelBuffer:(CVPixelBufferRef)pixelBuffer rotation:(ZoomVideoSDKVideoRawDataRotation)rotation {
    // Access CVPixelBufferRef using pixelBuffer.
    // Get rotation of raw data video stream.
    switch (rotation) {
        case ZoomVideoSDKVideoRawDataRotation90:
            // 90 degrees.
            break;
        default:
            break;
    }
}
// Used to receive video's YUV420 data.
- (void)onRawDataFrameReceived:(ZoomVideoSDKVideoRawData *)rawData {
    // Access the raw data for each of the 3 components.
    char *yBuffer = [rawData yBuffer];
    char *uBuffer = [rawData uBuffer];
    char *vBuffer = [rawData vBuffer];
    // Get format of raw data.
    ZoomVideoSDKVideoRawDataFormat videoFormat = [rawData format];
    switch (videoFormat) {
        case ZoomVideoSDKVideoRawDataFormatI420:
            // Raw data is I420 format.
            break;
        case ZoomVideoSDKVideoRawDataFormatI420_Limit:
            // Raw data is I420_limit format.
            break;
    }
    // Get rotation of raw data video stream.
    ZoomVideoSDKVideoRawDataRotation rotation = [rawData rotation];
    switch (rotation) {
        case ZoomVideoSDKVideoRawDataRotation90:
            // 90 degrees.
            break;
        default:
            break;
    }
}
// Called when the raw data has been toggled.
- (void)onRawDataStatusChanged:(ZoomVideoSDKUserRawdataStatus)userRawdataStatus {
    switch (userRawdataStatus) {
        case ZoomVideoSDKUserRawdataOn:
            // User's raw data is now on.
            break;
        case ZoomVideoSDKUserRawdataOff:
            // User's raw data is now off.
            break;
    }
}

// Used to receive video's NV12 data (CVPixelBufferRef).
func onPixelBuffer(_ pixelBuffer: CVPixelBuffer!, rotation: ZoomVideoSDKVideoRawDataRotation) {
    // Access CVPixelBufferRef using pixelBuffer.
    // Get rotation of raw data video stream.
    switch rotation {
        case .rotation90:
            // 90 degrees.
            break
        default:
            break
    }
}
// Used to receive video's YUV420 data.
func onRawDataFrameReceived(_ rawData: ZoomVideoSDKVideoRawData!) {
    // Access the raw data for each of the 3 components.
    let yBuffer = rawData.yBuffer
    let uBuffer = rawData.uBuffer
    let vBuffer = rawData.vBuffer
    // Get format of raw data.
    let format = rawData.format
    switch format {
    case .I420:
        // Raw data is I420 format.
        break
    default:
        // Raw data is in another format, such as I420_limit.
        break
    }
    // Get rotation of raw data video stream.
    let rotation = rawData.rotation
    switch rotation {
        case .rotation90:
            // 90 degrees.
            break
        default:
            break
    }
}
// Called when the raw data has been toggled.
func onRawDataStatusChanged(_ userRawdataStatus: ZoomVideoSDKUserRawdataStatus) {
    switch userRawdataStatus {
        case .on:
        // User's raw data is now on.
        break
        case .off:
        // User's raw data is now off.
        break
    }
}

If you're using a renderer, the onZoomRender callback provides access to the frame's size and YUV buffers:

- (void)onZoomRender:(ZoomVideoSDKRenderer *)renderer frameRawData:(ZoomVideoSDKVideoRawData *)rawData {
    CGFloat height = rawData.size.height;
    CGFloat width = rawData.size.width;
    char *yBuffer = rawData.yBuffer;
    char *uBuffer = rawData.uBuffer;
    char *vBuffer = rawData.vBuffer;
}

func onZoomRender(_ renderer: ZoomVideoSDKRenderer!, frameRawData rawData: ZoomVideoSDKVideoRawData!) {
    let height = rawData.size.height
    let width = rawData.size.width
    let yBuffer = rawData.yBuffer
    let uBuffer = rawData.uBuffer
    let vBuffer = rawData.vBuffer
}

If you're using onPixelBuffer, you can access the frame data through the CVPixelBufferRef object.

Note: It is not recommended to implement a callback that does not belong to the renderer you are using, nor to implement both onRawDataFrameReceived and onPixelBuffer. Implementing both callbacks causes unnecessary, duplicate raw data processing.

Send Raw Video Data

You can also send video data using sendVideoFrame on a ZoomVideoSDKVideoSender. To obtain a ZoomVideoSDKVideoSender, assign a ZoomVideoSDKVideoSource; its onInitialize callback provides the sender.


- (void)onInitialize:(ZoomVideoSDKVideoSender *)rawDataSender supportCapabilityArray:(NSArray *)supportCapabilityArray suggestCapabilityItem:(ZoomVideoSDKVideoCapabilityItem *)suggestCapabilityItem {
    // Store video rawdata sender.
    self.videoRawdataSender = rawDataSender;
}
// Call sendVideoFrame to send a frame buffer of raw data.
[self.videoRawdataSender sendVideoFrame:frameBuffer width:width height:height dataLength:dataLength rotation:rotation];

func onInitialize(_ rawDataSender: ZoomVideoSDKVideoSender, supportCapabilityArray: [Any], suggest suggestCapabilityItem: ZoomVideoSDKVideoCapabilityItem) {
    // Store video rawdata sender.
    self.rawDataSender = rawDataSender
}
// Call sendVideoFrame to send a frame buffer of raw data.
self.rawDataSender?.sendVideoFrame(frameBuffer, width: width, height: height, dataLength: dataLength, rotation: rotation)

Raw video data can also be pre-processed using onPreProcessRawData within ZoomVideoSDKVideoSourcePreProcessor.


- (void)onPreProcessRawData:(ZoomVideoSDKPreProcessRawData *)rawData {
    // Perform preprocess actions here.
}

func onPreProcessRawData(_ rawData: ZoomVideoSDKPreProcessRawData!) {
    // Perform preprocess actions here.
}

Receive raw audio data

To receive raw audio data, implement the ZoomVideoSDKVirtualAudioSpeaker callbacks shown in the sample code below. Once you have implemented these callbacks, you must subscribe the listener to receive audio data through an instance of ZoomVideoSDKAudioHelper.

VirtualSpeakerExample.h

#import <Foundation/Foundation.h>
#import <ZoomVideoSDK/ZoomVideoSDK.h>
NS_ASSUME_NONNULL_BEGIN
@interface VirtualSpeakerExample : NSObject <ZoomVideoSDKVirtualAudioSpeaker>
@end
NS_ASSUME_NONNULL_END

#import "VirtualSpeakerExample.h"
@implementation VirtualSpeakerExample
- (void)onVirtualSpeakerMixedAudioReceived:(ZoomVideoSDKAudioRawData *)rawData
{
    // Received raw audio from whole session that has been sent from virtual microphones
    // Play the raw audio here
}
- (void)onVirtualSpeakerOneWayAudioReceived:(ZoomVideoSDKAudioRawData *)rawData user:(ZoomVideoSDKUser *)user
{
    // Received raw audio from single user that has been sent from a virtual microphone
}
- (void)onVirtualSpeakerSharedAudioReceived:(ZoomVideoSDKAudioRawData *)rawData
{
    // Received raw audio from share that was manually sent
}
@end

import Foundation
class VirtualSpeakerExample: NSObject, ZoomVideoSDKVirtualAudioSpeaker {
    func onVirtualSpeakerMixedAudioReceived(_ rawData: ZoomVideoSDKAudioRawData!) {
        // Received raw audio from whole session that has been sent from virtual microphones
        // Play the raw audio here
    }
    func onVirtualSpeakerOneWayAudioReceived(_ rawData: ZoomVideoSDKAudioRawData!, user: ZoomVideoSDKUser!) {
        // Received raw audio from single user that has been sent from a virtual microphone
    }
    func onVirtualSpeakerSharedAudioReceived(_ rawData: ZoomVideoSDKAudioRawData!) {
        // Received raw audio from share that was manually sent
    }
}

Send raw audio data

See the sample code below.

VirtualMicExample.h

#import <Foundation/Foundation.h>
#import <ZoomVideoSDK/ZoomVideoSDK.h>
NS_ASSUME_NONNULL_BEGIN
@interface VirtualMicExample : NSObject <ZoomVideoSDKVirtualAudioMic>
@end
NS_ASSUME_NONNULL_END

#import "VirtualMicExample.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
@interface VirtualMicExample ()
@property (nonatomic, strong) ZoomVideoSDKAudioSender *audioSender;
@end
@implementation VirtualMicExample
- (instancetype)init
{
    self = [super init];
    if (self) {
        _audioSender = nil;
    }
    return self;
}
- (void)onMicInitialize:(ZoomVideoSDKAudioSender *_Nonnull)rawDataSender
{
    // Virtual Microphone has initialized, store rawDataSender to send raw audio data into session later
    if (self.audioSender != rawDataSender) {
        self.audioSender = rawDataSender;
    }
}
- (void)onMicStartSend
{
    // Virtual Microphone can begin sending raw audio into session
    if (!self.audioSender) {
        return;
    }
    // Provide your audio data here. Replace these placeholders with a real
    // PCM buffer, its length in bytes, and its sample rate.
    unsigned char *yourAudioBuffer = NULL;
    NSUInteger yourAudioDataLength = 0;
    NSUInteger yourAudioSampleRate = 0;
    ZoomVideoSDKERROR sendRawAudioReturnStatus = [self.audioSender send:(char *)yourAudioBuffer dataLength:yourAudioDataLength sampleRate:yourAudioSampleRate];
    if (sendRawAudioReturnStatus == Errors_Success) {
        // Call to send raw audio succeeded
    } else {
        NSLog(@"Call to send raw audio produced an error: %@", @(sendRawAudioReturnStatus));
    }
}
- (void)onMicStopSend
{
    // Virtual Microphone has stopped sending raw audio
}
- (void)onMicUninitialized
{
    // Virtual Microphone has been destroyed
    self.audioSender = nil;
}
@end

import Foundation
class VirtualMicExample: NSObject, ZoomVideoSDKVirtualAudioMic {
    var audioSender: ZoomVideoSDKAudioSender?
    func onMicInitialize(_ rawDataSender: ZoomVideoSDKAudioSender) {
        // Virtual Microphone has initialized, store rawDataSender to send raw audio data into session later
        audioSender = rawDataSender
    }
    func onMicStartSend() {
        guard let audioSender = audioSender else { return }
        // Virtual Microphone can begin sending raw audio into session
        // Provide your audio data here. Replace these placeholders with a real
        // PCM buffer, its length in bytes, and its sample rate.
        guard let yourAudioBuffer = UnsafeMutablePointer<Int8>(mutating: ("buffer here" as NSString).utf8String) else { return }
        let yourAudioDataLength = UInt(0)
        let yourAudioSampleRate = UInt(0)
        let sendRawAudioReturnStatus = audioSender.send(yourAudioBuffer, dataLength: yourAudioDataLength, sampleRate: yourAudioSampleRate)
        switch sendRawAudioReturnStatus {
        case .Errors_Success:
            print("Call to send raw audio succeeded")
        default:
            print("Call to send raw audio produced an error: \(sendRawAudioReturnStatus)")
        }
    }
    func onMicStopSend() {
        // Virtual Microphone has stopped sending raw audio
    }
    func onMicUninitialized() {
        // Virtual Microphone has been destroyed
        audioSender = nil
    }
}

Need help?

If you're looking for help, try Developer Support or our Developer Forum. Priority support is also available with Premier Developer Support plans.