Raw Data

The Video SDK provides you with an option to access and send real-time raw audio and video data of a session. After accessing the raw data, you can process it and apply additional effects to it to enhance the session experience.

This section describes how to enable and access the raw data of a session.

Specify Raw Data Memory Mode

To receive raw data of any type, you must first select the memory mode you want to use. The SDK supports heap-based and stack-based memory modes.

Stack-Based Memory

  • Variables are allocated automatically and deallocated when they go out of scope.
  • Variables are not accessible from or transferable to other threads.
  • Typically has faster access than heap-based memory allocation.
  • Memory space is managed by the CPU and will not become fragmented.
  • Variables cannot be resized.

Heap-Based Memory

  • Variables are allocated and deallocated manually. You must allocate and free variables to avoid memory leaks.
  • Variables can be accessed globally.
  • Has relatively slower access than stack-based memory allocation.
  • Has no guarantee on the efficiency of memory space and can become fragmented.
  • Variables can be resized.

For more information on these types of memory allocation, refer to the Stack-based memory allocation and Heap-based dynamic memory allocation guides.

After determining which memory mode is right for you, specify it when the SDK is initialized. Note that the memory mode must be set individually for the audio, video, and share data types. The two memory allocation modes are defined with the following enum values:

Stack-based Memory:

ZoomVideoSDKRawDataMemoryMode.ZoomVideoSDKRawDataMemoryModeStack

Heap-based Memory:

ZoomVideoSDKRawDataMemoryMode.ZoomVideoSDKRawDataMemoryModeHeap

To specify the raw data memory mode, assign one of these enum values to the corresponding fields of the ZoomVideoSDKInitParams object during SDK initialization.

Java:

ZoomVideoSDKRawDataMemoryMode modeHeap = ZoomVideoSDKRawDataMemoryMode.ZoomVideoSDKRawDataMemoryModeHeap;
ZoomVideoSDKInitParams params = new ZoomVideoSDKInitParams();
params.videoRawDataMemoryMode = modeHeap; // Set video memory to heap
params.audioRawDataMemoryMode = modeHeap; // Set audio memory to heap
params.shareRawDataMemoryMode = modeHeap; // Set share memory to heap

Kotlin:

val modeHeap = ZoomVideoSDKRawDataMemoryMode.ZoomVideoSDKRawDataMemoryModeHeap
val params = ZoomVideoSDKInitParams().apply {
    videoRawDataMemoryMode = modeHeap // Set video memory to heap
    audioRawDataMemoryMode = modeHeap // Set audio memory to heap
    shareRawDataMemoryMode = modeHeap // Set share memory to heap
}
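
After configuring the memory modes, pass the params object to the SDK when you initialize it. The following is a minimal sketch, assuming the standard ZoomVideoSDK.getInstance().initialize(context, params) entry point and that the remaining required ZoomVideoSDKInitParams fields (such as your domain and logging options) are set elsewhere:

// Minimal sketch: initialize the SDK with the raw data memory modes configured above.
// Assumes the standard ZoomVideoSDK.getInstance().initialize(Context, ZoomVideoSDKInitParams)
// entry point; the other required ZoomVideoSDKInitParams fields are omitted here.
int initResult = ZoomVideoSDK.getInstance().initialize(applicationContext, params);
// Check initResult against ZoomVideoSDKErrors before joining a session and subscribing to raw data.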

Access Raw Video Data

Raw video data is encoded in the YUV420p (I420) format, a planar format commonly consumed by renderers based on OpenGL ES.

To access and modify the video data, you will need to do the following:

  1. Implement an instance of the ZoomVideoSDKRawDataPipeDelegate.
  2. Use callback functions provided by the ZoomVideoSDKRawDataPipeDelegate to receive each frame of the raw video data.
  3. Pass the delegate into the video pipe of a specific user.

Java:

ZoomVideoSDKRawDataPipeDelegate dataDelegate = new ZoomVideoSDKRawDataPipeDelegate() {
    @Override
    public void onRawDataFrameReceived(ZoomVideoSDKVideoRawData rawData) {
        // Each frame of raw video data arrives here
    }

    @Override
    public void onRawDataStatusChanged(ZoomVideoSDKRawDataPipeDelegate.RawDataStatus status) {
        // The raw data status for this pipe has changed
    }
};

ZoomVideoSDKRawDataPipe pipe = user.getVideoPipe();
pipe.subscribe(ZoomVideoSDKVideoResolution.VideoResolution_360P, dataDelegate);

Kotlin:

val dataDelegate = object : ZoomVideoSDKRawDataPipeDelegate {
    override fun onRawDataFrameReceived(rawData: ZoomVideoSDKVideoRawData?) {
        // Each frame of raw video data arrives here
    }

    override fun onRawDataStatusChanged(status: ZoomVideoSDKRawDataPipeDelegate.RawDataStatus?) {
        // The raw data status for this pipe has changed
    }
}

val pipe = user.videoPipe
pipe.subscribe(ZoomVideoSDKVideoResolution.VideoResolution_360P, dataDelegate)

Each frame of video data will be made available through the ZoomVideoSDKVideoRawData object. Various pieces of data can be accessed through this object in the onRawDataFrameReceived callback:

Java:

int height = rawData.getStreamHeight();
int width = rawData.getStreamWidth();
ByteBuffer uBuffer = rawData.getuBuffer();
ByteBuffer vBuffer = rawData.getvBuffer();
ByteBuffer yBuffer = rawData.getyBuffer();

Kotlin:

val height = rawData.streamHeight
val width = rawData.streamWidth
val uBuffer = rawData.getuBuffer()
val vBuffer = rawData.getvBuffer()
val yBuffer = rawData.getyBuffer()
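
If your processing code expects one contiguous I420 buffer instead of three separate planes, you can copy the planes out inside onRawDataFrameReceived. The following is a minimal sketch; it assumes each plane is tightly packed (no per-row stride padding) and that the ByteBuffer positions start at the beginning of each plane:

// Minimal sketch: copy the Y, U, and V planes into one contiguous I420 byte array.
// Assumes tightly packed planes (no stride padding) and buffer positions at the start of each plane.
int width = rawData.getStreamWidth();
int height = rawData.getStreamHeight();
int ySize = width * height;
int chromaSize = ySize / 4; // U and V planes are quarter-sized in YUV420p

byte[] i420Frame = new byte[ySize + 2 * chromaSize];
rawData.getyBuffer().get(i420Frame, 0, ySize);
rawData.getuBuffer().get(i420Frame, ySize, chromaSize);
rawData.getvBuffer().get(i420Frame, ySize + chromaSize, chromaSize);
// i420Frame now holds the full frame and can be handed off to your own processing code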

Access Raw Audio Data

Through your implementation of ZoomVideoSDKDelegate, you can access mixed raw audio data (the combined audio output from one or more users, as heard in the session) and per-user raw audio data.

Note: Unlike raw video data, raw audio data will default to stack-based memory if you do not specify a memory mode.

You can also receive raw audio if it was sent through ZoomVideoSDKVirtualAudioMic. To do so:

  1. Create an instance of ZoomVideoSDKVirtualAudioSpeaker.
  2. Pass that instance into ZoomVideoSDKSessionContext.
  3. Access raw data in each callback method (see the ZoomVideoSDKVirtualAudioSpeaker example under Send Raw Audio Data below).

To access raw audio data, implement the following callbacks in your ZoomVideoSDKDelegate listener:

Java:

@Override
public void onMixedAudioRawDataReceived(ZoomVideoSDKAudioRawData rawData) {
    // Access mixed audio data for the whole session here
}

@Override
public void onOneWayAudioRawDataReceived(ZoomVideoSDKAudioRawData rawData, ZoomVideoSDKUser user) {
    // Access raw audio data for the provided user here
}

Kotlin:

override fun onMixedAudioRawDataReceived(rawData: ZoomVideoSDKAudioRawData?) {
    // Access mixed audio data for the whole session here
}

override fun onOneWayAudioRawDataReceived(rawData: ZoomVideoSDKAudioRawData?, user: ZoomVideoSDKUser) {
    // Access raw audio data for the provided user here
}

From within these callbacks, you can access the data buffer through the rawData.buffer property.
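
If you need the audio after a callback returns, copy it out of the SDK-owned buffer while still inside the callback. In the sketch below, getBufferLen() and a ByteBuffer-typed getBuffer() are assumptions; verify the accessor names against the ZoomVideoSDKAudioRawData reference for your SDK version.

@Override
public void onOneWayAudioRawDataReceived(ZoomVideoSDKAudioRawData rawData, ZoomVideoSDKUser user) {
    // Copy the PCM data into your own array before the callback returns.
    // getBufferLen() (buffer size in bytes) and a ByteBuffer-typed getBuffer() are assumed accessors.
    int length = rawData.getBufferLen();
    byte[] pcm = new byte[length];
    rawData.getBuffer().get(pcm, 0, length);
    // pcm can now be queued to your own processing pipeline
}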

For more information on implementing your ZoomVideoSDKDelegate, see Listen for Callback Event.

Send Raw Audio Data

In addition to receiving and processing raw audio data, you can also send raw or processed audio data from a user's device.

To do this:

  1. Create an instance of ZoomVideoSDKVirtualAudioMic.
  2. Pass that instance into ZoomVideoSDKSessionContext.
  3. Obtain ZoomVideoSDKAudioRawDataSender from the onMicInitialize callback.
  4. Use the send method to send raw audio data.

Java:

ZoomVideoSDKVirtualAudioMic virtualMic = new ZoomVideoSDKVirtualAudioMic() {
    @Override
    public void onMicInitialize(ZoomVideoSDKAudioRawDataSender sender) {
        // Store sender for later use
        // rawData, lengthInBytes, and sampleRate below represent your own audio data
        sender.send(rawData, lengthInBytes, sampleRate);
    }

    @Override
    public void onMicStartSend() {
        // You can now send audio
    }

    @Override
    public void onMicStopSend() {
        // You can no longer send audio
    }

    @Override
    public void onMicUninitialized() {
        // Mic is no longer active
    }
};

ZoomVideoSDKVirtualAudioSpeaker virtualSpeaker = new ZoomVideoSDKVirtualAudioSpeaker() {
    @Override
    public void onVirtualSpeakerMixedAudioReceived(ZoomVideoSDKAudioRawData rawData) {
        // Access session audio here
    }

    @Override
    public void onVirtualSpeakerOneWayAudioReceived(ZoomVideoSDKAudioRawData rawData, ZoomVideoSDKUser user) {
        // Access user-specific audio here
    }

    @Override
    public void onVirtualSpeakerShareAudioReceived(ZoomVideoSDKAudioRawData rawData) {
        // Access audio from shared content here
    }
};

ZoomVideoSDKSessionContext params = new ZoomVideoSDKSessionContext();
params.virtualAudioMic = virtualMic;
params.virtualAudioSpeaker = virtualSpeaker;

Kotlin:

val virtualMic = object : ZoomVideoSDKVirtualAudioMic {
    override fun onMicInitialize(sender: ZoomVideoSDKAudioRawDataSender?) {
        // Store sender for later use
        // rawData, lengthInBytes, and sampleRate below represent your own audio data
        sender?.send(rawData, lengthInBytes, sampleRate)
    }

    override fun onMicStartSend() {
        // You can now send audio
    }

    override fun onMicStopSend() {
        // You can no longer send audio
    }

    override fun onMicUninitialized() {
        // Mic is no longer active
    }
}

val virtualSpeaker = object : ZoomVideoSDKVirtualAudioSpeaker {
    override fun onVirtualSpeakerMixedAudioReceived(rawData: ZoomVideoSDKAudioRawData?) {
        // Access session audio here
    }

    override fun onVirtualSpeakerOneWayAudioReceived(rawData: ZoomVideoSDKAudioRawData?, user: ZoomVideoSDKUser?) {
        // Access user-specific audio here
    }

    override fun onVirtualSpeakerShareAudioReceived(rawData: ZoomVideoSDKAudioRawData?) {
        // Access audio from shared content here
    }
}

val params = ZoomVideoSDKSessionContext().apply {
    virtualAudioMic = virtualMic
    virtualAudioSpeaker = virtualSpeaker
}
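
For example, you might buffer PCM audio from your own capture or file source and push it through the stored sender only while sending is allowed. The sketch below uses a hypothetical helper class that your ZoomVideoSDKVirtualAudioMic implementation can delegate to; the 16-bit mono, 44100 Hz format and the send(byte[], int, int) parameter types are assumptions to verify against the ZoomVideoSDKAudioRawDataSender reference.

// Illustrative helper: holds the sender from onMicInitialize and pushes PCM chunks while allowed.
// The send(byte[] data, int lengthInBytes, int sampleRate) parameter types and the 44100 Hz,
// 16-bit mono PCM format are assumptions; verify them against the SDK reference.
public class VirtualMicPcmPusher {
    private ZoomVideoSDKAudioRawDataSender audioSender; // store this in onMicInitialize
    private volatile boolean canSend; // true between onMicStartSend and onMicStopSend

    public void onMicInitialize(ZoomVideoSDKAudioRawDataSender sender) { audioSender = sender; }
    public void onMicStartSend() { canSend = true; }
    public void onMicStopSend() { canSend = false; }

    // Push a chunk of 16-bit mono PCM at 44100 Hz (an assumed format for this sketch).
    public void pushPcmChunk(byte[] pcmChunk) {
        if (canSend && audioSender != null) {
            audioSender.send(pcmChunk, pcmChunk.length, 44100);
        }
    }
}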

Send Raw Video Data

In addition to receiving and processing raw video data, you can also send raw or processed video data from a user's device. To do this, you must provide an implementation of ZoomVideoSDKVideoSource in your ZoomVideoSDKSessionContext object when joining a session.

The Android SDK supports receiving video in the resolutions enumerated in the reference entry for ZoomVideoSDKVideoResolution on the Android SDK reference page, with the exception of 720p.

Java:

ZoomVideoSDKVideoSource source = new ZoomVideoSDKVideoSource() {
    @Override
    public void onInitialize(ZoomVideoSDKVideoSender sender, List<ZoomVideoSDKVideoCapability> capabilityList, ZoomVideoSDKVideoCapability suggestedCapability) {
        // buffer, width, height, frameLength, and rotation represent your own frame data
        sender.sendVideoFrame(buffer, width, height, frameLength, rotation);
    }

    @Override
    public void onPropertyChange(List<ZoomVideoSDKVideoCapability> capabilityList, ZoomVideoSDKVideoCapability suggestedCapability) {
        // Called when the supported or suggested video capabilities change
    }

    @Override
    public void onStartSend() {
        // You can now send video frames
    }

    @Override
    public void onStopSend() {
        // You can no longer send video frames
    }

    @Override
    public void onUninitialized() {
        // The video source is no longer active
    }
};

ZoomVideoSDKSessionContext params = new ZoomVideoSDKSessionContext();
params.externalVideoSource = source;

Kotlin:

val source = object : ZoomVideoSDKVideoSource {
    override fun onInitialize(sender: ZoomVideoSDKVideoSender?, capabilityList: MutableList<ZoomVideoSDKVideoCapability>, capability: ZoomVideoSDKVideoCapability?) {
        // buffer, width, height, frameLength, and rotation represent your own frame data
        sender?.sendVideoFrame(buffer, width, height, frameLength, rotation)
    }

    override fun onPropertyChange(capabilityList: MutableList<ZoomVideoSDKVideoCapability>, capability: ZoomVideoSDKVideoCapability?) {
        // Called when the supported or suggested video capabilities change
    }

    override fun onStartSend() {
        // You can now send video frames
    }

    override fun onStopSend() {
        // You can no longer send video frames
    }

    override fun onUninitialized() {
        // The video source is no longer active
    }
}

val params = ZoomVideoSDKSessionContext().apply {
    externalVideoSource = source
}
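
When sending frames, the buffer typically follows the same YUV420p layout described earlier, so a full frame occupies width × height × 3 / 2 bytes. Below is a minimal sketch of a send helper; the byte-array buffer type and the rotation-in-degrees parameter are assumptions to verify against the ZoomVideoSDKVideoSender reference:

// Minimal sketch: send one I420 (YUV420p) frame through the sender received in onInitialize.
// Assumes sendVideoFrame(byte[] buffer, int width, int height, int frameLength, int rotation);
// verify the exact parameter types against the ZoomVideoSDKVideoSender reference.
void sendFrame(ZoomVideoSDKVideoSender sender, byte[] i420Frame, int width, int height) {
    int frameLength = width * height * 3 / 2; // Y plane plus quarter-sized U and V planes
    if (sender != null && i420Frame.length >= frameLength) {
        sender.sendVideoFrame(i420Frame, width, height, frameLength, 0 /* rotation in degrees */);
    }
}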

Need help?

If you're looking for help, try Developer Support or our Developer Forum. Priority support is also available with Premier Developer Support plans.