iOS – Deep copy of a CMSampleBufferRef

I am trying to perform a deep copy of a CMSampleBufferRef for audio and video connections. I need to keep this buffer around for delayed processing. Can someone help by pointing me to sample code?

Thanks

Solution

I solved this problem.

I needed access to the sample data for a long period of time.

I tried many approaches:

CVPixelBufferRetain -----> the program crashed
CVPixelBufferPool -----> the program crashed
CVPixelBufferCreateWithBytes ----> it can solve the problem, but it degrades performance and Apple does not recommend it (see the sketch after this list)

CMSampleBufferCreateCopy ---> this works, and Apple recommends it.
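For comparison, a deep pixel copy in the spirit of the CVPixelBufferCreateWithBytes approach might look like the sketch below. This is only an illustration, not code from the original answer: it assumes a non-planar pixel format such as kCVPixelFormatType_32BGRA, and the helper names (ACDeepCopyPixelBuffer, ACReleasePixelBytes) are made up. The memcpy into freshly allocated memory is what makes this slower than CMSampleBufferCreateCopy:

#import <CoreVideo/CoreVideo.h>
#include <stdlib.h>
#include <string.h>

// Frees the malloc'd pixel data once the copied CVPixelBuffer is done with it.
static void ACReleasePixelBytes(void *releaseRefCon, const void *baseAddress) {
    free((void *)baseAddress);
}

// Deep-copies a non-planar pixel buffer (e.g. 32BGRA) into newly allocated memory.
static CVPixelBufferRef ACDeepCopyPixelBuffer(CVPixelBufferRef source) {
    CVPixelBufferLockBaseAddress(source, kCVPixelBufferLock_ReadOnly);

    size_t width       = CVPixelBufferGetWidth(source);
    size_t height      = CVPixelBufferGetHeight(source);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(source);
    OSType format      = CVPixelBufferGetPixelFormatType(source);

    // Copy the pixel data out of the capture pool's memory.
    size_t dataSize = CVPixelBufferGetDataSize(source);
    void *copiedBytes = malloc(dataSize);
    memcpy(copiedBytes, CVPixelBufferGetBaseAddress(source), dataSize);

    CVPixelBufferUnlockBaseAddress(source, kCVPixelBufferLock_ReadOnly);

    CVPixelBufferRef copy = NULL;
    CVReturn result = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                   width, height, format,
                                                   copiedBytes, bytesPerRow,
                                                   ACReleasePixelBytes, NULL,
                                                   NULL, &copy);
    if (result != kCVReturnSuccess) {
        free(copiedBytes);
        return NULL;
    }
    return copy; // the caller releases it with CVPixelBufferRelease
}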

From the documentation: To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture, where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped. If your application is causing samples to be dropped by retaining the provided CMSampleBuffer objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then calling CFRelease on the sample buffer (if it was previously retained) so that the memory it references can be reused.

REF:https://developer.apple.com/reference/avfoundation/avcapturefileoutputdelegate/1390096-captureoutput

This may be what you need:

#pragma mark - captureOutput

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
    if (connection == m_videoConnection) {        
        /* If the previous m_sampleBuffer was never consumed, it must be released here;
           otherwise the retained buffer holds on to the capture pool's memory and causes samples to be dropped. */
        if (m_sampleBuffer) {
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = nil;
        }

        OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &m_sampleBuffer);
        if (noErr != status) {
            m_sampleBuffer = nil;
        }
        NSLog(@"m_sampleBuffer = %p sampleBuffer= %p",m_sampleBuffer,sampleBuffer);
    }
}
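The delegate method above assumes that m_videoConnection, m_sampleBuffer, and the m_readVideoData serial queue already exist. A minimal capture setup that would provide them could look like the following sketch; the m_session ivar and the queue label are illustrative, not from the original answer. Using the same serial queue for the sample buffer delegate and for readVideoFrame: keeps all access to m_sampleBuffer on one queue:

#pragma mark - capture setup (sketch)

- (void)setupCapture{
    m_session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([m_session canAddInput:input]) {
        [m_session addInput:input];
    }

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Deliver frames on a dedicated serial queue; readVideoFrame: dispatch_syncs onto the same queue.
    m_readVideoData = dispatch_queue_create("read.video.data", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:m_readVideoData];
    if ([m_session canAddOutput:output]) {
        [m_session addOutput:output];
    }

    // Keep the video connection so captureOutput:... can tell video samples from audio samples.
    m_videoConnection = [output connectionWithMediaType:AVMediaTypeVideo];

    [m_session startRunning];
}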

#pragma mark - get a CVPixelBufferRef that can be used for a long time

- (ACResult) readVideoFrame: (CVPixelBufferRef *)pixelBuffer{
    while (1) {
        dispatch_sync(m_readVideoData,^{
            if (!m_sampleBuffer) {
                _readDataSuccess = NO;
                return;
            }

            CMSampleBufferRef sampleBufferCopy = nil;
            OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, m_sampleBuffer, &sampleBufferCopy);
            if (noErr == status)
            {
                 CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBufferCopy);

                 // Retain the pixel buffer so it stays valid after the copy is released;
                 // the caller is responsible for balancing this with CVPixelBufferRelease.
                 CVPixelBufferRetain(buffer);
                 *pixelBuffer = buffer;

                 _readDataSuccess = YES;

                 NSLog(@"m_sampleBuffer = %p ",m_sampleBuffer);

                 // Release both the copy and the original so the capture pool's memory can be reused.
                 CFRelease(sampleBufferCopy);
                 CFRelease(m_sampleBuffer);
                 m_sampleBuffer = nil;
             }
             else{
                 _readDataSuccess = NO;
                 CFRelease(m_sampleBuffer);
                 m_sampleBuffer = nil;
             }
        });

        if (_readDataSuccess) {
            _readDataSuccess = NO;
            return ACResultNoErr;
        }
        else{
            usleep(15*1000);
            continue;
        }
    }
}
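readVideoFrame: blocks its caller on purpose: it polls on the m_readVideoData queue, sleeping about 15 ms between attempts, until captureOutput:... has stored a fresh sample; it then hands out a retained pixel buffer and releases m_sampleBuffer so the capture pool's memory can be reused. That is the "copy the data, then CFRelease" pattern from the documentation quoted above. A dispatch semaphore could replace the polling loop, but the sleep keeps the example simple.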

Then you can use it like this:

-(void)getCaptureVideoDataToEncode{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(){
        while (1) {
            CVPixelBufferRef buffer = NULL;
            ACResult result = [videoCapture readVideoFrame:&buffer];
            if (ACResultNoErr == result) {
                ACResult error = [videoEncode encoder:buffer outputPacket:&streamPacket];
                if (buffer) {
                    CVPixelBufferRelease(buffer);
                    buffer = NULL;
                }
                if (ACResultNoErr == error) {
                    NSLog(@"encode success");
                }
            }
        }
    });
}
