ios – Can I encode an AAC stream to memory using AVCaptureSession?

I am writing an iOS app that streams video and audio over the network.

I am using AVCaptureSession with an AVCaptureVideoDataOutput to grab raw video frames and encode them in software with x264. That works great.

I would like to do the same for audio, except that I don't need as much control on the audio side, so I would like to use the built-in hardware encoder to produce an AAC stream. That means using the Audio Converter from the AudioToolbox layer. To do so, I put in a handler for AVCaptureAudioDataOutput's audio frames:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // get the audio samples into a common buffer _pcmBuffer
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &_pcmBufferSize, &_pcmBuffer);

    // use AudioConverter to encode the PCM samples into AAC
    UInt32 outputPacketsCount = 1;
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize = sizeof(_aacBuffer);
    bufferList.mBuffers[0].mData = _aacBuffer;
    OSStatus st = AudioConverterFillComplexBuffer(_converter, converter_callback, (__bridge void *)self, &outputPacketsCount, &bufferList, NULL);
    if (0 == st) {
        // ... send bufferList.mBuffers[0].mDataByteSize bytes from _aacBuffer ...
    }
}
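For context on the "send the bytes" step: if the encoded packets go over the network as a raw AAC stream, each packet typically needs a 7-byte ADTS header in front of it so the receiver can find packet boundaries and know the format. A minimal sketch in plain C; the profile, sample-rate index, and channel count below are assumptions chosen to match the mono setup in the question, not values from the original post:

```c
#include <stddef.h>
#include <stdint.h>

// Write a 7-byte ADTS header (no CRC) for one raw AAC packet.
// Assumed parameters: AAC-LC (audio object type 2), 44.1 kHz
// (sample-rate index 4), mono (channel configuration 1).
static void adts_header(uint8_t out[7], size_t aacPacketLen)
{
    const int profile  = 2;                   // AAC-LC audio object type
    const int srIndex  = 4;                   // 44100 Hz
    const int channels = 1;                   // mono
    const size_t frameLen = aacPacketLen + 7; // length includes the header

    out[0] = 0xFF;  // syncword, high 8 bits
    out[1] = 0xF1;  // syncword low 4 bits, MPEG-4, layer 0, no CRC
    out[2] = (uint8_t)(((profile - 1) << 6) | (srIndex << 2) | (channels >> 2));
    out[3] = (uint8_t)(((channels & 3) << 6) | ((frameLen >> 11) & 0x03));
    out[4] = (uint8_t)((frameLen >> 3) & 0xFF);
    out[5] = (uint8_t)(((frameLen & 0x07) << 5) | 0x1F);  // + buffer fullness
    out[6] = 0xFC;  // buffer fullness low bits, 1 AAC frame per ADTS frame
}
```

The receiver can then feed the stream to any ADTS-aware decoder; without some framing like this, raw AAC packets from the converter are just opaque byte blobs.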

The callback function for the audio converter is quite simple in this case (assuming the packet sizes and counts are set up properly):

- (void)putPcmSamplesInBufferList:(AudioBufferList *)bufferList withCount:(UInt32 *)count
{
    bufferList->mBuffers[0].mData = _pcmBuffer;
    bufferList->mBuffers[0].mDataByteSize = _pcmBufferSize;
}

And the setup of the audio converter looks like this:

{
    // ...
    AudioStreamBasicDescription pcmASBD = {0};
    pcmASBD.mSampleRate = ((AVAudioSession *)[AVAudioSession sharedInstance]).currentHardwareSampleRate;
    pcmASBD.mFormatID = kAudioFormatLinearPCM;
    pcmASBD.mFormatFlags = kAudioFormatFlagsCanonical;
    pcmASBD.mChannelsPerFrame = 1;
    pcmASBD.mBytesPerFrame = sizeof(AudioSampleType);
    pcmASBD.mFramesPerPacket = 1;
    pcmASBD.mBytesPerPacket = pcmASBD.mBytesPerFrame * pcmASBD.mFramesPerPacket;
    pcmASBD.mBitsPerChannel = 8 * pcmASBD.mBytesPerFrame;

    AudioStreamBasicDescription aacASBD = {0};
    aacASBD.mFormatID = kAudioFormatMPEG4AAC;
    aacASBD.mSampleRate = pcmASBD.mSampleRate;
    aacASBD.mChannelsPerFrame = pcmASBD.mChannelsPerFrame;
    UInt32 size = sizeof(aacASBD);
    AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &aacASBD);

    AudioConverterNew(&pcmASBD, &aacASBD, &_converter);
    // ...
}
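The PCM side of the ASBD is pure arithmetic over the field definitions: one frame holds one sample per channel, and since uncompressed PCM has one frame per packet, the packet size equals the frame size. A quick sanity check of those relationships in plain C (assuming `AudioSampleType` is a 16-bit signed integer, which was the canonical iOS sample type at the time):

```c
#include <stdint.h>

typedef int16_t AudioSampleType;  // assumed canonical 16-bit sample type

// Mirror the size arithmetic used when filling in pcmASBD above.
static void pcm_sizes(unsigned *bytesPerFrame, unsigned *bytesPerPacket,
                      unsigned *bitsPerChannel)
{
    const unsigned channelsPerFrame = 1;  // mono, as in the question
    const unsigned framesPerPacket  = 1;  // uncompressed PCM
    *bytesPerFrame  = (unsigned)sizeof(AudioSampleType) * channelsPerFrame;
    *bytesPerPacket = *bytesPerFrame * framesPerPacket;
    *bitsPerChannel = 8 * (unsigned)sizeof(AudioSampleType);
}
```

For 16-bit mono this yields 2 bytes per frame, 2 bytes per packet, and 16 bits per channel; if any of these disagree with the actual buffer layout, the converter will misinterpret the PCM data.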

This all seems straightforward, only it doesn't work. Once the AVCaptureSession is running, the audio converter (specifically AudioConverterFillComplexBuffer) returns an 'hwiu' (hardware in use) error. The conversion works fine if the session is stopped, but then I can't capture anything...
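As an aside on reading that error: Core Audio status codes such as 'hwiu' are four-character codes packed into a 32-bit OSStatus (the `OSSTATUS` macro used in the answer below is presumably a helper of this kind). A small sketch of such a decoder in plain C; the function name is hypothetical:

```c
#include <stdint.h>
#include <stdio.h>

typedef int32_t OSStatus;  // stand-in for the Core Audio typedef

// Decode an OSStatus into its four-character code, e.g. 'hwiu'.
// Falls back to the decimal value when the bytes are not printable.
static void fourcc_string(OSStatus status, char out[16])
{
    uint32_t u = (uint32_t)status;
    char c[4] = {
        (char)(u >> 24), (char)(u >> 16), (char)(u >> 8), (char)u
    };
    int printable = 1;
    for (int i = 0; i < 4; i++)
        if (c[i] < 0x20 || c[i] > 0x7E) printable = 0;
    if (printable)
        snprintf(out, 16, "'%c%c%c%c'", c[0], c[1], c[2], c[3]);
    else
        snprintf(out, 16, "%d", (int)status);
}
```

Logging the four-character form makes errors easy to look up in the headers; 'hwiu' corresponds to kAudioConverterErr_HardwareInUse.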

I am wondering whether there is a way to get an AAC stream out of AVCaptureSession. The options I am considering are:

1. Somehow use an AVAssetWriterInput to encode audio samples into AAC, and then get at the encoded packets somehow (not through an AVAssetWriter, which would only write to a file).
2. Reorganize the app so that it uses AVCaptureSession only on the video side and uses Audio Queues on the audio side. That would make flow control (starting and stopping recording, responding to interruptions) more complicated, and I'm afraid it might cause synchronization problems between the audio and the video. Also, it just doesn't seem like a good design.

Does anyone know whether getting AAC out of AVCaptureSession is possible? Do I have to use Audio Queues here? Could that get me into synchronization or control problems?

Solution

I ended up asking Apple for advice (which you can do if you have a paid developer account).

It seems that AVCaptureSession grabs hold of the AAC hardware encoder, but only lets you use it to write directly to a file.

You can use the software encoder, but you have to ask for it specifically by using AudioConverterNewSpecific instead of AudioConverterNew:

AudioClassDescription *description = [self
        getAudioClassDescriptionWithType:kAudioFormatMPEG4AAC
                        fromManufacturer:kAppleSoftwareAudioCodecManufacturer];
if (!description) {
    return false;
}
// see the question for setting up pcmASBD and aacASBD
OSStatus st = AudioConverterNewSpecific(&pcmASBD, &aacASBD, 1, description, &_converter);
if (st) {
    NSLog(@"error creating audio converter: %s", OSSTATUS(st));
    return false;
}

- (AudioClassDescription *)getAudioClassDescriptionWithType:(UInt32)type
                                           fromManufacturer:(UInt32)manufacturer
{
    static AudioClassDescription desc;

    UInt32 encoderSpecifier = type;
    OSStatus st;

    UInt32 size;
    st = AudioFormatGetPropertyInfo(kAudioFormatProperty_Encoders, sizeof(encoderSpecifier), &encoderSpecifier, &size);
    if (st) {
        NSLog(@"error getting audio format property info: %s", OSSTATUS(st));
        return nil;
    }

    unsigned int count = size / sizeof(AudioClassDescription);
    AudioClassDescription descriptions[count];
    st = AudioFormatGetProperty(kAudioFormatProperty_Encoders, sizeof(encoderSpecifier), &encoderSpecifier, &size, descriptions);
    if (st) {
        NSLog(@"error getting audio format property: %s", OSSTATUS(st));
        return nil;
    }

    for (unsigned int i = 0; i < count; i++) {
        if ((type == descriptions[i].mSubType) &&
            (manufacturer == descriptions[i].mManufacturer)) {
            memcpy(&desc, &(descriptions[i]), sizeof(desc));
            return &desc;
        }
    }

    return nil;
}

Of course, the software encoder takes up CPU resources, but it gets the job done.
