ios – Can AVAudioEngine read from a file, process it with an audio unit, and write to a file faster than real time?

I'm working on an iOS app that uses AVAudioEngine for various things, including recording audio to a file, applying effects to that audio with audio units, and playing back the audio with the effects applied. I also use a tap to write the output to a file. When this is done, it writes to the file in real time as the audio plays back.

Is it possible to set up an AVAudioEngine graph that reads from a file, processes the sound with an audio unit, and outputs to a file, but faster than real time (i.e., as fast as the hardware can process it)? The use case is to output several minutes of audio with effects applied, and I certainly don't want to wait several minutes for it to be processed.

Edit: here's the code I use to set up the AVAudioEngine graph and play a sound file:

    AVAudioEngine* engine = [[AVAudioEngine alloc] init];

    AVAudioPlayerNode* player = [[AVAudioPlayerNode alloc] init];
    [engine attachNode:player];

    self.player = player;
    self.engine = engine;

    if (!self.distortionEffect) {
        self.distortionEffect = [[AVAudioUnitDistortion alloc] init];
        [self.engine attachNode:self.distortionEffect];
        [self.engine connect:self.player to:self.distortionEffect format:[self.distortionEffect outputFormatForBus:0]];
        AVAudioMixerNode* mixer = [self.engine mainMixerNode];
        [self.engine connect:self.distortionEffect to:mixer format:[mixer outputFormatForBus:0]];
    }

    [self.distortionEffect loadFactoryPreset:AVAudioUnitDistortionPresetDrumsBitBrush];

    NSError* error;
    if (![self.engine startAndReturnError:&error]) {
        NSLog(@"error: %@", error);
    } else {
        NSURL* fileURL = [[NSBundle mainBundle] URLForResource:@"test2" withExtension:@"mp3"];
        AVAudioFile* file = [[AVAudioFile alloc] initForReading:fileURL error:&error];

        if (error) {
            NSLog(@"error: %@", error);
        } else {
            [self.player scheduleFile:file atTime:nil completionHandler:nil];
            [self.player play];
        }
    }

The above code plays the sound in the test2.mp3 file and applies the AVAudioUnitDistortionPresetDrumsBitBrush distortion preset in real time.

I then modified the code above by adding these lines after [self.player play]:

    [self.engine stop];
    [self renderAudioAndWriteToFile];

I modified the renderAudioAndWriteToFile method Vladimir provided so that instead of allocating a new AVAudioEngine on its first line, it simply uses self.engine, which has already been set up.

However, in renderAudioAndWriteToFile it logs "Can not render audio unit" because AudioUnitRender returns a status of kAudioUnitErr_Uninitialized.

Edit 2: I should mention that I'm perfectly happy to convert the AVAudioEngine code I posted to use the C APIs if that would make things easier. However, I'd like the code to produce the same output as the AVAudioEngine code (including use of the factory preset shown above).

Solution

1. Configure your engine and player node.
2. Call the play method on your player node.
3. Pause the engine.
4. Get the audio unit from the AVAudioOutputNode (audioEngine.outputNode) via its audioUnit property.
5. In a loop, render from the audio unit with AudioUnitRender, and write the audio buffer list to a file with Extended Audio File Services.

Example:

Audio engine configuration:

    - (void)configureAudioEngine {
        self.engine = [[AVAudioEngine alloc] init];
        self.playerNode = [[AVAudioPlayerNode alloc] init];
        [self.engine attachNode:self.playerNode];
        AVAudioUnitDistortion *distortionEffect = [[AVAudioUnitDistortion alloc] init];
        [self.engine attachNode:distortionEffect];
        [self.engine connect:self.playerNode to:distortionEffect format:[distortionEffect outputFormatForBus:0]];
        self.mixer = [self.engine mainMixerNode];
        [self.engine connect:distortionEffect to:self.mixer format:[self.mixer outputFormatForBus:0]];
        [distortionEffect loadFactoryPreset:AVAudioUnitDistortionPresetDrumsBitBrush];
        NSError* error;
        if (![self.engine startAndReturnError:&error])
            NSLog(@"Can't start engine: %@", error);
        else
            [self scheduleFileToPlay];
    }

    - (void)scheduleFileToPlay {
        NSError* error;
        NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"filename" withExtension:@"m4a"];
        self.file = [[AVAudioFile alloc] initForReading:fileURL error:&error];
        if (self.file)
            [self.playerNode scheduleFile:self.file atTime:nil completionHandler:nil];
        else
            NSLog(@"Can't read file: %@", error);
    }

Render method:

    - (void)renderAudioAndWriteToFile {
        [self.playerNode play];
        [self.engine pause];
        AVAudioOutputNode *outputNode = self.engine.outputNode;
        AudioStreamBasicDescription const *audioDescription = [outputNode outputFormatForBus:0].streamDescription;
        NSString *path = [self filePath];
        ExtAudioFileRef audioFile = [self createAndSetupExtAudioFileWithASBD:audioDescription andFilePath:path];
        if (!audioFile)
            return;
        AVURLAsset *asset = [AVURLAsset assetWithURL:self.file.url];
        NSTimeInterval duration = CMTimeGetSeconds(asset.duration);
        NSUInteger lengthInFrames = duration * audioDescription->mSampleRate;
        const NSUInteger kBufferLength = 4096;
        AudioBufferList *bufferList = AEAllocateAndInitAudioBufferList(*audioDescription, kBufferLength);
        AudioTimeStamp timeStamp;
        memset(&timeStamp, 0, sizeof(timeStamp));
        timeStamp.mFlags = kAudioTimeStampSampleTimeValid;
        OSStatus status = noErr;
        for (NSUInteger i = kBufferLength; i < lengthInFrames; i += kBufferLength) {
            status = [self renderToBufferList:bufferList writeToFile:audioFile bufferLength:kBufferLength timeStamp:&timeStamp];
            if (status != noErr)
                break;
        }
        if (status == noErr && timeStamp.mSampleTime < lengthInFrames) {
            NSUInteger restBufferLength = (NSUInteger)(lengthInFrames - timeStamp.mSampleTime);
            AudioBufferList *restBufferList = AEAllocateAndInitAudioBufferList(*audioDescription, restBufferLength);
            status = [self renderToBufferList:restBufferList writeToFile:audioFile bufferLength:restBufferLength timeStamp:&timeStamp];
            AEFreeAudioBufferList(restBufferList);
        }
        AEFreeAudioBufferList(bufferList);
        ExtAudioFileDispose(audioFile);
        if (status != noErr)
            NSLog(@"An error has occurred");
        else
            NSLog(@"Finished writing to file at path: %@", path);
    }

    - (NSString *)filePath {
        NSArray *documentsFolders =
            NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *fileName = [NSString stringWithFormat:@"%@.m4a", [[NSUUID UUID] UUIDString]];
        NSString *path = [documentsFolders[0] stringByAppendingPathComponent:fileName];
        return path;
    }

    - (ExtAudioFileRef)createAndSetupExtAudioFileWithASBD:(AudioStreamBasicDescription const *)audioDescription
                                              andFilePath:(NSString *)path {
        AudioStreamBasicDescription destinationFormat;
        memset(&destinationFormat, 0, sizeof(destinationFormat));
        destinationFormat.mChannelsPerFrame = audioDescription->mChannelsPerFrame;
        destinationFormat.mSampleRate = audioDescription->mSampleRate;
        destinationFormat.mFormatID = kAudioFormatMPEG4AAC;
        ExtAudioFileRef audioFile;
        OSStatus status = ExtAudioFileCreateWithURL(
            (__bridge CFURLRef)[NSURL fileURLWithPath:path],
            kAudioFileM4AType,
            &destinationFormat,
            NULL,
            kAudioFileFlags_EraseFile,
            &audioFile
        );
        if (status != noErr) {
            NSLog(@"Can not create ext audio file");
            return NULL;
        }
        UInt32 codecManufacturer = kAppleSoftwareAudioCodecManufacturer;
        status = ExtAudioFileSetProperty(
            audioFile, kExtAudioFileProperty_CodecManufacturer, sizeof(UInt32), &codecManufacturer
        );
        status = ExtAudioFileSetProperty(
            audioFile, kExtAudioFileProperty_ClientDataFormat, sizeof(AudioStreamBasicDescription), audioDescription
        );
        // Priming call: writing zero frames asynchronously initializes the async mechanism.
        status = ExtAudioFileWriteAsync(audioFile, 0, NULL);
        if (status != noErr) {
            NSLog(@"Can not setup ext audio file");
            return NULL;
        }
        return audioFile;
    }

    - (OSStatus)renderToBufferList:(AudioBufferList *)bufferList
                       writeToFile:(ExtAudioFileRef)audioFile
                      bufferLength:(NSUInteger)bufferLength
                         timeStamp:(AudioTimeStamp *)timeStamp {
        [self clearBufferList:bufferList];
        AudioUnit outputUnit = self.engine.outputNode.audioUnit;
        AudioUnitRenderActionFlags flags = 0;
        OSStatus status = AudioUnitRender(outputUnit, &flags, timeStamp, 0, (UInt32)bufferLength, bufferList);
        if (status != noErr) {
            NSLog(@"Can not render audio unit");
            return status;
        }
        timeStamp->mSampleTime += bufferLength;
        status = ExtAudioFileWrite(audioFile, (UInt32)bufferLength, bufferList);
        if (status != noErr)
            NSLog(@"Can not write audio to file");
        return status;
    }

    - (void)clearBufferList:(AudioBufferList *)bufferList {
        for (int bufferIndex = 0; bufferIndex < bufferList->mNumberBuffers; bufferIndex++) {
            memset(bufferList->mBuffers[bufferIndex].mData, 0, bufferList->mBuffers[bufferIndex].mDataByteSize);
        }
    }

I used a few functions from this cool framework:

    AudioBufferList *AEAllocateAndInitAudioBufferList(AudioStreamBasicDescription audioFormat, int frameCount) {
        int numberOfBuffers = audioFormat.mFormatFlags & kAudioFormatFlagIsNonInterleaved ? audioFormat.mChannelsPerFrame : 1;
        int channelsPerBuffer = audioFormat.mFormatFlags & kAudioFormatFlagIsNonInterleaved ? 1 : audioFormat.mChannelsPerFrame;
        int bytesPerBuffer = audioFormat.mBytesPerFrame * frameCount;
        AudioBufferList *audio = malloc(sizeof(AudioBufferList) + (numberOfBuffers-1)*sizeof(AudioBuffer));
        if ( !audio ) {
            return NULL;
        }
        audio->mNumberBuffers = numberOfBuffers;
        for ( int i=0; i<numberOfBuffers; i++ ) {
            if ( bytesPerBuffer > 0 ) {
                audio->mBuffers[i].mData = calloc(bytesPerBuffer, 1);
                if ( !audio->mBuffers[i].mData ) {
                    for ( int j=0; j<i; j++ ) free(audio->mBuffers[j].mData);
                    free(audio);
                    return NULL;
                }
            } else {
                audio->mBuffers[i].mData = NULL;
            }
            audio->mBuffers[i].mDataByteSize = bytesPerBuffer;
            audio->mBuffers[i].mNumberChannels = channelsPerBuffer;
        }
        return audio;
    }

    void AEFreeAudioBufferList(AudioBufferList *bufferList ) {
        for ( int i=0; i<bufferList->mNumberBuffers; i++ ) {
            if ( bufferList->mBuffers[i].mData ) free(bufferList->mBuffers[i].mData);
        }
        free(bufferList);
    }
