The code I'm using below works fine when recording a video with UIImagePickerController, but if a video is picked from the camera roll I get this error, and I don't understand why:
Export Failed: Operation Stopped : Error
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped"
UserInfo=0x1815ca50 {NSLocalizedDescription=Operation Stopped,
NSLocalizedFailureReason=The video could not be composed.}
I have tried saving the video to another file first, but it makes no difference.
Here is the code I use to convert the video:
- (void)convertVideoToLowQuailtyAndFixRotationWithInputURL:(NSURL *)inputURL handler:(void (^)(NSURL *outURL))handler
{
    if ([[inputURL pathExtension] isEqualToString:@"MOV"]) {
        NSURL *outputURL = [inputURL URLByDeletingPathExtension];
        outputURL = [outputURL URLByAppendingPathExtension:@"mp4"];
        AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:inputURL options:nil];

        AVAssetTrack *sourceVideoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
        [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];

        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];

        AVMutableVideoComposition *videoComposition = [self getVideoComposition:avAsset];

        NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
        if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
            AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
            exportSession.outputURL = outputURL;
            exportSession.outputFileType = AVFileTypeMPEG4;
            exportSession.shouldOptimizeForNetworkUse = YES;
            exportSession.videoComposition = videoComposition;
            [exportSession exportAsynchronouslyWithCompletionHandler:^{
                switch ([exportSession status]) {
                    case AVAssetExportSessionStatusFailed:
                        NSLog(@"Export Failed: %@ : %@", [[exportSession error] localizedDescription], [exportSession error]);
                        handler(nil);
                        break;
                    case AVAssetExportSessionStatusCancelled:
                        NSLog(@"Export canceled");
                        handler(nil);
                        break;
                    default:
                        handler(outputURL);
                        break;
                }
            }];
        }
    } else {
        handler(inputURL);
    }
}

- (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

    CGSize videoSize = videoTrack.naturalSize;
    BOOL isPortrait_ = [self isVideoPortrait:asset];
    if (isPortrait_) {
        // NSLog(@"video is portrait ");
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    composition.naturalSize = videoSize;
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMakeWithSeconds(1 / videoTrack.nominalFrameRate, 600);

    AVMutableCompositionTrack *compositionVideoTrack;
    compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionLayerInstruction *layerInst;
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    inst.layerInstructions = [NSArray arrayWithObject:layerInst];
    videoComposition.instructions = [NSArray arrayWithObject:inst];
    return videoComposition;
}
Solution
Error code -11841 corresponds to AVErrorInvalidVideoComposition; see the AVFoundation error constants reference:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVFoundation_ErrorConstants/Reference/reference.html
While I don't see an obvious major error right away, I can suggest the following ways to narrow down the source of the problem.
First, don't pass nil as the error parameter in calls like this one:
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
Instead, create an NSError object and pass a reference to it, like this:
NSError *error = nil;
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:&error];
Check the error to make sure your video and audio tracks were inserted into the composition tracks correctly. The error should be nil if all went well:
if (error) NSLog(@"Insertion error: %@", error);
You may also want to check the composable, exportable, and hasProtectedContent properties of your AVAsset. If these are not YES, YES, and NO respectively, you may have a problem creating your new video file.
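A quick sanity check for these three properties might look like this (assuming avAsset is the AVURLAsset created in the question's code):

```objc
// Verify the source asset is usable before building the composition.
if (!avAsset.composable)
    NSLog(@"Asset cannot be used inside a composition");
if (!avAsset.exportable)
    NSLog(@"Asset cannot be exported with AVAssetExportSession");
if (avAsset.hasProtectedContent)
    NSLog(@"Asset contains protected (DRM) content and cannot be processed");
```

If any of these logs fire, the failure is in the source asset itself rather than in your composition code.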
Occasionally I've seen an issue where a composition doesn't like a time range built with a 600 timescale for the audio track alongside a video track. You may want to create a new CMTime from the duration (avAsset.duration) in
CMTimeRangeMake(kCMTimeZero, avAsset.duration)
used only for inserting the audio track. In the new CMTime, use a timescale of 44100 (or the sample rate of the audio track). The same goes for your videoComposition.frameDuration: depending on the nominalFrameRate of your video track, a timescale of 600 may not represent your time correctly.
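A minimal sketch of that rescaling, using CMTimeConvertScale (44100 is an assumption here; substitute the audio track's actual sample rate if you have it):

```objc
// Rescale the asset duration to the audio sample rate before inserting
// the audio track, so the audio time range is expressed in its native timescale.
CMTime audioDuration = CMTimeConvertScale(avAsset.duration, 44100, kCMTimeRoundingMethod_Default);

NSError *audioError = nil;
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero
                                 error:&audioError];
if (audioError) NSLog(@"Audio insertion error: %@", audioError);
```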
Finally, Apple provides a useful tool for debugging video compositions:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
It displays your composition visually, so you can see where things don't look the way they should.