I'm using Swift to display content from an AVPlayer in a view's AVPlayerLayer. The associated AVPlayerItem has a videoComposition, and a slightly simplified version of the code that creates it (without error checking, etc.) looks like this:
```swift
playerItem.videoComposition = AVVideoComposition(asset: someAsset, applyingCIFiltersWithHandler: {
    [unowned self] (request: AVAsynchronousCIImageFilteringRequest) in

    let paramDict = << set up parameter dictionary based on class vars >>

    // filter the image
    let filter = self.ciFilterWithParamDict(paramDict)
    filter.setValue(request.sourceImage, forKey: kCIInputImageKey)
    if let filteredImage = filter.outputImage {
        request.finishWithImage(filteredImage, context: nil)
    }
})
```
This all works as expected while the AVPlayer is playing or seeking. And if a new videoComposition is created and loaded, the AVPlayerLayer renders correctly.
However, when I change some of the values that are used to compute the filter parameters, I haven't found a way to "trigger" the AVPlayer / AVPlayerItem / AVVideoComposition to re-render. If I change the values and then play or seek, the frame renders correctly, but only if I play or seek. Is there a way to trigger a render "in place"?
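One approach worth trying before anything heavier is an in-place seek to the current time with zero tolerance, which asks AVFoundation to pull a fresh frame through the composition. This is only a sketch under assumptions: `refreshCurrentFrame` is a hypothetical helper (not from the question), it uses the current Swift AVFoundation API rather than the Swift 2-era API shown above, and on some OS versions the composition handler may still not be re-invoked while paused.

```swift
import AVFoundation

// Hypothetical helper: re-seek to the exact current time so a paused
// player pulls a fresh frame through its videoComposition.
func refreshCurrentFrame(of player: AVPlayer) {
    // Only do this while paused; playback already re-renders continuously.
    guard let item = player.currentItem, player.rate == 0 else { return }
    let now = item.currentTime()
    // Zero tolerance forces a frame-accurate seek to the same timestamp.
    item.seek(to: now, toleranceBefore: .zero, toleranceAfter: .zero,
              completionHandler: nil)
}
```

If this works on your target OS versions, it avoids tearing down the player item at all.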
Solution
I used a hack of replacing the avPlayerItem entirely to force a refresh. It would be much better, though, if there were a way to trigger the avPlayerItem to re-render directly.
```objc
// If the video is paused, force the player to re-render the frame.
if (self.avPlayer.currentItem.asset && self.avPlayer.rate == 0) {
    CMTime currentTime = self.avPlayerItem.currentTime;
    [self.avPlayer replaceCurrentItemWithPlayerItem:nil];
    [self.avPlayer replaceCurrentItemWithPlayerItem:self.avPlayerItem];
    [self.avPlayerItem seekToTime:currentTime
                  toleranceBefore:kCMTimeZero
                   toleranceAfter:kCMTimeZero];
}
```
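Since the question is in Swift, here is what the same hack might look like there. This is a sketch, not code from the answer: the helper name `forceRefresh` is hypothetical, and the player/item are passed in as parameters rather than read from `self` as in the Objective-C snippet.

```swift
import AVFoundation

// Hypothetical Swift translation of the replace-item hack:
// detach the item, re-attach it, then seek back to where we were.
func forceRefresh(player: AVPlayer, item: AVPlayerItem) {
    // Only force a refresh while paused and an asset is loaded.
    guard player.currentItem?.asset != nil, player.rate == 0 else { return }
    let currentTime = item.currentTime()
    player.replaceCurrentItem(with: nil)
    player.replaceCurrentItem(with: item)
    // Zero tolerance restores the exact frame we were showing.
    item.seek(to: currentTime, toleranceBefore: .zero, toleranceAfter: .zero,
              completionHandler: nil)
}
```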