objective-c – Speeding up saving images – iOS

I am working on another mini project that will later be incorporated into a new project. It is basically a test unit.

What I am doing is creating an AVCaptureSession and then implementing the sample buffer output delegate method. In that method I convert the sampleBuffer into a UIImage and save the UIImage. When I run the application on an iPhone 4 it only saves 2-3 images per second. There must be a more efficient way to save the images.

Can anyone help me speed this up?

Thanks!
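For context, the capture setup itself is not in the original post. A minimal sketch of the kind of session this delegate assumes might look like the following (the session preset and queue name are assumptions, not from the question):

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

// Camera input
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

// Frame output; the delegate method below is called once per captured frame
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("camera.frames", NULL)];
[session addOutput:output];

[session startRunning];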

Lots of the code below is from here:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{   
    UIImage *resultUIImage = [self imageFromSampleBuffer:sampleBuffer];

    // PNG encoding is expensive; this is the main cost in this method
    NSData *imageData = UIImagePNGRepresentation(resultUIImage);

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *path = [paths objectAtIndex:0];

    CMTime frameTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    NSString *filename = [NSString stringWithFormat:@"%f.png", CMTimeGetSeconds(frameTime)];

    // stringByAppendingPathComponent: inserts the missing path separator
    NSString *finalPath = [path stringByAppendingPathComponent:filename];

    [imageData writeToFile:finalPath atomically:YES];
}

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer 
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
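A note on this helper (my observation, not from the original post): the bitmap context is little-endian 32-bit with premultiplied alpha first, which is the layout of kCVPixelFormatType_32BGRA frames. For the conversion to be correct, the AVCaptureVideoDataOutput should be asked for BGRA explicitly, for example:

// Request BGRA frames so they match the bitmap context created above
// (`output` refers to the AVCaptureVideoDataOutput from the setup sketch earlier)
output.videoSettings = [NSDictionary
    dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];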

Solution

Using this code I was able to get the time it takes to save an image down to 0.1 seconds. (The key change is encoding with UIImageJPEGRepresentation instead of UIImagePNGRepresentation; JPEG encoding is far cheaper than PNG encoding.)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection 
{  

    double frameTime = CFAbsoluteTimeGetCurrent();
    UIImage *resultUIImage = [self imageFromSampleBuffer:sampleBuffer];

    // Time the encoding step, which was the bottleneck with PNG
    double pre = CFAbsoluteTimeGetCurrent();
    NSData *imageData = UIImageJPEGRepresentation(resultUIImage, 0.9);
    NSLog(@"JPEG encoding took: %f", CFAbsoluteTimeGetCurrent() - pre);

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *path = [paths objectAtIndex:0];
    // The data is JPEG now, so use a .jpg extension
    NSString *filename = [NSString stringWithFormat:@"%f.jpg", frameTime];
    // stringByAppendingPathComponent: inserts the missing path separator
    NSString *finalPath = [path stringByAppendingPathComponent:filename];
    [imageData writeToFile:finalPath atomically:YES];
}
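Beyond switching from PNG to JPEG, another direction (my suggestion, not part of the accepted answer) is to keep the capture callback itself light and push the encoding and file write onto a background GCD queue, so the delegate returns before the next frame arrives. A sketch under that assumption:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    double frameTime = CFAbsoluteTimeGetCurrent();
    // Build the UIImage synchronously, while the sample buffer is still valid
    UIImage *resultUIImage = [self imageFromSampleBuffer:sampleBuffer];

    // Encode and write on a background queue so the capture queue is not blocked
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *imageData = UIImageJPEGRepresentation(resultUIImage, 0.9);
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *filename = [NSString stringWithFormat:@"%f.jpg", frameTime];
        NSString *finalPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:filename];
        [imageData writeToFile:finalPath atomically:YES];
    });
}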
