ios – Captured photo is stretched when using AVCaptureSession sessionPreset = AVCaptureSessionPresetPhoto

Important note: if I use session.sessionPreset = AVCaptureSessionPresetHigh; my preview image is NOT stretched! And if I save the photo to the device with UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); the image is normal; it is only stretched in the preview.
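For reference, the full form of that save call also takes a completion target and selector so the app can be told when the write finishes. A minimal sketch (the callback method below is just an illustration, not part of the original code):

UIImageWriteToSavedPhotosAlbum(image, self,
                               @selector(image:didFinishSavingWithError:contextInfo:),
                               NULL);

// Completion callback with the documented selector signature.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"ERROR: saving the photo failed: %@", error);
    }
}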

I am capturing photos with AVFoundation.

session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

CALayer *viewLayer = vImagePreview.layer;
NSLog(@"viewLayer = %@",viewLayer);

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

captureVideoPreviewLayer.frame = vImagePreview.bounds;
[vImagePreview.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@",error);
}
[session addInput:input];

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG,AVVideoCodecKey,nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
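Not shown in the snippet above, but for the live preview to actually appear the session also has to be started; a minimal sketch, assuming the same session and captureVideoPreviewLayer from above:

// Keep the preview's aspect ratio and start the capture session.
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[session startRunning];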

Then I set the sessionPreset to AVCaptureSessionPresetPhoto:

session.sessionPreset = AVCaptureSessionPresetPhoto;

My capture method:

-(void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@",stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer,NSError *error)
     {
         CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer,kCGImagePropertyExifDictionary,NULL);
         if (exifAttachments)
         {
             // Do something with the attachments.
             NSLog(@"attachements: %@",exifAttachments);
         } else {
             NSLog(@"no attachments");
         }

         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         UIImage *image = [[UIImage alloc] initWithData:imageData];
         NSLog(@"%@",NSStringFromCGSize(image.size));
         [self animateUpTheImageWithImage:image];


     }];

}
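As a side note, the nested loop above can usually be replaced by AVCaptureOutput's connectionWithMediaType:, assuming the output has a single video connection:

// Equivalent lookup of the video connection in one call.
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];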

This is where I add the preview of the captured photo:

- (void) animateUpTheImageWithImage:(UIImage*)theImage{

    UIView* preview = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height/*426*/)];
    CALayer *previewLayer = preview.layer;
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = previewLayer.frame;
    [previewLayer addSublayer:captureVideoPreviewLayer];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    [self addSubview:preview];

}

The result is that the captured image is stretched!

Solution

So I solved my problem. This is the code I am using now, and it works fine:
session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = vImagePreview.bounds;
    [vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];

capturedImageView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.screenWidth, self.screenHeight)];
[self addSubview:capturedImageView];
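The snippet above omits adding the camera input and starting the session; presumably that part stays the same as in the question, along the lines of:

// Assumption: the input/start code from the question is reused unchanged.
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (input) {
    [session addInput:input];
} else {
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session startRunning];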

The important part, the output imageView:

vImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.screenWidth, self.screenHeight)];
    [capturedImageView addSubview:vImage];
    vImage.autoresizingMask = (UIViewAutoresizingFlexibleBottomMargin|UIViewAutoresizingFlexibleHeight|UIViewAutoresizingFlexibleLeftMargin|UIViewAutoresizingFlexibleRightMargin|UIViewAutoresizingFlexibleTopMargin|UIViewAutoresizingFlexibleWidth);
    vImage.contentMode = UIViewContentModeScaleAspectFill;
    vImage.image = theImage;
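One detail worth noting (an assumption, not in the original answer): with UIViewContentModeScaleAspectFill the image can extend beyond the image view's bounds, so clipping is usually enabled as well:

// Prevent the aspect-filled image from drawing outside the image view.
vImage.clipsToBounds = YES;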

Some additional information:

The camera layer must be full screen (you can change its y coordinate, but the width and height must be the full screen size), and the output image view must be full screen as well.
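A minimal sketch of that constraint, assuming only the y origin is adjusted:

// Full screen width and height; only the y origin may change.
CGRect screenBounds = [UIScreen mainScreen].bounds;
CGFloat yOffset = 0; // adjust as needed
captureVideoPreviewLayer.frame = CGRectMake(0, yOffset, screenBounds.size.width, screenBounds.size.height);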

I hope this information is useful to someone else as well.
