opengl-es – Rendering ffmpeg YUV video in OpenGL using CVPixelBufferRef and shaders

I am rendering YUV frames from ffmpeg using the iOS 5.0 method CVOpenGLESTextureCacheCreateTextureFromImage.

I am using Apple's GLCameraRipple sample as a reference.

This is my result on the iPhone screen: iPhone Screen

I need to know what I am doing wrong.

Below are the parts of my code where the error might be.

ffmpeg frame configuration:

ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_YUV420P,
                                SWS_FAST_BILINEAR,
                                NULL, NULL, NULL);


// Frame buffer for the converted YUV420P data
ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_YUV420P,
                                                ctx->p_video_ctx->width,
                                                ctx->p_video_ctx->height));

avpicture_fill((AVPicture*)ctx->p_picture_rgb, ctx->p_frame_buffer,
               PIX_FMT_YUV420P,
               ctx->p_video_ctx->width, ctx->p_video_ctx->height);
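The sws_scale call that fills this buffer is not shown above; for reference, a minimal sketch of what it would look like with this setup, assuming ctx->p_frame is the decoded source AVFrame:

// Convert the decoded frame into the YUV420P buffer prepared above.
sws_scale(ctx->p_sws_ctx,
          (const uint8_t* const*)ctx->p_frame->data, // source planes (assumed field name)
          ctx->p_frame->linesize,                    // source strides
          0,                                         // first row of the slice
          ctx->p_video_ctx->height,                  // slice height: the whole frame
          ctx->p_picture_rgb->data,                  // destination planes set by avpicture_fill
          ctx->p_picture_rgb->linesize);             // destination strides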

My render method:

if (NULL == videoTextureCache) {
    NSLog(@"displayPixelBuffer error");
    return;
}    


CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, mTexW, mTexH,
                             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                             buffer, mFrameW * 3,
                             NULL, NULL, NULL, &pixelBuffer);



CVReturn err;
// Y-plane
glActiveTexture(GL_TEXTURE0);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RED_EXT,
                                                   mTexW, mTexH,
                                                   GL_RED_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &_lumaTexture);
if (err)
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}

glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture),CVOpenGLESTextureGetName(_lumaTexture));
glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);     

// UV-plane
glActiveTexture(GL_TEXTURE1);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RG_EXT,
                                                   mTexW / 2, mTexH / 2,
                                                   GL_RG_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   1,
                                                   &_chromaTexture);
if (err)
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}

glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture),CVOpenGLESTextureGetName(_chromaTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glBindFramebuffer(GL_FRAMEBUFFER,defaultFramebuffer);

// Set the view port to the entire view
glViewport(0, 0, backingWidth, backingHeight);

// Standard full-screen quad in triangle-strip order
static const GLfloat squareVertices[] = {
    -1.0f, -1.0f,
     1.0f, -1.0f,
    -1.0f,  1.0f,
     1.0f,  1.0f,
};

GLfloat textureVertices[] = {
    0.0f, 1.0f,
    1.0f, 1.0f,
    0.0f, 0.0f,
    1.0f, 0.0f,
};

// Draw the texture on the screen with OpenGL ES 2
[self renderWithSquareVertices:squareVertices textureVertices:textureVertices];


// Flush the CVOpenGLESTexture cache and release the texture
CVOpenGLESTextureCacheFlush(videoTextureCache,0);    
CVPixelBufferRelease(pixelBuffer);     

 [moviePlayerDelegate bufferDone];
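Note that the "release the texture" comment above is not matched by any CFRelease: the CVOpenGLESTextureRef objects returned by the cache are Core Foundation objects, and in Apple's GLCameraRipple sample they are released every frame before the cache is flushed. A minimal cleanup sketch, assuming the _lumaTexture and _chromaTexture ivars from above:

// Release the per-frame texture objects, then flush the cache;
// without the CFRelease calls the cache keeps old textures alive.
if (_lumaTexture) {
    CFRelease(_lumaTexture);
    _lumaTexture = NULL;
}
if (_chromaTexture) {
    CFRelease(_chromaTexture);
    _chromaTexture = NULL;
}
CVOpenGLESTextureCacheFlush(videoTextureCache, 0);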

The renderWithSquareVertices method:

- (void)renderWithSquareVertices:(const GLfloat*)squareVertices textureVertices:(const GLfloat*)textureVertices
{


    // Use shader program.
    glUseProgram(shader.program);

    // Update attribute values.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, GL_FALSE, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, GL_FALSE, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// Present
glBindRenderbuffer(GL_RENDERBUFFER,colorRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER];

}

My fragment shader:

uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;


varying highp vec2 _texcoord;

void main()
{

mediump vec3 yuv;
lowp vec3 rgb;

yuv.x = texture2D(SamplerY,_texcoord).r;
yuv.yz = texture2D(SamplerUV,_texcoord).rg - vec2(0.5,0.5);

// BT.601, which is the standard for SDTV, is provided as a reference
/*
rgb = mat3(      1,       1,       1,
                 0, -.34413,   1.772,
             1.402, -.71414,       0) * yuv;
*/

// Using BT.709, which is the standard for HDTV
rgb = mat3(      1,       1,       1,
                 0, -.18732,  1.8556,
           1.57481, -.46813,       0) * yuv;

   gl_FragColor = vec4(rgb,1);

}

Many thanks,

Solution

I think the problem is that YUV420 (or I420) is a tri-planar image format: I420 is an 8-bit Y plane followed by 8-bit 2x2 subsampled U and V planes. The code from GLCameraRipple instead expects NV12 format: an 8-bit Y plane followed by an interleaved U/V plane with 2x2 subsampling. Given that, I expect you will need three textures: luma_tex, u_chroma_tex, v_chroma_tex.
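One way to avoid the three-texture route is to keep the two-texture GLCameraRipple setup and have swscale emit NV12 directly. A minimal sketch, assuming the same ctx fields as in the question and that the ffmpeg build provides PIX_FMT_NV12:

// Convert to NV12 (Y plane + interleaved UV plane) so the buffer
// actually matches kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.
ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_NV12,
                                SWS_FAST_BILINEAR,
                                NULL, NULL, NULL);

Since NV12 still has two planes with separate base addresses and strides, wrapping the result with CVPixelBufferCreateWithPlanarBytes (which takes per-plane pointers and bytes-per-row values) is a better fit than the single-pointer CVPixelBufferCreateWithBytes used in the question.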

Also note that GLCameraRipple probably expects "video range" as well. In other words, for the planar format the values are luma = [16, 235] and chroma = [16, 240].
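To illustrate both points, here is a hedged sketch of a fragment shader that samples three separate I420 plane textures and expands from video range; SamplerU and SamplerV are assumed uniform names that would replace the question's SamplerUV:

uniform sampler2D SamplerY;
uniform sampler2D SamplerU;  // assumed: texture holding the U plane
uniform sampler2D SamplerV;  // assumed: texture holding the V plane

varying highp vec2 _texcoord;

void main()
{
    mediump vec3 yuv;
    lowp vec3 rgb;

    // Expand from video range: luma is [16,235] (219 steps),
    // chroma is [16,240] (224 steps) centered on 128.
    yuv.x = (texture2D(SamplerY, _texcoord).r - 16.0 / 255.0) * (255.0 / 219.0);
    yuv.y = (texture2D(SamplerU, _texcoord).r - 128.0 / 255.0) * (255.0 / 224.0);
    yuv.z = (texture2D(SamplerV, _texcoord).r - 128.0 / 255.0) * (255.0 / 224.0);

    // Same BT.709 matrix as in the question.
    rgb = mat3(      1,       1,       1,
                     0, -.18732,  1.8556,
               1.57481, -.46813,       0) * yuv;

    gl_FragColor = vec4(rgb, 1);
}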
