
iOS Tech Share | Adding Beauty Filters to iOS WebRTC


When working with WebRTC, there are generally two ways to apply beauty filtering to video: replace the capture module in WebRTC, or process the captured video data directly.

1. Replacing the Capture Module in WebRTC

Replacing the capture module in WebRTC is relatively straightforward: use GPUImageVideoCamera in place of WebRTC's video capture, take the image after GPUImage has applied the beauty filter, and send it to WebRTC's OnFrame method.


For reference, see the cross-platform push/pull streaming SDK built on the WebRTC framework: GitHub

Enabling the beauty filter

- (void)setBeautyFace:(BOOL)beautyFace {
    if (_beautyFace == beautyFace) return;
    _beautyFace = beautyFace;

    // Tear down the existing filter chain before rebuilding it.
    [_emptyFilter removeAllTargets];
    [_filter removeAllTargets];
    [_videoCamera removeAllTargets];

    if (_beautyFace) {
        _filter = [[GPUImageBeautifyFilter alloc] init];
        _emptyFilter = [[GPUImageEmptyFilter alloc] init];
    } else {
        _filter = [[GPUImageEmptyFilter alloc] init];
    }

    // Grab each processed frame so it can be converted and fed to WebRTC.
    __weak typeof(self) _self = self;
    [_filter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [_self processVideo:output];
    }];

    [_videoCamera addTarget:_filter];
    if (beautyFace) {
        [_filter addTarget:_emptyFilter];
        if (_gpuImageView) [_emptyFilter addTarget:_gpuImageView];
    } else {
        if (_gpuImageView) [_filter addTarget:_gpuImageView];
    }
}

Format conversion

The pixel format GPUImage produces is BGRA, so once processing finishes, the frame must be converted to I420 for internal processing and rendering.


WebRTC encodes NV12 pixels, so a second format conversion takes place at encode time.


- (void)processVideo:(GPUImageOutput *)output {
    rtc::CritScope cs(&cs_capture_);
    if (!_isRunning) {
        return;
    }
    @autoreleasepool {
        GPUImageFramebuffer *imageFramebuffer = output.framebufferForOutput;
        size_t width = imageFramebuffer.size.width;
        size_t height = imageFramebuffer.size.height;
        uint32_t size = width * height * 3 / 2;  // I420: Y + U/4 + V/4

        // Reallocate the destination buffer when the frame size changes.
        if (self.nWidth != width || self.nHeight != height) {
            self.nWidth = width;
            self.nHeight = height;
            if (_dst) delete[] _dst;
            _dst = NULL;
        }
        if (_dst == NULL) {
            _dst = new uint8_t[size];
        }

        uint8_t *y_pointer = (uint8_t *)_dst;
        uint8_t *u_pointer = y_pointer + width * height;
        uint8_t *v_pointer = u_pointer + width * height / 4;
        int y_pitch = width;
        int u_pitch = (width + 1) >> 1;
        int v_pitch = (width + 1) >> 1;

        // Note: libyuv's "ARGB" means B,G,R,A byte order in memory,
        // which matches the BGRA output of GPUImage.
        libyuv::ARGBToI420([imageFramebuffer byteBuffer], width * 4,
                           y_pointer, y_pitch,
                           u_pointer, u_pitch,
                           v_pointer, v_pitch,
                           width, height);

        // Fill the frame with near-black (Y=32, U=V=128) when bVideoEnable is set.
        if (self.bVideoEnable)
            libyuv::I420Rect(y_pointer, y_pitch, u_pointer, u_pitch,
                             v_pointer, v_pitch,
                             0, 0, width, height, 32, 128, 128);

        if (_capturer != nil)
            _capturer->CaptureYUVData(_dst, width, height, size);
    }
}
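For completeness, the encode-time I420 → NV12 conversion mentioned above can also be expressed with libyuv. A minimal sketch, assuming dst_y and dst_uv are preallocated buffers of width*height and width*height/2 bytes and that width is even:

// I420 (three planes) -> NV12 (Y plane + interleaved UV plane).
libyuv::I420ToNV12(y_pointer, y_pitch,
                   u_pointer, u_pitch,
                   v_pointer, v_pitch,
                   dst_y, width,
                   dst_uv, width,
                   width, height);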

Sending beautified data to WebRTC's OnFrame method

GPUImageVideoCapturer is a camera class wrapped around GPUImage that mirrors the capture class in WebRTC. By inheriting from cricket::VideoCapturer, it can inject captured video frames into WebRTC.


namespace webrtc {
// Inherit from cricket::VideoCapturer to feed frames into WebRTC.
class GPUImageVideoCapturer : public cricket::VideoCapturer {
    ...
};
}  // namespace webrtc


// Receives the beautified frame (the raw I420 buffer from processVideo is
// wrapped in a webrtc::VideoFrame before reaching this point) and forwards
// it into the WebRTC pipeline.
void GPUImageVideoCapturer::CaptureYUVData(const webrtc::VideoFrame& frame,
                                           int width, int height) {
    VideoCapturer::OnFrame(frame, width, height);
}
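Note that processVideo hands CaptureYUVData a raw I420 buffer, while the overload above takes a webrtc::VideoFrame, so the buffer has to be wrapped in a frame along the way. A minimal sketch of that wrapping step, assuming a WebRTC revision that provides webrtc::I420Buffer::Copy and rtc::TimeMicros() (names vary across versions):

void GPUImageVideoCapturer::CaptureYUVData(uint8_t* data, int width,
                                           int height, uint32_t size) {
    // The contiguous I420 buffer holds the Y plane followed by U and V planes.
    const uint8_t* y = data;
    const uint8_t* u = y + width * height;
    const uint8_t* v = u + width * height / 4;
    rtc::scoped_refptr<webrtc::I420Buffer> buffer =
        webrtc::I420Buffer::Copy(width, height,
                                 y, width,
                                 u, (width + 1) / 2,
                                 v, (width + 1) / 2);
    webrtc::VideoFrame frame(buffer, webrtc::kVideoRotation_0,
                             rtc::TimeMicros());
    CaptureYUVData(frame, width, height);
}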

2. Processing the Video Data Directly

Processing the video data directly is how traditional third-party beauty SDKs work: operate on the internally captured frames. The pipeline is: captured data (CVPixelBufferRef) → convert to a texture (GLuint) → apply the beauty filter to the texture → convert the beautified texture back to iOS capture data (CVPixelBufferRef) → hand it back to WebRTC for rendering, encoding, and transmission.

Synchronous processing

Internal processing generally runs on a synchronous queue, which guarantees that data flows through the pipeline linearly. See this pattern from GPUImage:


runSynchronouslyOnVideoProcessingQueue(^{
    // beauty processing
});
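Putting the steps together, one possible per-frame driver run on that synchronous queue might look like the sketch below. convertRGBPixelBufferToTexture: and convertTextureToPixelBuffer:textureSize: are the helpers detailed in the following sections, while runFilterChainOnTexture:size: is a hypothetical wrapper around the GPUImageTextureInput/Output chain shown later:

// Hypothetical per-frame glue: beautify one captured CVPixelBufferRef.
- (CVPixelBufferRef)beautifyPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    __block CVPixelBufferRef result = NULL;
    runSynchronouslyOnVideoProcessingQueue(^{
        CGSize size = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                 CVPixelBufferGetHeight(pixelBuffer));
        // CVPixelBufferRef -> GLuint texture
        GLuint texture = [self convertRGBPixelBufferToTexture:pixelBuffer];
        // texture -> beautified texture (GPUImage chain, see below)
        GLuint filtered = [self runFilterChainOnTexture:texture size:size];
        // beautified texture -> CVPixelBufferRef
        result = [self convertTextureToPixelBuffer:filtered textureSize:size];
    });
    return result;
}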

Converting CVPixelBufferRef data to a texture (GLuint)

Converting RGB pixel formats
  • The CoreVideo approach: this creates a CVOpenGLESTextureRef texture, whose texture id is obtained via CVOpenGLESTextureGetName(texture).


- (GLuint)convertRGBPixelBufferToTexture:(CVPixelBufferRef)pixelBuffer {
    if (!pixelBuffer) {
        return 0;
    }
    CGSize textureSize = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                    CVPixelBufferGetHeight(pixelBuffer));
    CVOpenGLESTextureRef texture = nil;
    CVReturn status = CVOpenGLESTextureCacheCreateTextureFromImage(nil,
                                                                   [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache],
                                                                   pixelBuffer,
                                                                   nil,
                                                                   GL_TEXTURE_2D,
                                                                   GL_RGBA,
                                                                   textureSize.width,
                                                                   textureSize.height,
                                                                   GL_BGRA,
                                                                   GL_UNSIGNED_BYTE,
                                                                   0,
                                                                   &texture);
    if (status != kCVReturnSuccess) {
        NSLog(@"Can't create texture");
    }
    self.renderTexture = texture;
    return CVOpenGLESTextureGetName(texture);
}


  • The OpenGL approach: create a texture object and upload the image data from the CVPixelBufferRef into it with glTexImage2D.


glBindTexture(GL_TEXTURE_2D, [outputFramebuffer texture]);
glTexImage2D(GL_TEXTURE_2D, 0,
             _pixelFormat == GPUPixelFormatRGB ? GL_RGB : GL_RGBA,
             (int)uploadedImageSize.width, (int)uploadedImageSize.height, 0,
             (GLint)_pixelFormat, (GLenum)_pixelType, bytesToUpload);
Converting YUV pixel formats
- (GLuint)convertYUVPixelBufferToTexture:(CVPixelBufferRef)pixelBuffer {
    if (!pixelBuffer) {
        return 0;
    }
    CGSize textureSize = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                    CVPixelBufferGetHeight(pixelBuffer));
    [EAGLContext setCurrentContext:self.context];

    GLuint frameBuffer;
    GLuint textureID;

    // FBO
    glGenFramebuffers(1, &frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);

    // Target texture that will receive the converted RGB result.
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, textureSize.width, textureSize.height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureID, 0);
    glViewport(0, 0, textureSize.width, textureSize.height);

    // program
    glUseProgram(self.yuvConversionProgram);

    // Source textures: luminance (Y) from plane 0, chrominance (UV) from plane 1.
    CVOpenGLESTextureRef luminanceTextureRef = nil;
    CVOpenGLESTextureRef chrominanceTextureRef = nil;
    CVReturn status = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                   self.textureCache,
                                                                   pixelBuffer,
                                                                   nil,
                                                                   GL_TEXTURE_2D,
                                                                   GL_LUMINANCE,
                                                                   textureSize.width,
                                                                   textureSize.height,
                                                                   GL_LUMINANCE,
                                                                   GL_UNSIGNED_BYTE,
                                                                   0,
                                                                   &luminanceTextureRef);
    if (status != kCVReturnSuccess) {
        NSLog(@"Can't create luminanceTexture");
    }
    status = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                          self.textureCache,
                                                          pixelBuffer,
                                                          nil,
                                                          GL_TEXTURE_2D,
                                                          GL_LUMINANCE_ALPHA,
                                                          textureSize.width / 2,
                                                          textureSize.height / 2,
                                                          GL_LUMINANCE_ALPHA,
                                                          GL_UNSIGNED_BYTE,
                                                          1,
                                                          &chrominanceTextureRef);
    if (status != kCVReturnSuccess) {
        NSLog(@"Can't create chrominanceTexture");
    }

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(luminanceTextureRef));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(self.yuvConversionProgram, "luminanceTexture"), 0);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(chrominanceTextureRef));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(self.yuvConversionProgram, "chrominanceTexture"), 1);

    // BT.601 full-range YUV -> RGB conversion matrix (column-major).
    GLfloat kXDXPreViewColorConversion601FullRange[] = {
        1.0,  1.0,    1.0,
        0.0, -0.343,  1.765,
        1.4, -0.711,  0.0,
    };
    GLuint yuvConversionMatrixUniform = glGetUniformLocation(self.yuvConversionProgram, "colorConversionMatrix");
    glUniformMatrix3fv(yuvConversionMatrixUniform, 1, GL_FALSE, kXDXPreViewColorConversion601FullRange);

    // VBO
    glBindBuffer(GL_ARRAY_BUFFER, self.VBO);
    GLuint positionSlot = glGetAttribLocation(self.yuvConversionProgram, "position");
    glEnableVertexAttribArray(positionSlot);
    glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void *)0);
    GLuint textureSlot = glGetAttribLocation(self.yuvConversionProgram, "inputTextureCoordinate");
    glEnableVertexAttribArray(textureSlot);
    glVertexAttribPointer(textureSlot, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void *)(3 * sizeof(float)));

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    glDeleteFramebuffers(1, &frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glFlush();

    self.luminanceTexture = luminanceTextureRef;
    self.chrominanceTexture = chrominanceTextureRef;
    if (luminanceTextureRef) {
        CFRelease(luminanceTextureRef);
    }
    if (chrominanceTextureRef) {
        CFRelease(chrominanceTextureRef);
    }
    return textureID;
}

Loading the filter with GPUImageTextureInput and outputting data with GPUImageTextureOutput

[GPUImageContext setActiveShaderProgram:nil];
GPUImageTextureInput *textureInput = [[ARGPUImageTextureInput alloc] initWithTexture:textureID size:size];
GPUImageSmoothToonFilter *filter = [[GPUImageSmoothToonFilter alloc] init];
[textureInput addTarget:filter];
GPUImageTextureOutput *textureOutput = [[GPUImageTextureOutput alloc] init];
[filter addTarget:textureOutput];
[textureInput processTextureWithFrameTime:kCMTimeZero];


From textureOutput we obtain the output texture.
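One way to consume the result: GPUImageTextureOutput exposes the rendered texture via its texture property and expects doneWithTexture once the caller is finished with it.

GLuint outputTexture = textureOutput.texture;
// ... convert outputTexture back to a CVPixelBufferRef (next section) ...
[textureOutput doneWithTexture];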

Converting the GPUImageTextureOutput texture back to CVPixelBufferRef data

- (CVPixelBufferRef)convertTextureToPixelBuffer:(GLuint)texture
                                    textureSize:(CGSize)textureSize {
    [EAGLContext setCurrentContext:self.context];

    CVPixelBufferRef pixelBuffer = [self createPixelBufferWithSize:textureSize];
    GLuint targetTextureID = [self convertRGBPixelBufferToTexture:pixelBuffer];

    GLuint frameBuffer;

    // FBO
    glGenFramebuffers(1, &frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);

    // Render target backed by the newly created pixel buffer.
    glBindTexture(GL_TEXTURE_2D, targetTextureID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, textureSize.width, textureSize.height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, targetTextureID, 0);
    glViewport(0, 0, textureSize.width, textureSize.height);

    // program
    glUseProgram(self.normalProgram);

    // Source texture: the filtered result to copy into the pixel buffer.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(self.normalProgram, "renderTexture"), 0);

    // VBO
    glBindBuffer(GL_ARRAY_BUFFER, self.VBO);
    GLuint positionSlot = glGetAttribLocation(self.normalProgram, "position");
    glEnableVertexAttribArray(positionSlot);
    glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void *)0);
    GLuint textureSlot = glGetAttribLocation(self.normalProgram, "inputTextureCoordinate");
    glEnableVertexAttribArray(textureSlot);
    glVertexAttribPointer(textureSlot, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void *)(3 * sizeof(float)));

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    glDeleteFramebuffers(1, &frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glFlush();

    return pixelBuffer;
}


The beautified CVPixelBufferRef is returned synchronously to the SDK for rendering and transmission.
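If the SDK exposes WebRTC's Objective-C capture interface, handing the beautified buffer back might look like the sketch below, assuming the stock RTCCVPixelBuffer / RTCVideoFrame classes and a capturer delegate (self.capturer, self.capturerDelegate, and timestamp are assumptions; exact names vary by SDK):

// Wrap the beautified CVPixelBufferRef and push it back into WebRTC.
RTCCVPixelBuffer *rtcBuffer =
    [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
int64_t timeStampNs = CMTimeGetSeconds(timestamp) * NSEC_PER_SEC;
RTCVideoFrame *frame =
    [[RTCVideoFrame alloc] initWithBuffer:rtcBuffer
                                 rotation:RTCVideoRotation_0
                              timeStampNs:timeStampNs];
[self.capturerDelegate capturer:self.capturer didCaptureVideoFrame:frame];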

3. Summary

Beauty filtering has become a standard feature of real-time video applications. Besides the two approaches above, you can also integrate a third-party beauty SDK: most RTC vendors offer custom (external) capture, and third-party beauty SDKs offer a camera with capture and beauty built in, so the two combine seamlessly. If your app's requirements are modest, the effects built into the RTC SDK (skin whitening, smoothing, rosiness) are enough; for entertainment scenarios that also need face reshaping (face slimming, eye enlarging) and stickers (2D, 3D), integrating a third-party beauty SDK is a must.
