Displaying YUV Data on Android with OpenGL ES 2.0

Why use OpenGL ES to display YUV images on Android? Because:
1. Android itself cannot display YUV images directly, so a YUV-to-RGB conversion is unavoidable;
2. converting YUV to RGB by hand eats a lot of CPU; play video that way and the phone runs hot, so we want the GPU to do the work instead;
3. OpenGL ES is a third-party library that Android integrates into its own framework, and it has a lot going for it.
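To put point 2 in numbers: an I420 frame stores one full-size Y plane plus two quarter-size chroma planes, i.e. 1.5 bytes per pixel, and every single pixel needs the conversion formula applied. A rough count (the 1080p @ 30 fps figures are my illustration, not from the original article):

```java
// Rough cost of CPU-side YUV->RGB: bytes per I420 frame and pixel
// conversions per second. The 1080p/30fps numbers are illustrative.
public class YuvCost {
    // An I420 frame stores w*h bytes of Y plus (w/2)*(h/2) bytes each of U and V.
    static long i420FrameBytes(int w, int h) {
        return (long) w * h * 3 / 2;
    }

    // Pixels the CPU would have to convert each second at the given frame rate.
    static long pixelsPerSecond(int w, int h, int fps) {
        return (long) w * h * fps;
    }

    public static void main(String[] args) {
        System.out.println(i420FrameBytes(1920, 1080));      // bytes per frame
        System.out.println(pixelsPerSecond(1920, 1080, 30)); // conversions per second
    }
}
```

At 1080p/30fps that is over 60 million per-pixel conversions a second, which is exactly the kind of embarrassingly parallel work a GPU fragment shader is built for.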
 
My C/C++ is not very good, so the whole thing is implemented at the Java layer. I mainly referred to (but was not limited to) the following articles; many thanks to their authors for sharing:
1. http://blog.csdn.NET/xiaoguaihai/article/details/8672631
2. http://chenshun87.blog.163.com/blog/static/18859389201232011727615/
3. http://blog.csdn.net/ypist/article/details/8950903
4. http://blog.csdn.net/wanglang3081/article/details/8480281
5. http://blog.csdn.net/xdljf/article/details/7178620
 
I. First, an overview of how this solution works, to set the scene:
1. Where is the image shown? → a GLSurfaceView
2. What puts the data onto the GLSurfaceView? → a Renderer
3. What converts the YUV data to RGB? → the GL Program/Shader
In one sentence: the GL Program/Shader converts the caller's YUV data to RGB, and the Renderer draws the result onto the GLSurfaceView.
 
II. How do you check whether your phone supports GLES 2.0? The following snippet is enough:

public static boolean detectOpenGLES20(Context context) {
    ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
    ConfigurationInfo info = am.getDeviceConfigurationInfo();
    return (info.reqGlEsVersion >= 0x20000);
}

Just about every phone supports GLES 2.0, so there is little to worry about here.

 
 
III. Let's get to work.

A. First you need a GLSurfaceView; just put it into your layout.
Then look it up, give it some basic configuration, and set a Renderer on it.
The Renderer's job is to draw the image onto the GLSurfaceView.
 
mGLSurface = (GLFrameSurface) findViewById(R.id.glsurface);
mGLSurface.setEGLContextClientVersion(2);
mGLFRenderer = new GLFrameRenderer(this, mGLSurface);
mGLSurface.setRenderer(mGLFRenderer);

 
B. Next, how GLFrameRenderer is written:
 
 
public class GLFrameRenderer implements Renderer {

    private ISimplePlayer mParentAct; // ignore this
    private GLSurfaceView mTargetSurface;
    private GLProgram prog = new GLProgram(0);
    private int mVideoWidth = -1, mVideoHeight = -1;
    private ByteBuffer y;
    private ByteBuffer u;
    private ByteBuffer v;

    public GLFrameRenderer(ISimplePlayer callback, GLSurfaceView surface) {
        mParentAct = callback; // ignore this
        mTargetSurface = surface;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Utils.LOGD("GLFrameRenderer :: onSurfaceCreated");
        if (!prog.isProgramBuilt()) {
            prog.buildProgram();
            Utils.LOGD("GLFrameRenderer :: buildProgram done");
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Utils.LOGD("GLFrameRenderer :: onSurfaceChanged");
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (this) {
            if (y != null) {
                // reset positions; this has to be done before uploading
                y.position(0);
                u.position(0);
                v.position(0);
                prog.buildTextures(y, u, v, mVideoWidth, mVideoHeight);
                GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
                GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
                prog.drawFrame();
            }
        }
    }

    /**
     * This method will be called from native code; it happens when the video
     * is about to play or the video size changes.
     */
    public void update(int w, int h) {
        Utils.LOGD("INIT E");
        if (w > 0 && h > 0) {
            // "||", so buffers are reallocated when either dimension changes
            if (w != mVideoWidth || h != mVideoHeight) {
                this.mVideoWidth = w;
                this.mVideoHeight = h;
                int yarraySize = w * h;
                int uvarraySize = yarraySize / 4;
                synchronized (this) {
                    y = ByteBuffer.allocate(yarraySize);
                    u = ByteBuffer.allocate(uvarraySize);
                    v = ByteBuffer.allocate(uvarraySize);
                }
            }
        }

        mParentAct.onPlayStart(); // ignore this
        Utils.LOGD("INIT X");
    }

    /**
     * This method will be called from native code; it's used for passing YUV data in.
     */
    public void update(byte[] ydata, byte[] udata, byte[] vdata) {
        synchronized (this) {
            y.clear();
            u.clear();
            v.clear();
            y.put(ydata, 0, ydata.length);
            u.put(udata, 0, udata.length);
            v.put(vdata, 0, vdata.length);
        }

        // request to render
        mTargetSurface.requestRender();
    }
}

The code is straightforward; the Renderer handles just a few things:

1. When the surface is created, build the Program/Shader objects, since they are needed right away;
2. when the surface changes, reset the viewport;
3. in onDrawFrame(), actually "draw" the data onto the surface;
4. the two update() methods receive the frame dimensions and the YUV data, respectively.
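For reference, here is how a packed I420 buffer (the layout a decoder typically hands over: the Y plane, then U, then V) maps onto the three arrays the second update() expects. The helper class and its name are my illustration, not part of the article's code; the plane sizes match the yarraySize/uvarraySize math in update(int, int):

```java
import java.util.Arrays;

// Splits one packed I420 frame (Y plane, then U, then V) into the three
// arrays that update(byte[], byte[], byte[]) expects. The class name is
// illustrative, not from the original code.
public class I420Splitter {
    public static byte[][] split(byte[] frame, int w, int h) {
        int ySize = w * h;      // same as yarraySize in update(int, int)
        int uvSize = ySize / 4; // same as uvarraySize
        if (frame.length < ySize + 2 * uvSize) {
            throw new IllegalArgumentException("frame too small for " + w + "x" + h);
        }
        byte[] y = Arrays.copyOfRange(frame, 0, ySize);
        byte[] u = Arrays.copyOfRange(frame, ySize, ySize + uvSize);
        byte[] v = Arrays.copyOfRange(frame, ySize + uvSize, ySize + 2 * uvSize);
        return new byte[][] { y, u, v };
    }
}
```

A decoder callback would then do something like: `byte[][] p = I420Splitter.split(frame, w, h); renderer.update(p[0], p[1], p[2]);`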
 
C. Now look at GLProgram. It supplies the Renderer with its computation; all of the processing of the data happens here.
 
public boolean isProgramBuilt() {
    return isProgBuilt;
}

public void buildProgram() {
    createBuffers(_vertices, coordVertices);
    if (_program <= 0) {
        _program = createProgram(VERTEX_SHADER, FRAGMENT_SHADER);
    }
    Utils.LOGD("_program = " + _program);

    /*
     * get handles for "vPosition" and "a_texCoord"
     */
    _positionHandle = GLES20.glGetAttribLocation(_program, "vPosition");
    Utils.LOGD("_positionHandle = " + _positionHandle);
    checkGlError("glGetAttribLocation vPosition");
    if (_positionHandle == -1) {
        throw new RuntimeException("Could not get attribute location for vPosition");
    }
    _coordHandle = GLES20.glGetAttribLocation(_program, "a_texCoord");
    Utils.LOGD("_coordHandle = " + _coordHandle);
    checkGlError("glGetAttribLocation a_texCoord");
    if (_coordHandle == -1) {
        throw new RuntimeException("Could not get attribute location for a_texCoord");
    }

    /*
     * get uniform locations for y/u/v; we pass data through these uniforms
     */
    _yhandle = GLES20.glGetUniformLocation(_program, "tex_y");
    Utils.LOGD("_yhandle = " + _yhandle);
    checkGlError("glGetUniformLocation tex_y");
    if (_yhandle == -1) {
        throw new RuntimeException("Could not get uniform location for tex_y");
    }
    _uhandle = GLES20.glGetUniformLocation(_program, "tex_u");
    Utils.LOGD("_uhandle = " + _uhandle);
    checkGlError("glGetUniformLocation tex_u");
    if (_uhandle == -1) {
        throw new RuntimeException("Could not get uniform location for tex_u");
    }
    _vhandle = GLES20.glGetUniformLocation(_program, "tex_v");
    Utils.LOGD("_vhandle = " + _vhandle);
    checkGlError("glGetUniformLocation tex_v");
    if (_vhandle == -1) {
        throw new RuntimeException("Could not get uniform location for tex_v");
    }

    isProgBuilt = true;
}

/**
 * build a set of textures, one for Y, one for U, and one for V.
 */
public void buildTextures(Buffer y, Buffer u, Buffer v, int width, int height) {
    boolean videoSizeChanged = (width != _video_width || height != _video_height);
    if (videoSizeChanged) {
        _video_width = width;
        _video_height = height;
        Utils.LOGD("buildTextures videoSizeChanged: w=" + _video_width + " h=" + _video_height);
    }

    // building texture for Y data
    if (_ytid < 0 || videoSizeChanged) {
        if (_ytid >= 0) {
            Utils.LOGD("glDeleteTextures Y");
            GLES20.glDeleteTextures(1, new int[] { _ytid }, 0);
            checkGlError("glDeleteTextures");
        }
        // GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        checkGlError("glGenTextures");
        _ytid = textures[0];
        Utils.LOGD("glGenTextures Y = " + _ytid);
    }
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _ytid);
    checkGlError("glBindTexture");
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, _video_width, _video_height, 0,
            GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, y);
    checkGlError("glTexImage2D");
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

    // building texture for U data
    if (_utid < 0 || videoSizeChanged) {
        if (_utid >= 0) {
            Utils.LOGD("glDeleteTextures U");
            GLES20.glDeleteTextures(1, new int[] { _utid }, 0);
            checkGlError("glDeleteTextures");
        }
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        checkGlError("glGenTextures");
        _utid = textures[0];
        Utils.LOGD("glGenTextures U = " + _utid);
    }
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _utid);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, _video_width / 2, _video_height / 2, 0,
            GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, u);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

    // building texture for V data
    if (_vtid < 0 || videoSizeChanged) {
        if (_vtid >= 0) {
            Utils.LOGD("glDeleteTextures V");
            GLES20.glDeleteTextures(1, new int[] { _vtid }, 0);
            checkGlError("glDeleteTextures");
        }
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        checkGlError("glGenTextures");
        _vtid = textures[0];
        Utils.LOGD("glGenTextures V = " + _vtid);
    }
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _vtid);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, _video_width / 2, _video_height / 2, 0,
            GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, v);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
}

/**
 * render the frame; the YUV data will be converted to RGB by the shader.
 */
public void drawFrame() {
    GLES20.glUseProgram(_program);
    checkGlError("glUseProgram");

    GLES20.glVertexAttribPointer(_positionHandle, 2, GLES20.GL_FLOAT, false, 8, _vertice_buffer);
    checkGlError("glVertexAttribPointer mPositionHandle");
    GLES20.glEnableVertexAttribArray(_positionHandle);

    GLES20.glVertexAttribPointer(_coordHandle, 2, GLES20.GL_FLOAT, false, 8, _coord_buffer);
    checkGlError("glVertexAttribPointer maTextureHandle");
    GLES20.glEnableVertexAttribArray(_coordHandle);

    // bind textures
    GLES20.glActiveTexture(_textureI);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _ytid);
    GLES20.glUniform1i(_yhandle, _tIindex);

    GLES20.glActiveTexture(_textureII);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _utid);
    GLES20.glUniform1i(_uhandle, _tIIindex);

    GLES20.glActiveTexture(_textureIII);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _vtid);
    GLES20.glUniform1i(_vhandle, _tIIIindex);

    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    GLES20.glFinish();

    GLES20.glDisableVertexAttribArray(_positionHandle);
    GLES20.glDisableVertexAttribArray(_coordHandle);
}

/**
 * create the program and load the shaders; the fragment shader is the important one.
 */
public int createProgram(String vertexSource, String fragmentSource) {
    // create shaders
    int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);
    int pixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
    // just check
    Utils.LOGD("vertexShader = " + vertexShader);
    Utils.LOGD("pixelShader = " + pixelShader);

    int program = GLES20.glCreateProgram();
    if (program != 0) {
        GLES20.glAttachShader(program, vertexShader);
        checkGlError("glAttachShader");
        GLES20.glAttachShader(program, pixelShader);
        checkGlError("glAttachShader");
        GLES20.glLinkProgram(program);
        int[] linkStatus = new int[1];
        GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
        if (linkStatus[0] != GLES20.GL_TRUE) {
            Utils.LOGE("Could not link program: ", null);
            Utils.LOGE(GLES20.glGetProgramInfoLog(program), null);
            GLES20.glDeleteProgram(program);
            program = 0;
        }
    }
    return program;
}

/**
 * create a shader with the given source.
 */
private int loadShader(int shaderType, String source) {
    int shader = GLES20.glCreateShader(shaderType);
    if (shader != 0) {
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        int[] compiled = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            Utils.LOGE("Could not compile shader " + shaderType + ":", null);
            Utils.LOGE(GLES20.glGetShaderInfoLog(shader), null);
            GLES20.glDeleteShader(shader);
            shader = 0;
        }
    }
    return shader;
}

/**
 * these two buffers hold the vertices: screen vertices and texture coordinates.
 */
private void createBuffers(float[] vert, float[] coord) {
    _vertice_buffer = ByteBuffer.allocateDirect(vert.length * 4);
    _vertice_buffer.order(ByteOrder.nativeOrder());
    _vertice_buffer.asFloatBuffer().put(vert);
    _vertice_buffer.position(0);

    if (_coord_buffer == null) {
        _coord_buffer = ByteBuffer.allocateDirect(coord.length * 4);
        _coord_buffer.order(ByteOrder.nativeOrder());
        _coord_buffer.asFloatBuffer().put(coord);
        _coord_buffer.position(0);
    }
}

private void checkGlError(String op) {
    int error;
    while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
        Utils.LOGE("***** " + op + ": glError " + error, null);
        throw new RuntimeException(op + ": glError " + error);
    }
}

private static float[] squareVertices = { -1.0f, -1.0f, 1.0f, -1.0f, -1.0f, 1.0f, 1.0f, 1.0f, }; // fullscreen

private static float[] coordVertices = { 0.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 0.0f, }; // whole-texture

private static final String VERTEX_SHADER = "attribute vec4 vPosition;\n"
        + "attribute vec2 a_texCoord;\n"
        + "varying vec2 tc;\n"
        + "void main() {\n"
        + "  gl_Position = vPosition;\n"
        + "  tc = a_texCoord;\n"
        + "}\n";

private static final String FRAGMENT_SHADER = "precision mediump float;\n"
        + "uniform sampler2D tex_y;\n"
        + "uniform sampler2D tex_u;\n"
        + "uniform sampler2D tex_v;\n"
        + "varying vec2 tc;\n"
        + "void main() {\n"
        + "  vec4 c = vec4((texture2D(tex_y, tc).r - 16./255.) * 1.164);\n"
        + "  vec4 U = vec4(texture2D(tex_u, tc).r - 128./255.);\n"
        + "  vec4 V = vec4(texture2D(tex_v, tc).r - 128./255.);\n"
        + "  c += V * vec4(1.596, -0.813, 0, 0);\n"
        + "  c += U * vec4(0, -0.392, 2.017, 0);\n"
        + "  c.a = 1.0;\n"
        + "  gl_FragColor = c;\n"
        + "}\n";

This code is more involved, so a brief explanation:
1. buildProgram() produces a program whose job is the YUV → RGB conversion. It uses two shaders (a shader is a small computation unit that runs a piece of code on the GPU): the first runs VERTEX_SHADER and passes the coordinates through to the second; the second does the actual YUV → RGB math.
2. buildTextures() creates three textures, one each for the Y, U and V planes; sampled together in the fragment shader, they produce the color image.
3. drawFrame() runs the program, which is where the drawing actually happens.
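As a sanity check on the fragment shader's constants: they are the standard BT.601 "video range" coefficients (Y in 16..235, U/V centered at 128). The same math on the CPU looks like this; this is my reference sketch of the shader's formula, not code from the article:

```java
// CPU reference for the fragment shader's BT.601 video-range conversion:
//   R = 1.164*(Y-16) + 1.596*(V-128)
//   G = 1.164*(Y-16) - 0.813*(V-128) - 0.392*(U-128)
//   B = 1.164*(Y-16) + 2.017*(U-128)
public class Yuv2Rgb {
    static int clamp(double x) {
        return (int) Math.max(0, Math.min(255, Math.round(x)));
    }

    // Returns {r, g, b} for one pixel; y/u/v are 0..255 as stored in the planes.
    static int[] convert(int y, int u, int v) {
        double c = 1.164 * (y - 16); // scaled luma, shared by all three channels
        double d = u - 128;          // blue-difference chroma
        double e = v - 128;          // red-difference chroma
        return new int[] {
            clamp(c + 1.596 * e),
            clamp(c - 0.813 * e - 0.392 * d),
            clamp(c + 2.017 * d)
        };
    }
}
```

With neutral chroma (U = V = 128) the chroma terms vanish and the output is a gray level of 1.164 * (Y - 16), which matches the shader's first line.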
 
With this in place, you can display YUV images, or video, on Android, and at a decent speed, too. I hope it helps.
 
Download link for the related code:
http://download.csdn.net/detail/ueryueryuery/7144851
 
[Displaying YUV Data on Android with OpenGL ES 2.0] This article comes from: http://blog.csdn.net/ueryueryuery/article/details/17608185#comments
