
Android: recording OpenGL-rendered content to an MP4 with MediaCodec

程序员文章站 2022-11-21 16:46:55

Demo

[screenshots of the recorded MP4 playing back]

Full source (uploaded to my GitHub):

https://github.com/xiaxveliang/GL_AUDIO_VIDEO_RECODE

Reference:

http://bigflake.com/mediacodec/EncodeAndMuxTest.java.txt

I added brief annotations to the code above; the annotated version follows:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLExt;
import android.opengl.EGLSurface;
import android.opengl.GLES20;
import android.os.Environment;
import android.test.AndroidTestCase;
import android.util.Log;
import android.view.Surface;

import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;

// Wiki:
// http://bigflake.com/mediacodec/EncodeAndMuxTest.java.txt

public class EncodeAndMuxTest extends AndroidTestCase {
    private static final String TAG = EncodeAndMuxTest.class.getSimpleName();

    // Output directory
    private static final File MY_OUTPUT_DIR = Environment.getExternalStorageDirectory();
    // H.264 encoding
    private static final String MY_MIME_TYPE = "video/avc";
    // Width and height of the video file
    private static final int MY_VIDEO_WIDTH = 480;
    private static final int MY_VIDEO_HEIGHT = 480;
    // Video bit rate
    private static final int MY_BIT_RATE = 800000;
    // 15 frames per second
    private static final int MY_FPS = 15;


    // 30 frames in total; at 15 fps that is 2 seconds of video
    private static final int NUM_FRAMES = 30;


    // RGB color values for generated frames
    private static final int TEST_R0 = 0;
    private static final int TEST_G0 = 136;
    private static final int TEST_B0 = 0;
    //
    private static final int TEST_R1 = 236;
    private static final int TEST_G1 = 50;
    private static final int TEST_B1 = 186;


    // Encoder / muxer state
    private MediaCodec mEncoder;
    // Muxes the H.264 stream into an MP4 container
    private MediaMuxer mMuxer;


    private CodecInputSurface mInputSurface;

    private int mTrackIndex;
    private boolean mMuxerStarted;

    // Allocate one of these up front so we don't need to do it every time
    private MediaCodec.BufferInfo mBufferInfo;


    /**
     * Entry point: draws frames into a buffer with OpenGL and records them as an MP4.
     */
    public void testEncodeVideoToMp4() {

        try {
            // Initialize the encoder
            initVideoEncoder();
            // Bind EGLDisplay dpy, EGLSurface draw, EGLSurface read, EGLContext ctx
            mInputSurface.makeCurrent();

            // 30 frames in total
            for (int i = 0; i < NUM_FRAMES; i++) {
                // Pull encoded data from mEncoder's output buffers and hand it to mMuxer
                drainEncoder(false);
                // Draw one frame with OpenGL
                generateSurfaceFrame(i);
                // Tell EGL the presentation time of this frame
                mInputSurface.setPresentationTime(computePresentationTimeNsec(i));
                // Submit it to the encoder
                mInputSurface.swapBuffers();
            }
            // Send end-of-stream to encoder, and drain remaining output
            drainEncoder(true);
        } finally {
            // Release encoder, muxer, and input surface
            releaseEncoder();
        }
    }

    /**
     * Initializes the video encoder.
     */
    private void initVideoEncoder() {
        // Allocate a reusable BufferInfo
        mBufferInfo = new MediaCodec.BufferInfo();

        //-----------------MediaFormat-----------------------
        // MediaCodec will encode with H.264
        MediaFormat format = MediaFormat.createVideoFormat(MY_MIME_TYPE, MY_VIDEO_WIDTH, MY_VIDEO_HEIGHT);
        // The input data comes from a Surface
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        // Video bit rate
        format.setInteger(MediaFormat.KEY_BIT_RATE, MY_BIT_RATE);
        // FPS
        format.setInteger(MediaFormat.KEY_FRAME_RATE, MY_FPS);
        // Key frame (I-frame) interval, in seconds
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);

        //-----------------Encoder-----------------------
        try {
            mEncoder = MediaCodec.createEncoderByType(MY_MIME_TYPE);
            mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            // Get the encoder's input Surface
            Surface surface = mEncoder.createInputSurface();
            // Wrap it in a CodecInputSurface, which holds the EGL state
            mInputSurface = new CodecInputSurface(surface);
            //
            mEncoder.start();
        } catch (Exception e) {
            e.printStackTrace();
        }

        //-----------------Output file path-----------------------
        String outputPath = new File(MY_OUTPUT_DIR,
                "test." + MY_VIDEO_WIDTH + "x" + MY_VIDEO_HEIGHT + ".mp4").toString();

        //-----------------MediaMuxer-----------------------
        try {
            // Mux into an MP4 container
            mMuxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException ioe) {
            throw new RuntimeException("MediaMuxer creation failed", ioe);
        }

        mTrackIndex = -1;
        mMuxerStarted = false;
    }

    /**
     * Releases encoder resources.  May be called after partial / failed initialization.
     */
    private void releaseEncoder() {
        if (mEncoder != null) {
            mEncoder.stop();
            mEncoder.release();
            mEncoder = null;
        }
        if (mInputSurface != null) {
            mInputSurface.release();
            mInputSurface = null;
        }
        if (mMuxer != null) {
            // stop() throws IllegalStateException if the muxer was never started
            if (mMuxerStarted) {
                mMuxer.stop();
            }
            mMuxer.release();
            mMuxer = null;
        }
    }


    /**
     * Pulls encoded data from the encoder's output buffers and writes it to the muxer.
     *
     * @param endOfStream whether to stop recording (signal end of stream)
     */
    private void drainEncoder(boolean endOfStream) {
        final int TIMEOUT_USEC = 10000;

        // Stop recording: signal end of stream to the encoder
        if (endOfStream) {
            mEncoder.signalEndOfInputStream();
        }
        // Get the output buffers that will hold the encoded data
        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
        while (true) {
            // Get the index of an output buffer with data available
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // No output available yet
                if (!endOfStream) {
                    break;      // out of while
                } else {
                    // No output available; keep spinning until end-of-stream arrives
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // The output buffers changed; fetch the new set
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Should happen before receiving buffers, and should only happen once
                if (mMuxerStarted) {
                    throw new RuntimeException("format changed twice");
                }
                //
                MediaFormat newFormat = mEncoder.getOutputFormat();
                // Now that we have the codec-specific data, start the muxer
                mTrackIndex = mMuxer.addTrack(newFormat);
                //
                mMuxer.start();
                mMuxerStarted = true;
            } else if (encoderStatus < 0) {
                // Unexpected status; ignore it
            } else {
                // Get the encoded data
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                            " was null");
                }
                // Codec config data (SPS/PPS) already reached the muxer via the
                // format; don't write it again as a sample
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    mBufferInfo.size = 0;
                }
                //
                if (mBufferInfo.size != 0) {
                    if (!mMuxerStarted) {
                        throw new RuntimeException("muxer hasn't started");
                    }
                    // Adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                    // Write the encoded sample into the MP4
                    mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
                }
                // Return the buffer to the encoder
                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (!endOfStream) {
                        Log.w(TAG, "reached end of stream unexpectedly");
                    }
                    break;      // out of while
                }
            }
        }
    }

    /**
     * Generates a frame of data using GL commands.  We have an 8-frame animation
     * sequence that wraps around.  It looks like this:
     * <pre>
     *   0 1 2 3
     *   7 6 5 4
     * </pre>
     * We draw one of the eight rectangles and leave the rest set to the clear color.
     */
    private void generateSurfaceFrame(int frameIndex) {
        frameIndex %= 8;

        int startX, startY;
        if (frameIndex < 4) {
            // (0,0) is bottom-left in GL
            startX = frameIndex * (MY_VIDEO_WIDTH / 4);
            startY = MY_VIDEO_HEIGHT / 2;
        } else {
            startX = (7 - frameIndex) * (MY_VIDEO_WIDTH / 4);
            startY = 0;
        }

        GLES20.glClearColor(TEST_R0 / 255.0f, TEST_G0 / 255.0f, TEST_B0 / 255.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
        GLES20.glScissor(startX, startY, MY_VIDEO_WIDTH / 4, MY_VIDEO_HEIGHT / 2);
        GLES20.glClearColor(TEST_R1 / 255.0f, TEST_G1 / 255.0f, TEST_B1 / 255.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
    }

    /**
     * Generates the presentation time for frame N, in nanoseconds.
     * Frame N is displayed at N / MY_FPS seconds; multiplying by 10^9 converts
     * that to nanoseconds.  The multiplication is done before the division so
     * integer arithmetic doesn't truncate the result to zero.
     */
    private static long computePresentationTimeNsec(int frameIndex) {
        final long ONE_BILLION = 1000000000;
        return frameIndex * ONE_BILLION / MY_FPS;
    }


    /**
     * Holds state associated with a Surface used for MediaCodec encoder input.
     * <p>
     * The constructor takes a Surface obtained from MediaCodec.createInputSurface(), and uses that
     * to create an EGL window surface.  Calls to eglSwapBuffers() cause a frame of data to be sent
     * to the video encoder.
     * <p>
     * This object owns the Surface -- releasing this will release the Surface too.
     */
    private static class CodecInputSurface {
        private static final int EGL_RECORDABLE_ANDROID = 0x3142;

        private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
        private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;

        private Surface mSurface;

        /**
         * Creates a CodecInputSurface from a Surface.
         */
        public CodecInputSurface(Surface surface) {
            if (surface == null) {
                throw new NullPointerException();
            }
            mSurface = surface;

            initEGL();
        }

        /**
         * Initializes EGL.
         */
        private void initEGL() {

            //--------------------mEGLDisplay-----------------------
            // Get the EGL display
            mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
            // Error check
            if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
                throw new RuntimeException("unable to get EGL14 display");
            }
            // Initialize
            int[] version = new int[2];
            if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
                throw new RuntimeException("unable to initialize EGL14");
            }

            // Configure EGL for recording and OpenGL ES 2.0.
            int[] attribList = {
                    EGL14.EGL_RED_SIZE, 8,
                    EGL14.EGL_GREEN_SIZE, 8,
                    EGL14.EGL_BLUE_SIZE, 8,
                    EGL14.EGL_ALPHA_SIZE, 8,
                    //
                    EGL14.EGL_RENDERABLE_TYPE,
                    EGL14.EGL_OPENGL_ES2_BIT,
                    // Make the surface recordable on Android
                    EGL_RECORDABLE_ANDROID,
                    1,
                    EGL14.EGL_NONE
            };
            EGLConfig[] configs = new EGLConfig[1];
            int[] numConfigs = new int[1];
            // Choose an RGB888 + recordable + ES2 config
            EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length, numConfigs, 0);

            // Configure context for OpenGL ES 2.0.
            int[] attrib_list = {
                    EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                    EGL14.EGL_NONE
            };
            //--------------------mEGLContext-----------------------
            // eglCreateContext
            mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                    attrib_list, 0);
            checkEglError("eglCreateContext");

            //--------------------mEGLSurface-----------------------
            // Create a window surface bound to the Surface we received,
            // which came from mEncoder.createInputSurface()
            int[] surfaceAttribs = {
                    EGL14.EGL_NONE
            };
            // eglCreateWindowSurface
            mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                    surfaceAttribs, 0);
            checkEglError("eglCreateWindowSurface");
        }

        /**
         * Discards all resources held by this class, notably the EGL context.  Also releases the
         * Surface that was passed to our constructor.
         */
        public void release() {
            if (mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
                EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                        EGL14.EGL_NO_CONTEXT);
                EGL14.eglDestroySurface(mEGLDisplay, mEGLSurface);
                EGL14.eglDestroyContext(mEGLDisplay, mEGLContext);
                EGL14.eglReleaseThread();
                EGL14.eglTerminate(mEGLDisplay);
            }

            mSurface.release();

            mEGLDisplay = EGL14.EGL_NO_DISPLAY;
            mEGLContext = EGL14.EGL_NO_CONTEXT;
            mEGLSurface = EGL14.EGL_NO_SURFACE;

            mSurface = null;
        }

        /**
         * Makes our EGL context and surface current.
         * Binds EGLDisplay dpy, EGLSurface draw, EGLSurface read, EGLContext ctx.
         */
        public void makeCurrent() {
            EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);
            checkEglError("eglMakeCurrent");
        }

        /**
         * Calls eglSwapBuffers.  Use this to "publish" the current frame.
         */
        public boolean swapBuffers() {
            boolean result = EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);
            checkEglError("eglSwapBuffers");
            return result;
        }

        /**
         * Sends the presentation time stamp to EGL.  Time is expressed in nanoseconds.
         */
        public void setPresentationTime(long nsecs) {
            // Set the presentation timestamp EGL attaches to the current frame
            EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEGLSurface, nsecs);
            checkEglError("eglPresentationTimeANDROID");
        }

        /**
         * Checks for EGL errors.  Throws an exception if one is found.
         */
        private void checkEglError(String msg) {
            int error;
            if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
                throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
            }
        }
    }
}
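The per-frame math in the listing (the presentation timestamps and the position of the animated rectangle) can be checked without any Android dependencies. Below is a standalone sketch of the same arithmetic using the listing's constants (15 fps, 480x480, 8-frame animation); the `FrameMath` class and its method names are illustrative, not part of the original code.

```java
public class FrameMath {
    static final int FPS = 15;     // MY_FPS in the listing
    static final int WIDTH = 480;  // MY_VIDEO_WIDTH
    static final int HEIGHT = 480; // MY_VIDEO_HEIGHT

    // Same formula as computePresentationTimeNsec: frame N is shown at
    // N / FPS seconds; multiply by 1e9 first so integer division
    // does not truncate to zero.
    static long presentationTimeNsec(int frameIndex) {
        final long ONE_BILLION = 1_000_000_000L;
        return frameIndex * ONE_BILLION / FPS;
    }

    // Same position logic as generateSurfaceFrame: 8 rectangles, top row
    // left-to-right (frames 0-3), bottom row right-to-left (frames 4-7).
    // Returns {x, y} of the scissor rect's bottom-left corner.
    static int[] rectOrigin(int frameIndex) {
        frameIndex %= 8;
        if (frameIndex < 4) {
            // (0,0) is bottom-left in GL, so the first row is the top half
            return new int[]{frameIndex * (WIDTH / 4), HEIGHT / 2};
        }
        return new int[]{(7 - frameIndex) * (WIDTH / 4), 0};
    }

    public static void main(String[] args) {
        // Frame 15 at 15 fps lands exactly on the 1-second mark
        System.out.println(presentationTimeNsec(15));    // 1000000000
        // Frame 30 wraps the 8-frame animation to rectangle 6
        int[] origin = rectOrigin(30);
        System.out.println(origin[0] + "," + origin[1]); // 120,0
    }
}
```

This also makes it clear why the 30-frame recording is 2 seconds long: the last frame's timestamp is 29/15 s, and each frame advances the scissor rectangle one step around the 8-position loop.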

