Java自学者论坛

Android video communication: a low-latency solution

    Posted on 2021-8-27 13:01:22

    Background:

      The project required video communication: forwarding device A's camera feed to device B.

    Deployment:

      APP1: encodes the frames captured by the camera and sends them to the server

      APP2: pulls the data from the server, then decodes and displays it

      Server: receives the data submitted by APP1 and forwards it to APP2

    Implementation notes:

      APP1:

            camera = Camera.open(Camera.CameraInfo.CAMERA_FACING_FRONT);
            Camera.Parameters parameters = camera.getParameters();
            parameters.setPreviewFormat(ImageFormat.NV21);
            parameters.setPreviewSize(width, height);

            // Raise the exposure compensation (the original comment said "screen
            // brightness", but this call actually adjusts camera exposure).
            parameters.setExposureCompensation(parameters.getMaxExposureCompensation() / 2);
            camera.setParameters(parameters);
            camera.setDisplayOrientation(90);
            camera.setPreviewCallback(new Camera.PreviewCallback() {
                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    // Grab the frame and record its capture time; the decoder needs the
                    // timestamp to keep playback continuous, smooth, and artifact-free.
                    // Note: MediaCodec timestamps are in microseconds, while
                    // System.nanoTime() returns nanoseconds, so converting before
                    // queueing is advisable.
                    stamptime = System.nanoTime();
                    yuv_data = data;
                }
            });
    public class AvcKeyFrameEncoder {
        private final static String TAG = "MediaCodec";
        private int TIMEOUT_USEC = 12000;

        private MediaCodec mediaCodec;
        int m_width;
        int m_height;
        int m_framerate;

        public byte[] configbyte;

        // Latest captured frame waiting to be encoded (written by the preview callback).
        public byte[] yuv_data = null;
        public long stamptime = 0;

        public AvcKeyFrameEncoder(int width, int height, int framerate) {
            m_width = width;
            m_height = height;
            m_framerate = framerate;

            // The encoded output is landscape by default, because the camera itself
            // captures landscape frames.
            // MediaFormat mediaFormat = MediaFormat.createVideoFormat(mime, width, height);
            // If you rotate 90 or 270 degrees, swap width and height, otherwise the
            // picture is corrupted: a 320x240 image rotated 90 degrees becomes 240x320.
            MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000); // ~125 kbps
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, framerate); // 30
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            try {
                mediaCodec = MediaCodec.createEncoderByType("video/avc");
            } catch (IOException e) {
                e.printStackTrace();
            }

            // Configure the encoder.
            mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

            // Start the encoder.
            mediaCodec.start();
        }

        public void StopEncoder() {
            try {
                mediaCodec.stop();
                mediaCodec.release();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public boolean isRuning = false;

        public void StartEncoderThread(final ISaveVideo saveVideo, final ICall callback) {
            isRuning = true;
            new Thread(new Runnable() {
                @Override
                public void run() {
                    byte[] input = null;
                    long pts = 0;
                    while (isRuning) {
                        // Busy-wait until the preview callback has published a frame.
                        if (yuv_data == null) {
                            continue;
                        }

                        // Take the pending frame and its capture timestamp.
                        input = yuv_data;
                        pts = stamptime;
                        yuv_data = null;
                        byte[] yuv420sp = new byte[m_width * m_height * 3 / 2];

                        NV21ToNV12(input, yuv420sp, m_width, m_height);
                        input = yuv420sp;

                        try {
                            // Encoder input buffers.
                            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();

                            // Encoder output buffers.
                            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
                            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
                            if (inputBufferIndex >= 0) {
                                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                                inputBuffer.clear();
                                // Feed the converted YUV420 frame to the encoder.
                                inputBuffer.put(input);
                                mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, pts, 0);
                            }

                            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
                            while (outputBufferIndex >= 0) {
                                //Log.i("AvcEncoder", "Get H264 Buffer Success! flag = "+bufferInfo.flags+",pts = "+bufferInfo.presentationTimeUs+"");
                                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                                byte[] outData = new byte[bufferInfo.size];
                                outputBuffer.get(outData);
                                // flags is a bit mask, so test it with a bitwise AND rather
                                // than equality.
                                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                                    // SPS/PPS configuration data; cache it so it can be
                                    // prepended to every keyframe.
                                    configbyte = outData;
                                } else if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0) {
                                    // Copy the encoded keyframe out of the output buffer,
                                    // prefixed with the cached SPS/PPS.
                                    byte[] keyframe = new byte[bufferInfo.size + configbyte.length];
                                    System.arraycopy(configbyte, 0, keyframe, 0, configbyte.length);
                                    System.arraycopy(outData, 0, keyframe, configbyte.length, outData.length);

                                    Logs.i("Uploading I-frame " + keyframe.length);
                                    // Packet layout: [1-byte type][8-byte pts][4-byte length][payload].
                                    byte[] send_data = new byte[13 + keyframe.length];
                                    System.arraycopy(new byte[]{0x01}, 0, send_data, 0, 1);
                                    System.arraycopy(IntBytes.longToBytes(pts), 0, send_data, 1, 8);
                                    System.arraycopy(IntBytes.intToByteArray(keyframe.length), 0, send_data, 9, 4);
                                    System.arraycopy(keyframe, 0, send_data, 13, keyframe.length);
                                    if (saveVideo != null) {
                                        saveVideo.SaveVideoData(send_data);
                                    }

                                    if (callback != null) {
                                        callback.callback(keyframe, pts);
                                    }
                                } else {
                                    byte[] send_data = new byte[13 + outData.length];
                                    System.arraycopy(new byte[]{0x02}, 0, send_data, 0, 1);
                                    System.arraycopy(IntBytes.longToBytes(pts), 0, send_data, 1, 8);
                                    System.arraycopy(IntBytes.intToByteArray(outData.length), 0, send_data, 9, 4);
                                    System.arraycopy(outData, 0, send_data, 13, outData.length);
                                    if (saveVideo != null) {
                                        saveVideo.SaveVideoData(send_data);
                                    }

                                    if (callback != null) {
                                        callback.callback(outData, pts);
                                    }
                                }

                                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
                            }

                        } catch (Throwable t) {
                            t.printStackTrace();
                            break;
                        }
                    }
                }
            }).start();
        }

        private void NV21ToNV12(byte[] nv21, byte[] nv12, int width, int height) {
            if (nv21 == null || nv12 == null) return;
            int framesize = width * height;
            // Copy the Y plane unchanged.
            System.arraycopy(nv21, 0, nv12, 0, framesize);
            // Swap each interleaved chroma pair: NV21 stores V,U while NV12 stores U,V.
            for (int j = framesize; j < framesize * 3 / 2; j += 2) {
                nv12[j] = nv21[j + 1];
                nv12[j + 1] = nv21[j];
            }
        }
    }
    The video encoder class (Encoder)
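The NV21-to-NV12 chroma swap can be sanity-checked without any Android dependency. Below is a standalone sketch of the conversion (class and method names are illustrative, not from the project), exercised on a tiny 2x2 frame: the Y plane is copied as-is and each interleaved V,U pair becomes U,V.

```java
public class Nv21ToNv12Check {
    // Standalone NV21 -> NV12 conversion: copy the Y plane, then swap each
    // interleaved chroma pair (NV21 stores V,U; NV12 stores U,V).
    static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
        int framesize = width * height;
        byte[] nv12 = new byte[nv21.length];
        System.arraycopy(nv21, 0, nv12, 0, framesize);
        for (int j = framesize; j < framesize * 3 / 2; j += 2) {
            nv12[j] = nv21[j + 1];
            nv12[j + 1] = nv21[j];
        }
        return nv12;
    }

    public static void main(String[] args) {
        // A 2x2 frame: 4 Y bytes followed by one V,U pair.
        byte[] nv21 = {10, 20, 30, 40, /* V */ 50, /* U */ 60};
        byte[] nv12 = nv21ToNv12(nv21, 2, 2);
        // Y plane unchanged, chroma order now U,V.
        System.out.println(nv12[4] + "," + nv12[5]); // prints "60,50"
    }
}
```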

    The class passes captured and encoded data outward through callback interfaces: a worker thread submits it to the server, or decodes and displays it locally so the encode/decode latency can be measured.

    An ArrayBlockingQueue<byte[]> H264Queue = new ArrayBlockingQueue<byte[]>(10); buffers the frames delivered through those interfaces; a background worker then decodes them or submits them to the server.
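That hand-off can be sketched as a plain producer/consumer pair in pure Java (class and method names here are illustrative, not from the project). The key property of the bounded queue is that a slow consumer makes the producer drop frames instead of letting end-to-end latency grow:

```java
import java.util.concurrent.ArrayBlockingQueue;

class FrameQueueDemo {
    // Drain the queue and return how many frames were taken.
    static int drain(ArrayBlockingQueue<byte[]> q) {
        int consumed = 0;
        // poll() returns null when the queue is empty.
        while (q.poll() != null) consumed++;
        return consumed;
    }

    public static void main(String[] args) {
        ArrayBlockingQueue<byte[]> h264Queue = new ArrayBlockingQueue<byte[]>(10);
        for (int i = 0; i < 5; i++) {
            // offer() fails instead of blocking when the queue is full, so a
            // slow consumer causes dropped frames rather than growing latency.
            h264Queue.offer(new byte[]{(byte) i});
        }
        System.out.println("consumed " + drain(h264Queue) + " frames"); // prints "consumed 5 frames"
    }
}
```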

      APP2: connects to the server and starts pulling data from an I-frame (the server buffers data starting at the most recent I-frame). The capture timestamp recorded earlier must be passed through as the timestamp argument (the fourth parameter) of the MediaCodec object's queueInputBuffer method.
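The IntBytes helper used throughout is referenced but never shown in the post. A plausible reconstruction with java.nio.ByteBuffer is sketched below; the big-endian byte order is an assumption, and whatever order is chosen must match on both peers:

```java
import java.nio.ByteBuffer;

// Hypothetical reconstruction of the IntBytes helper; not the project's source.
class IntBytes {
    public static byte[] longToBytes(long v) {
        return ByteBuffer.allocate(8).putLong(v).array(); // big-endian by default
    }

    public static long bytesToLong(byte[] b) {
        return ByteBuffer.wrap(b).getLong();
    }

    public static byte[] intToByteArray(int v) {
        return ByteBuffer.allocate(4).putInt(v).array();
    }

    public static int byteArrayToInt(byte[] b) {
        return ByteBuffer.wrap(b).getInt();
    }
}
```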

     Server: receives APP1's data frame by frame; it starts recording at an I-frame and discards anything that does not begin with one, keeping only one frame's worth of data at a time. In a loop it reads a frame, removes it from the buffer, and sends it to APP2.
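That server-side filtering can be sketched as follows. This is a hypothetical reconstruction, not the project's server code; it assumes the 13-byte packet layout the encoder writes ([1-byte frame type][8-byte pts][4-byte length][payload], with type 0x01 marking an I-frame), and it approximates "one frame at a time" with a small queue that is cleared whenever a new I-frame arrives, so a late-joining APP2 can always decode from the first packet it receives:

```java
import java.util.ArrayDeque;
import java.util.Deque;

class FrameRelay {
    private final Deque<byte[]> pending = new ArrayDeque<byte[]>();
    private boolean seenKeyFrame = false;

    // Called for every packet arriving from APP1.
    public void onFrame(byte[] packet) {
        boolean isKeyFrame = packet[0] == 0x01;
        if (isKeyFrame) {
            // A new GOP starts: drop older frames so the backlog stays minimal.
            pending.clear();
            seenKeyFrame = true;
        }
        if (seenKeyFrame) {
            pending.add(packet);
        }
        // Packets arriving before the first I-frame are discarded.
    }

    // Called by the sender loop: oldest buffered frame, or null if none.
    public byte[] nextForApp2() {
        return pending.poll();
    }
}
```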

    public class VideoDecoder {
        private Thread mDecodeThread;
        private MediaCodec mCodec;
        private boolean mStopFlag = false;
        private int Video_Width = 640;
        private int Video_Height = 480;
        private int FrameRate = 25;
        private Boolean isUsePpsAndSps = false;
        private ReceiveVideoThread runThread = null;
    
        public VideoDecoder(String ip, int port, byte type, int roomId){
            runThread = new ReceiveVideoThread(ip, port, type, roomId);
            new Thread(runThread).start();
        }
    
    public void InitReadData(Surface surface){
        try {
            // Create a decoder from the MIME type.
            mCodec = MediaCodec.createDecoderByType("video/avc");
        } catch (IOException e) {
            e.printStackTrace();
        }

        // Initialize the decoder's media format (the original comment said
        // "encoder", but this object configures the decoder).
        final MediaFormat mediaformat = MediaFormat.createVideoFormat("video/avc", Video_Width, Video_Height);

        // Set the frame rate.
        mediaformat.setInteger(MediaFormat.KEY_FRAME_RATE, FrameRate);

        //https://developer.android.com/reference/android/media/MediaFormat.html#KEY_MAX_INPUT_SIZE
        // configure() parameters:
        // format  - for a decoder, the format of the input data; for an encoder, the format of the output data.
        // surface - a Surface onto which decoded output is rendered.
        // crypto  - a MediaCrypto object if the media data needs to be decrypted.
        // flags   - pass CONFIGURE_FLAG_ENCODE when the object is configured as an encoder.
        mCodec.configure(mediaformat, surface, null, 0);
        startDecodingThread();
    }
    
        private void startDecodingThread() {
            mCodec.start();
    
            mDecodeThread = new Thread(new decodeH264Thread());
            mDecodeThread.start();
        }
    
        @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
        private class decodeH264Thread implements Runnable {
            @Override
            public void run() {
                try {
                    // saveDataLoop();
                    decodeLoop_New();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
    
        private void decodeLoop_New() {
            // Decoder input buffers.
            ByteBuffer[] inputBuffers = mCodec.getInputBuffers();
            // Metadata for each decoded buffer: offset and size of the valid data, flags, pts.
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            long timeoutUs = 1000;
            byte[] marker0 = new byte[]{0, 0, 0, 1};
            byte[] dummyFrame = new byte[]{0x00, 0x00, 0x01, 0x20};
            byte[] streamBuffer = null;
            while (true) {
                if (runThread.H264Queue.size() > 0) {
                    streamBuffer = runThread.H264Queue.poll();
                } else {
                    try {
                        Thread.sleep(20);
                    } catch (Exception ex) {
                    }

                    continue;
                }

                // The first 8 bytes carry the capture timestamp recorded by APP1.
                byte[] time_data = new byte[8];
                System.arraycopy(streamBuffer, 0, time_data, 0, 8);
                long pts = IntBytes.bytesToLong(time_data);
                byte[] video_data = new byte[streamBuffer.length - 8];
                System.arraycopy(streamBuffer, 8, video_data, 0, video_data.length);
                streamBuffer = video_data;

                Logs.i("Got streamBuffer " + streamBuffer.length + " pts " + pts);
                int bytes_cnt = 0;
                mStopFlag = false;
                while (!mStopFlag) {
                    bytes_cnt = streamBuffer.length;
                    if (bytes_cnt == 0) {
                        streamBuffer = dummyFrame;
                    }

                    int startIndex = 0;
                    int remaining = bytes_cnt;
                    while (true) {
                        if (remaining == 0 || startIndex >= remaining) {
                            break;
                        }
                        // Find the next Annex-B start code (00 00 00 01) to split NAL units.
                        int nextFrameStart = KMPMatch(marker0, streamBuffer, startIndex + 2, remaining);
                        if (nextFrameStart == -1) {
                            nextFrameStart = remaining;
                        }

                        int inIndex = mCodec.dequeueInputBuffer(timeoutUs);
                        if (inIndex >= 0) {
                            ByteBuffer byteBuffer = inputBuffers[inIndex];
                            byteBuffer.clear();
                            byteBuffer.put(streamBuffer, startIndex, nextFrameStart - startIndex);
                            // After filling the input buffer at this index, hand it to the decoder.
                            mCodec.queueInputBuffer(inIndex, 0, nextFrameStart - startIndex, pts, 0);
                            startIndex = nextFrameStart;
                        } else {
                            continue;
                        }

                        int outIndex = mCodec.dequeueOutputBuffer(info, timeoutUs);
                        if (outIndex >= 0) {
                            // Frame pacing does not work here, because the raw H.264 stream
                            // carries no usable PTS.
                            /*
                            while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                                try {
                                    Thread.sleep(100);
                                } catch (InterruptedException e) {
                                    e.printStackTrace();
                                }
                            }
                            */
                            boolean doRender = (info.size != 0);

                            // TODO: add handling to save the raw frame data.
                            if (doRender) {
                                Image image = mCodec.getOutputImage(outIndex);
                                if (image != null) {
                                    // Forward the raw frame to the consumer interface.
                                    byte[] data = getDataFromImage(image, COLOR_FormatNV21);
                                }
                            }

                            // Once done with the output buffer, return it to the codec;
                            // doRender == true renders it to the configured Surface.
                            mCodec.releaseOutputBuffer(outIndex, doRender);
                        }
                    }
                    mStopFlag = true;
                }

                // Logs.i("Time to process one frame: " + (System.currentTimeMillis() - c_start));
            }
        }
        }
    
        private static final boolean VERBOSE = false;
        private static final long DEFAULT_TIMEOUT_US = 10000;
    
        private static final int COLOR_FormatI420 = 1;
        private static final int COLOR_FormatNV21 = 2;
    
        private static boolean isImageFormatSupported(Image image) {
            int format = image.getFormat();
            switch (format) {
                case ImageFormat.YUV_420_888:
                case ImageFormat.NV21:
                case ImageFormat.YV12:
                    return true;
            }
            return false;
        }
    
        @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
        private static byte[] getDataFromImage(Image image, int colorFormat) {
            if (colorFormat != COLOR_FormatI420 && colorFormat != COLOR_FormatNV21) {
                throw new IllegalArgumentException("only support COLOR_FormatI420 " + "and COLOR_FormatNV21");
            }
            if (!isImageFormatSupported(image)) {
                throw new RuntimeException("can't convert Image to byte array, format " + image.getFormat());
            }
    
            Rect crop = image.getCropRect();
            int format = image.getFormat();
            int width = crop.width();
            int height = crop.height();
            Image.Plane[] planes = image.getPlanes();
            byte[] data = new byte[width * height * ImageFormat.getBitsPerPixel(format) / 8];
            byte[] rowData = new byte[planes[0].getRowStride()];
            if (VERBOSE) Logs.i("get data from " + planes.length + " planes");
            int channelOffset = 0;
            int outputStride = 1;
            for (int i = 0; i < planes.length; i++) {
                switch (i) {
                    case 0:
                        channelOffset = 0;
                        outputStride = 1;
                        break;
                    case 1:
                        if (colorFormat == COLOR_FormatI420) {
                            channelOffset = width * height;
                            outputStride = 1;
                        } else if (colorFormat == COLOR_FormatNV21) {
                            channelOffset = width * height + 1;
                            outputStride = 2;
                        }
                        break;
                    case 2:
                        if (colorFormat == COLOR_FormatI420) {
                            channelOffset = (int) (width * height * 1.25);
                            outputStride = 1;
                        } else if (colorFormat == COLOR_FormatNV21) {
                            channelOffset = width * height;
                            outputStride = 2;
                        }
                        break;
                }
            ByteBuffer buffer = planes[i].getBuffer();
            int rowStride = planes[i].getRowStride();
            int pixelStride = planes[i].getPixelStride();
                if (VERBOSE) {
                    Logs.i("pixelStride " + pixelStride);
                    Logs.i("rowStride " + rowStride);
                    Logs.i("width " + width);
                    Logs.i("height " + height);
                    Logs.i("buffer size " + buffer.remaining());
                }
                int shift = (i == 0) ? 0 : 1;
                int w = width >> shift;
                int h = height >> shift;
                buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));
                for (int row = 0; row < h; row++) {
                    int length;
                    if (pixelStride == 1 && outputStride == 1) {
                        length = w;
                        buffer.get(data, channelOffset, length);
                        channelOffset += length;
                    } else {
                        length = (w - 1) * pixelStride + 1;
                        buffer.get(rowData, 0, length);
                        for (int col = 0; col < w; col++) {
                            data[channelOffset] = rowData[col * pixelStride];
                            channelOffset += outputStride;
                        }
                    }
                    if (row < h - 1) {
                        buffer.position(buffer.position() + rowStride - length);
                    }
                }
                if (VERBOSE) Logs.i("Finished reading data from plane " + i);
            }
            return data;
        }
    
        private int KMPMatch(byte[] pattern, byte[] bytes, int start, int remain) {
        // Throttles the search loop; kept from the original code, though it adds
        // up to 30 ms of latency per call and could likely be removed.
        try {
            Thread.sleep(30);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    
            int[] lsp = computeLspTable(pattern);
    
            int j = 0;  // Number of chars matched in pattern
        for (int i = start; i < remain; i++) {
            while (j > 0 && bytes[i] != pattern[j]) {
                // Fall back in the pattern
                j = lsp[j - 1];  // Strictly decreasing
            }
            if (bytes[i] == pattern[j]) {
                // Next char matched, increment position
                j++;
                if (j == pattern.length)
                    return i - (j - 1);
            }
        }
            return -1;  // Not found
        }
    
        private int[] computeLspTable(byte[] pattern) {
            int[] lsp = new int[pattern.length];
            lsp[0] = 0;  // Base case
            for (int i = 1; i < pattern.length; i++) {
                // Start by assuming we're extending the previous LSP
                int j = lsp[i - 1];
            while (j > 0 && pattern[i] != pattern[j])
                j = lsp[j - 1];
            if (pattern[i] == pattern[j])
                j++;
            lsp[i] = j;
            }
            return lsp;
        }
    
        public void StopDecode() {
            if(runThread != null){
                runThread.StopReceive();
            }
        }
    }
    The video decoder class (Decoder)
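The start-code search the decoder relies on can be exercised standalone. Below is a condensed version of the KMP matcher (without the Thread.sleep throttle; the class name is illustrative), locating the next Annex-B 00 00 00 01 marker in a byte stream:

```java
class StartCodeSearchDemo {
    // KMP search: return the index of the first occurrence of pattern in
    // bytes at or after start, or -1 if not found.
    static int indexOf(byte[] pattern, byte[] bytes, int start) {
        // Build the longest-suffix-prefix table.
        int[] lsp = new int[pattern.length];
        for (int i = 1; i < pattern.length; i++) {
            int j = lsp[i - 1];
            while (j > 0 && pattern[i] != pattern[j]) j = lsp[j - 1];
            if (pattern[i] == pattern[j]) j++;
            lsp[i] = j;
        }
        // Scan the stream, falling back via the table on mismatch.
        int j = 0;
        for (int i = start; i < bytes.length; i++) {
            while (j > 0 && bytes[i] != pattern[j]) j = lsp[j - 1];
            if (bytes[i] == pattern[j]) {
                j++;
                if (j == pattern.length) return i - (j - 1);
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        byte[] marker = {0, 0, 0, 1};
        byte[] stream = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 0, 1, 0x65};
        // Skip past the first start code and find the next NAL unit boundary.
        System.out.println(indexOf(marker, stream, 2)); // prints "6"
    }
}
```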

    Summary:

      Working through this video pipeline taught me a number of details of video handling, and gave me hands-on practice with these media APIs in a real Android project.

     
