
Introduction to Android | How to use the MediaCodec codec

2022-01-27 00:08:55 Android_ Anzi

Android MediaCodec usage

Use the MediaCodec codec to decode: feed in H.264-formatted data, and deliver the decoded frames to a listener.

Below, MediaCodec is abbreviated as "the codec".

H.264 configuration

Create and configure the codec. When configuring it, if you build the MediaFormat object manually, be sure to set the "csd-0" and "csd-1" parameters, and they must match the frames you will receive.

Input data

When feeding data to the codec, if the input data goes through a queue, you need to monitor the state of that queue.

For example, if one frame temporarily occupies 1 MB of memory and the stream runs at 30 frames per second, the queue may temporarily hold 30 MB. When the temporary memory use grows too large, we need to take measures to reduce it. Hardware decoding with the codec depends on the phone's hardware: on a low-end phone, encoding and decoding may be slower than the rate at which data arrives. If we have to, we can discard the old data in the queue and enqueue the new data.
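The drop-the-oldest policy described above can be sketched as a small bounded queue. This is a minimal sketch, not the article's code: the class name `FrameQueue`, the explicit counter, and the capacity value are assumptions. The counter exists because `ConcurrentLinkedQueue.size()` is O(n), a point the decoder code below also warns about.

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical helper: a bounded frame queue that drops the OLDEST frames
// when the cap is reached, so memory stays around capacity * frameSize
// (e.g. 30 frames at ~1 MB each stays near 30 MB).
public class FrameQueue {
    private final Queue<byte[]> frames = new ConcurrentLinkedQueue<>();
    // ConcurrentLinkedQueue.size() traverses the queue, so track the count separately.
    private final AtomicInteger count = new AtomicInteger();
    private final int capacity;

    public FrameQueue(int capacity) {
        this.capacity = capacity;
    }

    public void offer(byte[] frame) {
        while (count.get() >= capacity) {
            if (frames.poll() != null) {   // discard the oldest frame
                count.decrementAndGet();
            }
        }
        frames.add(frame);
        count.incrementAndGet();
    }

    public byte[] poll() {
        byte[] f = frames.poll();
        if (f != null) count.decrementAndGet();
        return f;
    }

    public int size() { return count.get(); }
}
```

The decoder below instead clears the whole queue when it exceeds 30 entries; dropping only the oldest frames is a gentler variant of the same idea.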

Decoder performance

In scenes that demand low video latency, the codec may have no input buffer available, so mCodec.dequeueInputBuffer returns -1. To preserve real-time behavior, the input/output buffers can be forcibly released here with mCodec.flush().

Problems and exceptions

Problem 1 - Is there a fixed relationship between the number of MediaCodec inputs and outputs?

For MediaCodec, is there a fixed relationship between the number of inputs and the number of outputs? If we feed in 10 frames of data, how many outputs can we get?

Measurements show there is no 100% guarantee that the input and output counts are equal. For example, on a vivo X6 Plus, feeding 30 frames may yield 28 output frames, or 300 inputs may yield 298 outputs.

Exception 1 - dequeueInputBuffer(0) keeps returning -1

On some phones, after encoding/decoding has run for a long time, attempts to obtain an input buffer index may keep returning -1. For example, on a vivo X6 Plus, after running for about 20 minutes, mCodec.dequeueInputBuffer(0) kept returning -1.

Handling: if -1 keeps coming back, in synchronous mode try calling codec.flush(); in asynchronous mode try codec.flush() followed by codec.start().

Some phones decode slowly and may return -1 frequently. Do not call codec.flush() too often, or the display may glitch.
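The "flush only after -1 persists" policy above can be isolated into a small counter. `InputStallTracker` below is a hypothetical sketch (the name and threshold are assumptions): it only counts consecutive -1 results and signals when a flush is due; the actual `codec.flush()` (plus `codec.start()` in asynchronous mode) stays in the calling code.

```java
// Hypothetical helper: counts consecutive dequeueInputBuffer(0) == -1 results
// and signals when the codec should be flushed. The caller performs the flush:
//   synchronous mode:  codec.flush();
//   asynchronous mode: codec.flush(); codec.start();
public class InputStallTracker {
    private final int threshold;
    private int consecutiveFailures;

    public InputStallTracker(int threshold) { // e.g. 50, like INPUT_BUFFER_FULL_COUNT_MAX below
        this.threshold = threshold;
    }

    /** Feed each dequeueInputBuffer result; returns true when a flush is due. */
    public boolean onDequeueResult(int inputBufferIndex) {
        if (inputBufferIndex >= 0) {
            consecutiveFailures = 0;   // got a buffer, reset the counter
            return false;
        }
        consecutiveFailures++;
        if (consecutiveFailures > threshold) {
            consecutiveFailures = 0;   // reset so flushes are not issued back-to-back
            return true;
        }
        return false;
    }
}
```

Keeping the counter separate makes the "don't flush too often" rule testable without a real codec.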

Code example - synchronous decoding

This example encodes and decodes in synchronous mode.

Synchronous mode

import android.media.Image;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

import java.nio.ByteBuffer;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

/**
 * Decoder. Receives H.264 frames, decodes them with MediaCodec in
 * synchronous mode, and hands each decoded frame to a CodecListener.
 */
public class CodecDecoder {
    private static final String TAG = "CodecDecoder";

    private static final String MIME_TYPE = "video/avc";
    private static final String CSD0 = "csd-0";
    private static final String CSD1 = "csd-1";

    private static final int TIME_INTERNAL = 1;          // presentation-time step per frame
    private static final int DECODER_TIME_INTERNAL = 1;  // decoder loop sleep, in ms

    private MediaCodec mCodec;
    private long mCount = 0; // frame counter used to generate presentation timestamps

    // Buffer queue in front of the codec.
    // Its temporary memory use must be monitored in real time; if frames pile up here, an OOM is likely.
    private Queue<byte[]> data = null;

    private DecoderThread decoderThread;
    private CodecListener listener; // custom listener that receives each decoded frame

    public CodecDecoder() {
        data = new ConcurrentLinkedQueue<>();
    }

    public boolean isCodecCreated() {
        return mCodec != null;
    }

    public boolean createCodec(CodecListener listener, byte[] spsBuffer, byte[] ppsBuffer, int width, int height) {
        this.listener = listener;
        try {
            mCodec = MediaCodec.createDecoderByType(MIME_TYPE);
            MediaFormat mediaFormat = createVideoFormat(spsBuffer, ppsBuffer, width, height);
            mCodec.configure(mediaFormat, null, null, 0);
            mCodec.start();

            Log.d(TAG, "decoderThread mediaFormat in:" + mediaFormat);

            decoderThread = new DecoderThread();
            decoderThread.start();

            return true;
        } catch (Exception e) {
            e.printStackTrace();
            Log.e(TAG, "MediaCodec create error:" + e.getMessage());
            return false;
        }
    }

    private MediaFormat createVideoFormat(byte[] spsBuffer, byte[] ppsBuffer, int width, int height) {
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
        mediaFormat.setByteBuffer(CSD0, ByteBuffer.wrap(spsBuffer));
        mediaFormat.setByteBuffer(CSD1, ByteBuffer.wrap(ppsBuffer));
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        return mediaFormat;
    }

    private long lastInQueueTime = 0;

    // Feed in one H.264 frame; the queue is monitored here.
    public void addData(byte[] dataBuffer) {
        final long timeDiff = System.currentTimeMillis() - lastInQueueTime;
        if (timeDiff > 1) {
            lastInQueueTime = System.currentTimeMillis();
            // ConcurrentLinkedQueue.size() traverses the queue once; avoid it when the queue is huge.
            int queueSize = data.size();
            if (queueSize > 30) {
                data.clear();
                LogInFile.getLogger().e("frame queue over its limit, cleared automatically " + queueSize);
            }
            data.add(dataBuffer.clone());
            Log.e(TAG, "frame queue: added one frame");
        } else {
            LogInFile.getLogger().e("frame queue: adding too fast, frame skipped. timeDiff=" + timeDiff);
        }
    }

    public void destroyCodec() {
        if (mCodec != null) {
            try {
                mCount = 0;

                if (data != null) {
                    data.clear();
                    data = null;
                }

                if (decoderThread != null) {
                    decoderThread.stopThread();
                    decoderThread = null;
                }

                mCodec.stop();
                mCodec.release();
                mCodec = null;
            } catch (Exception e) {
                e.printStackTrace();
                Log.d(TAG, "destroyCodec exception:" + e.toString());
            }
        }
    }

    private class DecoderThread extends Thread {
        private final int INPUT_BUFFER_FULL_COUNT_MAX = 50;
        private boolean isRunning;
        private int inputBufferFullCount = 0; // consecutive times all input buffers were busy

        public void stopThread() {
            isRunning = false;
        }

        @Override
        public void run() {
            setName("CodecDecoder_DecoderThread-" + getId());
            isRunning = true;
            while (isRunning) {
                try {
                    if (data != null && !data.isEmpty()) {
                        int inputBufferIndex = mCodec.dequeueInputBuffer(0);
                        if (inputBufferIndex >= 0) {
                            byte[] buf = data.poll();
                            ByteBuffer inputBuffer = mCodec.getInputBuffer(inputBufferIndex);
                            if (null != inputBuffer) {
                                inputBuffer.clear();
                                inputBuffer.put(buf, 0, buf.length);
                                mCodec.queueInputBuffer(inputBufferIndex, 0,
                                        buf.length, mCount * TIME_INTERNAL, 0);
                                mCount++;
                            }
                            inputBufferFullCount = 0; // got a buffer, reset the counter
                        } else {
                            inputBufferFullCount++;
                            LogInFile.getLogger().e(TAG, "decoderThread inputBuffer full. inputBufferFullCount=" + inputBufferFullCount);
                            if (inputBufferFullCount > INPUT_BUFFER_FULL_COUNT_MAX) {
                                mCount = 0;
                                mCodec.flush(); // forcibly release all buffers
                                LogInFile.getLogger().e(TAG, "mCodec.flush()...");
                            }
                        }
                    }

                    // Drain all available output buffers.
                    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                    int outputBufferIndex = mCodec.dequeueOutputBuffer(bufferInfo, 0);
                    while (outputBufferIndex >= 0) {
                        final int index = outputBufferIndex;
                        Log.d(TAG, "releaseOutputBuffer " + Thread.currentThread().toString());
                        final ByteBuffer outputBuffer = byteBufferClone(mCodec.getOutputBuffer(index));
                        Image image = mCodec.getOutputImage(index);
                        if (null != image) {
                            // Extract NV21-format data from the Image.
                            final byte[] nv21 = ImageUtil.getDataFromImage(image, FaceDetectUtil.COLOR_FormatNV21);
                            final int imageWid = image.getWidth();
                            final int imageHei = image.getHeight();
                            // A new thread is spawned per frame to deliver the data - a spot worth optimizing.
                            new Thread(new Runnable() {
                                @Override
                                public void run() {
                                    listener.onDataDecoded(outputBuffer,
                                            mCodec.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT),
                                            nv21, imageWid, imageHei);
                                }
                            }).start();
                        } else {
                            listener.onDataDecoded(outputBuffer,
                                    mCodec.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT),
                                    new byte[]{0}, 0, 0);
                        }

                        try {
                            mCodec.releaseOutputBuffer(index, false);
                        } catch (IllegalStateException ex) {
                            Log.e(TAG, "releaseOutputBuffer ERROR", ex);
                        }
                        outputBufferIndex = mCodec.dequeueOutputBuffer(bufferInfo, 0);
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                    Log.e(TAG, "decoderThread exception:" + e.getMessage());
                }

                try {
                    Thread.sleep(DECODER_TIME_INTERNAL);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    // Deep-clone a ByteBuffer.
    private static ByteBuffer byteBufferClone(ByteBuffer buffer) {
        if (buffer.remaining() == 0)
            return ByteBuffer.wrap(new byte[]{0});

        ByteBuffer clone = ByteBuffer.allocate(buffer.remaining());

        if (buffer.hasArray()) {
            System.arraycopy(buffer.array(), buffer.arrayOffset() + buffer.position(), clone.array(), 0, buffer.remaining());
        } else {
            clone.put(buffer.duplicate());
            clone.flip();
        }

        return clone;
    }
}
Code example - utility functions

Some utility functions, for example extracting NV21-format data from an Image.

Utility functions

// Local format tags used by these helpers (values follow the common
// Android getDataFromImage sample).
private static final int COLOR_FormatI420 = 1;
private static final int COLOR_FormatNV21 = 2;

private byte[] getDataFromImage(Image image) {
    return getDataFromImage(image, COLOR_FormatNV21);
}

/**
 * Extract byte data from an Image according to colorFormat.
 */
private byte[] getDataFromImage(Image image, int colorFormat) {
    if (colorFormat != COLOR_FormatI420 && colorFormat != COLOR_FormatNV21) {
        throw new IllegalArgumentException("only support COLOR_FormatI420 " + "and COLOR_FormatNV21");
    }
    if (!isImageFormatSupported(image)) {
        throw new RuntimeException("can't convert Image to byte array, format " + image.getFormat());
    }
    Rect crop = image.getCropRect();
    int format = image.getFormat();
    int width = crop.width();
    int height = crop.height();
    Image.Plane[] planes = image.getPlanes();
    byte[] data = new byte[width * height * ImageFormat.getBitsPerPixel(format) / 8];
    byte[] rowData = new byte[planes[0].getRowStride()];
    int channelOffset = 0;
    int outputStride = 1;
    for (int i = 0; i < planes.length; i++) {
        switch (i) {
            case 0: // Y plane
                channelOffset = 0;
                outputStride = 1;
                break;
            case 1: // U plane
                if (colorFormat == COLOR_FormatI420) {
                    channelOffset = width * height;
                    outputStride = 1;
                } else if (colorFormat == COLOR_FormatNV21) {
                    channelOffset = width * height + 1;
                    outputStride = 2;
                }
                break;
            case 2: // V plane
                if (colorFormat == COLOR_FormatI420) {
                    channelOffset = (int) (width * height * 1.25);
                    outputStride = 1;
                } else if (colorFormat == COLOR_FormatNV21) {
                    channelOffset = width * height;
                    outputStride = 2;
                }
                break;
        }
        ByteBuffer buffer = planes[i].getBuffer();
        int rowStride = planes[i].getRowStride();
        int pixelStride = planes[i].getPixelStride();

        int shift = (i == 0) ? 0 : 1; // chroma planes are subsampled by 2
        int w = width >> shift;
        int h = height >> shift;
        buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));
        for (int row = 0; row < h; row++) {
            int length;
            if (pixelStride == 1 && outputStride == 1) {
                length = w;
                buffer.get(data, channelOffset, length);
                channelOffset += length;
            } else {
                length = (w - 1) * pixelStride + 1;
                buffer.get(rowData, 0, length);
                for (int col = 0; col < w; col++) {
                    data[channelOffset] = rowData[col * pixelStride];
                    channelOffset += outputStride;
                }
            }
            if (row < h - 1) {
                buffer.position(buffer.position() + rowStride - length);
            }
        }
    }
    return data;
}

/**
 * Whether the image format is supported.
 */
private static boolean isImageFormatSupported(Image image) {
    int format = image.getFormat();
    switch (format) {
        case ImageFormat.YUV_420_888:
        case ImageFormat.NV21:
        case ImageFormat.YV12:
            return true;
    }
    return false;
}

"csd-0" and "csd-1" What is it? , about H264 Video words , It corresponds to sps and pps, about AAC Audio words , The corresponding is ADTS, People who do audio and video development should know , It generally exists in the... Generated by the encoder IDR In frame .

The resulting mediaFormat

mediaFormat in:{height=720, width=1280, csd-1=java.nio.ByteArrayBuffer[position=0,limit=7,capacity=7], mime=video/avc, csd-0=java.nio.ByteArrayBuffer[position=0,limit=13,capacity=13], color-format=2135033992}

Note that color-format 2135033992 is 0x7F420888, i.e. COLOR_FormatYUV420Flexible.

How to store pictures

The Image class, available since Android API 21, is quite powerful.

Use the Image class to save pictures

private static void dumpFile(String fileName, byte[] data) {
    FileOutputStream outStream;
    try {
        outStream = new FileOutputStream(fileName);
    } catch (IOException ioe) {
        throw new RuntimeException("rustfisher: Unable to create output file " + fileName, ioe);
    }
    try {
        outStream.write(data);
        outStream.close();
    } catch (IOException ioe) {
        throw new RuntimeException("rustfisher: failed writing data to file " + fileName, ioe);
    }
}

private void compressToJpeg(String fileName, Image image) {
    FileOutputStream outStream;
    try {
        outStream = new FileOutputStream(fileName);
    } catch (IOException ioe) {
        throw new RuntimeException("rustfisher: Unable to create output file " + fileName, ioe);
    }
    Rect rect = image.getCropRect();
    YuvImage yuvImage = new YuvImage(getDataFromImage(image, COLOR_FormatNV21), ImageFormat.NV21, rect.width(), rect.height(), null);
    yuvImage.compressToJpeg(rect, 100, outStream);
    try {
        outStream.close();
    } catch (IOException ioe) {
        // ignore failure to close
    }
}

Methods to convert NV21 to a Bitmap

Writing directly to a file

Save NV21 as a jpg file

// inside a try/catch
FileOutputStream fos = new FileOutputStream(Environment.getExternalStorageDirectory() + "/rustfisher.jpg");
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, fos);
fos.close();

A method to get a Bitmap object; it is both time-consuming and memory-hungry.

NV21 -> yuvImage -> jpeg -> bitmap

// inside a try/catch
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream os = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, os);
byte[] jpegByteArray = os.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegByteArray, 0, jpegByteArray.length);
os.close();

The OutputBuffer problem when the codec outputs in a YUV420 format

Suppose the codec is configured with COLOR_FormatYUV420Flexible:

mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);

After decoding, the resulting format is COLOR_QCOM_FormatYUV420SemiPlanar32m (0x7FA30C04), as reported by:

mCodec.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT)

The decoded buffer ByteBuffer oBuffer = mCodec.getOutputBuffer(index); contains 1413120 bytes; call it nv21Codec. The nv21 array obtained from the image object returned by mCodec.getOutputImage(index) contains 1382400 bytes; call it nv21 - this is the data we want. (For 1280x720, that is Y = 1280*720 = 921600 bytes plus UV = 460800 bytes, i.e. width*height*3/2.)

Comparing the two arrays, the leading Y parts are identical. In nv21, the first 921600 bytes are Y data and the following 460800 bytes are UV data. In nv21Codec, the first 921600 bytes are Y data, the next 20480 bytes are zeros, then come 460800 bytes of UV data, and the final 10240 bytes are zeros.

The UV parts of nv21 and nv21Codec are stored in the opposite interleaving order.
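Since only the interleaving order of the chroma pairs differs (NV12-style buffers store U,V,U,V... while NV21 stores V,U,V,U...), swapping each chroma byte pair in place converts one layout into the other. This is a sketch under the assumption that the buffer is tightly packed at width*height*3/2 bytes, i.e. padding like the zero regions observed above has already been stripped; `YuvUtil` is a hypothetical name.

```java
// Sketch: in-place NV12 <-> NV21 conversion for a tightly packed
// width*height*3/2 buffer. The Y plane is left untouched; every
// interleaved chroma byte pair after it is swapped.
public class YuvUtil {
    public static void swapUv(byte[] yuv, int width, int height) {
        int ySize = width * height; // e.g. 1280*720 = 921600
        for (int i = ySize; i + 1 < yuv.length; i += 2) {
            byte tmp = yuv[i];
            yuv[i] = yuv[i + 1];
            yuv[i + 1] = tmp;
        }
    }
}
```

Because the swap is its own inverse, calling it twice restores the original buffer.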


Copyright notice
Author: Android_ Anzi. Please include the original link when reprinting.
https://en.cdmana.com/2022/01/202201270008522684.html
