
<p>Use the <a href="http://developer.android.com/reference/android/media/MediaCodec.html" rel="noreferrer">MediaCodec</a> API with <code>CONFIGURE_FLAG_ENCODE</code> to set it up as an encoder. No ffmpeg required :)</p>

<p>You've already found how to grab the screen in the other question you linked to; now you just need to feed each captured frame to <code>MediaCodec</code>, setting the appropriate format flags, timestamp, etc.</p>

<p><strong>EDIT:</strong> Sample code for this was hard to find, but <a href="https://android.googlesource.com/platform/cts/+/6289d680cb75fa5a985464b9f362d4a2a007a7bf%5E!/#F0" rel="noreferrer">here it is</a>, hat tip to Martin Storsjö. Quick API walkthrough:</p>

<pre><code>MediaFormat inputFormat = MediaFormat.createVideoFormat("video/avc", width, height);
inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 75);
inputFormat.setInteger("stride", stride);
inputFormat.setInteger("slice-height", sliceHeight);

// Need to find the name in the media codec list; it is chipset-specific
encoder = MediaCodec.createByCodecName("OMX.TI.DUCATI1.VIDEO.H264E");
encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

encoderInputBuffers = encoder.getInputBuffers();
encoderOutputBuffers = encoder.getOutputBuffers();

byte[] inputFrame = new byte[frameSize];

while ( ... have data ... ) {
    int inputBufIndex = encoder.dequeueInputBuffer(timeout);
    if (inputBufIndex &gt;= 0) {
        ByteBuffer inputBuf = encoderInputBuffers[inputBufIndex];
        inputBuf.clear();

        // HERE: fill in the input frame in the correct color format,
        // taking strides into account. This is an example for I420:
        for (int i = 0; i &lt; height; i++) {
            for (int j = 0; j &lt; width; j++) {
                inputFrame[ i * stride + j ] = ...;                                        // Y[i][j]
                inputFrame[ stride * sliceHeight + (i/2) * (stride/2) + j/2 ] = ...;       // U[i][j]
                inputFrame[ stride * sliceHeight * 5/4 + (i/2) * (stride/2) + j/2 ] = ...; // V[i][j]
            }
        }

        inputBuf.put(inputFrame);
        encoder.queueInputBuffer(inputBufIndex, 0 /* offset */, sampleSize, presentationTimeUs, 0);
    }

    int outputBufIndex = encoder.dequeueOutputBuffer(info, timeout);
    if (outputBufIndex &gt;= 0) {
        ByteBuffer outputBuf = encoderOutputBuffers[outputBufIndex];
        // HERE: read the encoded data out of outputBuf
        encoder.releaseOutputBuffer(outputBufIndex, false);
    } else {
        // Handle change of buffers, format, etc.
    }
}
</code></pre>

<p>There are also some <a href="http://code.google.com/p/android/issues/detail?id=37769" rel="noreferrer">open issues</a>.</p>

<p><strong>EDIT:</strong> You'd feed the data in as a byte buffer in one of the supported pixel formats, for example I420 or NV12. There is unfortunately no perfect way of determining which formats work on a particular device; however, the formats you can get from the camera typically also work with the encoder.</p>
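<p>The plane arithmetic in the fill loop above is easy to get wrong, so here is a small self-contained sketch of the I420 index math in plain Java (no Android dependencies). The <code>width</code>/<code>height</code>/<code>stride</code>/<code>sliceHeight</code> values are made-up examples; on a real device they come from the codec and may exceed the frame dimensions due to alignment padding:</p>

```java
// Sketch of the I420 plane math used when filling the encoder's input
// buffer. All concrete values are hypothetical examples; on a real
// device, stride and sliceHeight are reported by the codec.
public class I420Layout {
    // Byte offset of luma sample Y[row][col]
    static int yIndex(int row, int col, int stride) {
        return row * stride + col;
    }

    // Byte offset of chroma sample U[row][col]: the U plane starts after
    // the full Y plane (stride * sliceHeight) and is subsampled 2x2,
    // so its row stride is stride/2.
    static int uIndex(int row, int col, int stride, int sliceHeight) {
        return stride * sliceHeight + (row / 2) * (stride / 2) + (col / 2);
    }

    // Byte offset of chroma sample V[row][col]: the V plane follows the
    // U plane, i.e. it starts at 5/4 of the Y-plane size.
    static int vIndex(int row, int col, int stride, int sliceHeight) {
        return stride * sliceHeight * 5 / 4 + (row / 2) * (stride / 2) + (col / 2);
    }

    public static void main(String[] args) {
        int stride = 384, sliceHeight = 256; // padded by the codec (example values)

        // Total buffer size: Y plane plus two quarter-size chroma planes
        int frameSize = stride * sliceHeight * 3 / 2;
        System.out.println("frameSize = " + frameSize);                          // 147456
        System.out.println("Y(1,2) = " + yIndex(1, 2, stride));                  // 386
        System.out.println("U(2,4) = " + uIndex(2, 4, stride, sliceHeight));     // 98498
        System.out.println("V(2,4) = " + vIndex(2, 4, stride, sliceHeight));     // 123074
    }
}
```

<p>Note that this layout is for planar I420; NV12 instead interleaves U and V in a single half-size plane, so the chroma indexing differs.</p>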
 
