Android’s media playback capabilities have always been an annoying issue. Either you use the high-level MediaPlayer API, where control mostly amounts to supplying a URL to play, or you use JNI to write low-level C code to do all the work (as of 2014-01-20, Android Studio doesn’t support the NDK). Since Android 4.1 (API 16), there’s a MediaCodec API which gives access to the decoding process from Java code. The concept is as follows (a rough code sketch follows the list):
- use MediaExtractor to demux the stream
- decode the buffers using MediaCodec
- render into the surface
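In code it boils down to a decode loop along these lines. This is a minimal sketch with no error handling; the `SurfaceDecoder`/`decodeToSurface` names and the 10 ms timeouts are my own choices, not anything mandated by the API:

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;

// Hypothetical helper: decode the first video track of `path` into `surface`.
public class SurfaceDecoder {

    public static void decodeToSurface(String path, Surface surface) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        // Find the first video track and select it for demuxing.
        MediaFormat format = null;
        String mime = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            String m = f.getString(MediaFormat.KEY_MIME);
            if (m.startsWith("video/")) {
                extractor.selectTrack(i);
                format = f;
                mime = m;
                break;
            }
        }
        if (format == null) return; // no video track

        MediaCodec codec = MediaCodec.createDecoderByType(mime);
        codec.configure(format, surface, null, 0); // decode straight into the surface
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean outputDone = false;

        while (!outputDone) {
            // Feed one demuxed sample into the decoder when an input buffer is free.
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    int size = extractor.readSampleData(inputBuffers[inIndex], 0);
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            // Pull a decoded frame and hand it to the surface.
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                codec.releaseOutputBuffer(outIndex, true); // true = render it
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }

        codec.stop();
        codec.release();
        extractor.release();
    }
}
```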
This sounds great… until you start using it. Unfortunately, there are several shortcomings to that API:
- no control over the buffers. MediaCodec has getInputBuffers() and getOutputBuffers() methods which return arrays of buffers whose number and size cannot be chosen. On a Nexus 5, the former holds 4 buffers and the latter 19 (see the sketch after this list). The only option when you need more buffering is to copy the data around, which incurs a performance cost.
- MediaExtractor gives no access to the data-only portion of the stream. It’s basically designed to feed its output straight into MediaCodec. Feeding it a stream composed of multiple parts (as in the case of HTTP Live Streaming) is impossible, so you need your own custom parser.
- There’s no control over the rendering. You cannot touch the decoded buffers; you can only ask the codec to release them into a surface.
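To make the first and last points concrete, here is a tiny sketch (my own hypothetical `dumpBufferCounts` helper with a hard-coded video/avc format, not an official sample) showing that you simply get whatever buffer arrays the codec allocated, and that the render boolean on releaseOutputBuffer() is the only rendering knob available:

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.util.Log;

public class BufferPeek {

    // Hypothetical helper: print how many buffers the codec decided to allocate.
    public static void dumpBufferCounts() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(format, null, null, 0);
        codec.start();

        // The counts are decided by the codec (4 and 19 on a Nexus 5);
        // there is no API to ask for more, so extra buffering means copying.
        ByteBuffer[] in = codec.getInputBuffers();
        ByteBuffer[] out = codec.getOutputBuffers();
        Log.d("codec", "input buffers: " + in.length
                + ", output buffers: " + out.length);

        // And on the output side the only choice is render-or-drop:
        //   codec.releaseOutputBuffer(index, true);  // render to the surface passed to configure()
        //   codec.releaseOutputBuffer(index, false); // silently drop the frame

        codec.stop();
        codec.release();
    }
}
```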
It looks like this API was mostly designed for encoding media, with decoding added as a bonus. It should work fine for decoding a file; for network sources you’ll probably run into streaming issues. Anything more advanced requires diving into the complexities of the NDK and the OpenMAX API.
On a related note, this blog post has a good history of Android’s video decoding.
Also, it seems that Google is still making improvements in that area, so it might not stay half-assed forever.