Compile FFmpeg for Android
When you have to manipulate audio or video on Android and you’re used to open-source software, a single name immediately comes to mind: FFmpeg. However, FFmpeg is a C program, meant to be used as a standalone executable, and it doesn’t officially support Android.
There are a lot of partial and/or out-of-date how-tos out there on getting FFmpeg running on Android, like halfninja’s build. However, I needed FFmpeg’s concat demuxer, introduced in FFmpeg 1.1, while most builds target 0.9. There are tons of questions on StackOverflow about getting newer FFmpeg releases working on Android. So, here’s a full explanation of how to get FFmpeg 2.2.3 “Muybridge” working on Android. I’ll describe the steps for Linux, but everything is pretty standard shell and should work on any decent OS.
Prerequisites
First, let’s install everything needed.
Android SDK and NDK
The Android SDK is available here, while the NDK is available here. You should also set two environment variables (ANDROID_SDK and ANDROID_NDK) to their respective installation paths.
On Archlinux, using the android-sdk and android-ndk AUR packages:
Setting environment variables for Android SDK/NDK
export ANDROID_NDK=/opt/android-ndk/
export ANDROID_SDK=/opt/android-sdk/
FFmpeg sources
Download the FFmpeg sources here and extract them to $ANDROID_NDK/sources/ffmpeg-2.2.3. Building third-party libraries in $ANDROID_NDK/sources makes them easily available to other projects.
Building FFmpeg
Configuration
You can tweak the configuration if needed, but here’s the one I used:
FFmpeg configuration
SYSROOT=$ANDROID_NDK/platforms/android-9/arch-arm/
# You should adjust this path depending on your platform, e.g. darwin-x86_64 for Mac OS
TOOLCHAIN=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64
CPU=arm
PREFIX=$(pwd)/android/$CPU
# Set these if needed
ADDI_CFLAGS=""
ADDI_LDFLAGS=""
./configure \
    --prefix=$PREFIX \
    --disable-shared \
    --enable-static \
    --disable-doc \
    --disable-ffmpeg \
    --disable-ffplay \
    --disable-ffprobe \
    --disable-ffserver \
    --disable-symver \
    --enable-protocol=concat \
    --enable-protocol=file \
    --enable-muxer=mp4 \
    --enable-demuxer=mpegts \
    --enable-memalign-hack \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --target-os=linux \
    --arch=arm \
    --enable-cross-compile \
    --sysroot=$SYSROOT \
    --extra-cflags="-Os -fpic -marm $ADDI_CFLAGS" \
    --extra-ldflags="$ADDI_LDFLAGS"
Compilation
The scariest step is in fact the simplest:
FFmpeg compilation
make clean
# Adapt the jobs count to your machine
make -j3
make install
Expose FFmpeg to Android NDK
To be able to use FFmpeg as a regular NDK module, we need an Android.mk. It should be placed in $ANDROID_NDK/sources/ffmpeg-2.2.3/android/arm.
Android.mk
LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE:= libavdevice
LOCAL_SRC_FILES:= lib/libavdevice.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libavcodec
LOCAL_SRC_FILES:= lib/libavcodec.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libavformat
LOCAL_SRC_FILES:= lib/libavformat.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libswscale
LOCAL_SRC_FILES:= lib/libswscale.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libavutil
LOCAL_SRC_FILES:= lib/libavutil.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libavfilter
LOCAL_SRC_FILES:= lib/libavfilter.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libswresample
LOCAL_SRC_FILES:= lib/libswresample.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_STATIC_LIBRARY)
That’s it! FFmpeg is ready to use!
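An application can then pull these prebuilt modules into its own jni/Android.mk via import-module, which searches $ANDROID_NDK/sources by default. A minimal sketch — the module name and source file are illustrative, not from the sample project:

```makefile
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := videokit        # your app's native module (hypothetical name)
LOCAL_SRC_FILES := videokit.c
# Link against the prebuilt FFmpeg modules declared above; order matters
# (libavformat depends on libavcodec, which depends on libavutil).
LOCAL_STATIC_LIBRARIES := libavformat libavcodec libswscale libavutil
LOCAL_LDLIBS := -llog -lz
include $(BUILD_SHARED_LIBRARY)

# Make the FFmpeg prebuilts from $ANDROID_NDK/sources visible to this project
$(call import-module,ffmpeg-2.2.3/android/arm)
```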
Using FFmpeg
To use FFmpeg, I’ll stick to halfninja’s idea: adapt FFmpeg’s main() into a simple function, and write a JNI interface around it. A sample project is available on GitHub.
Adapting FFmpeg’s main()
I used some FFmpeg’s executable source files (ffmpeg.c
, containing main()
,
and directly related ones), and tweaked them: removed every exit()
call and
replaced av_log()
calls to use Android’s LogCat. As FFmpeg’s executable is
meant to be run once, then exited, I also needed to reinitialize some static
variables between every main()
calls.
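The exit() removal can be sketched with setjmp/longjmp: instead of killing the process, a replacement function records the status and jumps back to the entry point. This is only an illustrative sketch — ffmpeg_exit() and run_ffmpeg() are hypothetical names, not FFmpeg’s actual code:

```c
#include <setjmp.h>
#include <stddef.h>

static jmp_buf exit_jump;
static int     exit_code;

/* Hypothetical replacement for every exit(n) call inside ffmpeg.c:
 * record the status and jump back instead of killing the process. */
static void ffmpeg_exit(int code) {
    exit_code = code;
    longjmp(exit_jump, 1);
}

/* Simplified stand-in for the adapted main(): when FFmpeg "exits",
 * control comes back here and the status is returned to the caller. */
int run_ffmpeg(int argc, char **argv) {
    (void) argc;
    (void) argv;
    if (setjmp(exit_jump)) {
        return exit_code;   /* we came back through ffmpeg_exit() */
    }
    /* ... FFmpeg's original main() body would run here ... */
    ffmpeg_exit(0);         /* simulate FFmpeg calling exit(0) */
    return -1;              /* never reached */
}
```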
Update from March 27th 2016: for an up-to-date sample, see this GitHub repository. Thanks Hiko!
JNI interface
The JNI interface is really simple: a thin C wrapper calling FFmpeg’s main(), and a Java wrapper around it.
Here’s the C function, excluding the usual JNI boilerplate (the complete file is available on GitHub):
JNI C wrapper
JNIEXPORT jboolean JNICALL Java_fr_enoent_videokit_Videokit_run(JNIEnv *env, jobject obj, jobjectArray args) {
    int i = 0;
    int argc = 0;
    char **argv = NULL;
    jstring *strr = NULL;

    if (args != NULL) {
        argc = (*env)->GetArrayLength(env, args);
        argv = (char **) malloc(sizeof(char *) * argc);
        strr = (jstring *) malloc(sizeof(jstring) * argc);
        for (i = 0; i < argc; ++i) {
            strr[i] = (jstring) (*env)->GetObjectArrayElement(env, args, i);
            argv[i] = (char *) (*env)->GetStringUTFChars(env, strr[i], 0);
            LOGI("Option: %s", argv[i]);
        }
    }

    LOGI("Running main");
    int result = main(argc, argv);
    LOGI("Main ended with status %d", result);

    for (i = 0; i < argc; ++i) {
        (*env)->ReleaseStringUTFChars(env, strr[i], argv[i]);
    }
    free(argv);
    free(strr);

    return result == 0;
}
The function simply takes the JNI arguments (jobject obj and jobjectArray args) and creates matching char* parameters, which are then passed to FFmpeg’s main(). It returns true if everything went fine (FFmpeg returned 0), false otherwise.
The Java part is even simpler. Once again, only the interesting part:
JNI Java wrapper
package fr.enoent.videokit;

public final class Videokit {
    // Truncated library loading, see complete file on GitHub

    /**
     * Call FFmpeg with specified arguments
     * @param args FFmpeg arguments
     * @return true if success, false otherwise
     */
    public boolean process(String[] args) {
        String[] params = new String[args.length + 1];
        params[0] = "ffmpeg";
        System.arraycopy(args, 0, params, 1, args.length);
        return run(params);
    }

    private native boolean run(String[] args);
}
The native run() method is pretty obvious: it simply calls the C function above. However, FFmpeg’s main() expects the executable name as its first parameter. Even though we don’t compile FFmpeg as an executable, I found it simpler to keep this parameter than to modify FFmpeg’s code to drop it. Hence the process() method, the only public interface for calling FFmpeg: it simply prepends ffmpeg to the arguments, then calls run().
Call FFmpeg from Java
Once the JNI wrapper is in place, calling FFmpeg from Java code is really straightforward. Here’s a sample call which trims the video at /sdcard/input.mp4 down to its first 15 seconds and writes the result to /sdcard/output.mp4:
Using FFmpeg
if (Videokit.getInstance().process(new String[] {
        "-y",        // Overwrite output files
        "-i",        // Input file
        "/sdcard/input.mp4",
        "-ss",       // Start position
        "0",
        "-t",        // Duration
        "15",
        "-vcodec",   // Video codec
        "copy",
        "-acodec",   // Audio codec
        "copy",
        "/sdcard/output.mp4" // Output file
})) {
    Log.d(TAG, "Trimming: success");
} else {
    Log.d(TAG, "Trimming: failure");
}
Conclusion
While FFmpeg is really useful on Android when dealing with audio and video files, getting it working the first time with an up-to-date version wasn’t as easy as one might think. However, once set up, it works great, with decent performance even on mid-range hardware.