To get started with TensorFlow Lite on Android, we recommend exploring the following example.
Android image classification example
Read TensorFlow Lite Android image classification for an explanation of the source code.
This example app uses image classification to continuously classify whatever it sees from the device's rear-facing camera. The application can run either on a device or in the emulator.
Inference is performed using the TensorFlow Lite Java API and the TensorFlow Lite Android Support Library. The demo app classifies frames in real time, displaying the most probable classifications. It allows the user to choose between a floating point or quantized model, select the thread count, and decide whether to run on CPU, GPU, or via NNAPI.
Note: Additional Android applications demonstrating TensorFlow Lite in a variety of use cases are available in Examples.
To build the example in Android Studio, follow the instructions in README.md.
To get started quickly writing your own Android code, we recommend using our Android image classification example as a starting point.
The following sections contain some useful information for working with TensorFlow Lite on Android.
The TensorFlow Lite Android Support Library makes it easier to integrate models into your application. It provides high-level APIs that help transform raw input data into the form required by the model, and interpret the model's output, reducing the amount of boilerplate code required.
It supports common data formats for inputs and outputs, including images and arrays. It also provides pre- and post-processing units that perform tasks such as image resizing and cropping.
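As a sketch of what this preprocessing looks like in practice, the snippet below resizes a camera frame with the Support Library's `ImageProcessor` and `TensorImage` classes (the 224x224 input size, `UINT8` data type, and `Bitmap` source are illustrative assumptions; check your model's actual requirements):

```java
import android.graphics.Bitmap;
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.support.image.ImageProcessor;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.support.image.ops.ResizeOp;

public class Preprocessing {
  // Resize a camera frame to the model's expected input size.
  // 224x224 and UINT8 are assumptions for illustration only.
  static TensorImage preprocess(Bitmap bitmap) {
    ImageProcessor imageProcessor =
        new ImageProcessor.Builder()
            .add(new ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
            .build();
    TensorImage tensorImage = new TensorImage(DataType.UINT8);
    tensorImage.load(bitmap);
    return imageProcessor.process(tensorImage);
  }
}
```

The resulting `TensorImage` can be fed directly to a TensorFlow Lite interpreter, which is what removes most of the manual buffer-handling boilerplate.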
To get started, follow the instructions in the TensorFlow Lite Android Support Library README.md.
To use TensorFlow Lite in your Android app, we recommend using the TensorFlow Lite AAR hosted at JCenter.
You can specify this in your build.gradle dependencies as follows:
dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
}
This AAR includes binaries for all of the Android ABIs. You can reduce the size of your application's binary by only including the ABIs you need to support.
We recommend most developers omit the x86, x86_64, and arm32 ABIs. This can be achieved with the following Gradle configuration, which specifically includes only armeabi-v7a and arm64-v8a, covering most modern Android devices.
android {
    defaultConfig {
        ndk {
            abiFilters 'armeabi-v7a', 'arm64-v8a'
        }
    }
}
To learn more about abiFilters, see NdkOptions in the Android Gradle documentation.
In some cases, you might wish to use a local build of TensorFlow Lite. For example, you may be building a custom binary that includes operations selected from TensorFlow, or you may wish to make local changes to TensorFlow Lite.
By clicking to accept, you hereby agree that all use of the Android Studio and Android Native Development Kit will be governed by the Android Software Development Kit License Agreement available at https://developer.android.com/studio/terms (such URL may be updated or changed by Google from time to time).
After acknowledging the terms of service, you can download the Dockerfile, tflite-android.Dockerfile, and build the Docker image:
docker build . -t tflite-builder -f tflite-android.Dockerfile
docker run -it -v $PWD:/tmp tflite-builder bash
If you use PowerShell on Windows, replace "$PWD" with "pwd".
If you would like to use a TensorFlow repository on the host, mount that host directory instead (-v hostDir:/tmp).
android update sdk --no-ui -a --filter tools,platform-tools,android-${ANDROID_API_LEVEL},build-tools-${ANDROID_BUILD_TOOLS_VERSION}
You can now proceed to the "Build and Install" section. After you are finished building the libraries, you can copy them to /tmp inside the container so that you can access them on the host.
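For example, once the Bazel AAR build described later in this guide has finished, copying its output into the mounted directory might look like this (the output path matches the AAR build target; adapt it if you build a different target):

```shell
# Inside the container: copy the built AAR into /tmp, which is mounted
# from the host working directory, so the host can see it.
cp bazel-bin/tensorflow/lite/java/tensorflow-lite.aar /tmp
```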
Bazel is the primary build system for TensorFlow. To build with it, you must have Bazel installed on your system, along with the Android NDK and SDK.
Run the ./configure script in the root TensorFlow checkout directory, and answer "Yes" when the script asks to interactively configure the ./WORKSPACE for Android builds. The script will attempt to configure settings using the following environment variables:
ANDROID_SDK_HOME
ANDROID_SDK_API_LEVEL
ANDROID_NDK_HOME
ANDROID_NDK_API_LEVEL
If these variables aren't set, they must be provided interactively in the script prompt. Successful configuration should yield entries similar to the following in the .tf_configure.bazelrc file in the root folder:
build --action_env ANDROID_NDK_HOME="/usr/local/android/android-ndk-r17c"
build --action_env ANDROID_NDK_API_LEVEL="21"
build --action_env ANDROID_BUILD_TOOLS_VERSION="28.0.3"
build --action_env ANDROID_SDK_API_LEVEL="23"
build --action_env ANDROID_SDK_HOME="/usr/local/android/android-sdk-linux"
Once Bazel is properly configured, you can build the TensorFlow Lite AAR from the root checkout directory as follows:
bazel build -c opt --fat_apk_cpu=x86,x86_64,arm64-v8a,armeabi-v7a \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  //tensorflow/lite/java:tensorflow-lite
This will generate an AAR file in bazel-bin/tensorflow/lite/java/. Note that this builds a "fat" AAR with several different architectures; if you don't need all of them, use the subset appropriate for your deployment environment. From there, there are several approaches you can take to use the .aar in your Android Studio project.
Move the tensorflow-lite.aar file into a directory called libs in your project. Modify your app's build.gradle file to reference the new directory and replace the existing TensorFlow Lite dependency with the new local library, e.g.:
allprojects {
    repositories {
        jcenter()
        flatDir {
            dirs 'libs'
        }
    }
}

dependencies {
    compile(name:'tensorflow-lite', ext:'aar')
}
Execute the following command from your root checkout directory:
mvn install:install-file \
  -Dfile=bazel-bin/tensorflow/lite/java/tensorflow-lite.aar \
  -DgroupId=org.tensorflow \
  -DartifactId=tensorflow-lite -Dversion=0.1.100 -Dpackaging=aar
In your app's build.gradle, ensure you have the mavenLocal() repository and replace the standard TensorFlow Lite dependency with the locally installed one:
allprojects {
    repositories {
        jcenter()
        mavenLocal()
    }
}

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.1.100'
}
Note that the 0.1.100 version here is purely for the sake of testing/development. With the local AAR installed, you can use the standard TensorFlow Lite Java inference APIs in your app code.
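For reference, a minimal inference call with the Java API's `Interpreter` class might look like the following sketch (the input/output shapes and float types are assumptions for illustration; adapt them to your model's signature):

```java
import java.io.File;
import org.tensorflow.lite.Interpreter;

public class Classifier {
  // Run one inference on a model with a single float input and output.
  // The shapes used here (1x224x224x3 in, 1x1001 out) are illustrative
  // of a typical image classification model, not a fixed requirement.
  static float[][] classify(File modelFile, float[][][][] input) {
    float[][] output = new float[1][1001];
    try (Interpreter interpreter = new Interpreter(modelFile)) {
      interpreter.run(input, output);
    }
    return output;
  }
}
```

The try-with-resources block ensures the interpreter's native resources are released after inference.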
If you want to use TFLite through C++ libraries, you can build the shared libraries:
32-bit armeabi-v7a:
bazel build -c opt --config=android_arm //tensorflow/lite:libtensorflowlite.so
64-bit arm64-v8a:
bazel build -c opt --config=android_arm64 //tensorflow/lite:libtensorflowlite.so
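Once you have libtensorflowlite.so, consuming it from C++ follows the usual TensorFlow Lite C++ pattern; the sketch below shows the core load/build/invoke sequence (the model path is a placeholder and error handling is minimal):

```cpp
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load the model ("model.tflite" is a placeholder path).
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return 1;

  // Build an interpreter using the built-in op resolver.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter) return 1;

  // Allocate tensors, fill inputs, and run inference.
  if (interpreter->AllocateTensors() != kTfLiteOk) return 1;
  // ... copy input data into interpreter->typed_input_tensor<float>(0) ...
  if (interpreter->Invoke() != kTfLiteOk) return 1;
  // ... read results from interpreter->typed_output_tensor<float>(0) ...
  return 0;
}
```

Compile this against the TensorFlow Lite headers in your checkout and link the shared library built for the matching ABI.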