Deploying a TensorFlow model to Android

a guide written in blood 💉

Late 2018 update:

This post is obsolete. TensorFlow has come a long way since it was written and has made deployment to mobile much easier. TensorFlow Mobile is also about to be deprecated, according to Google. So before reading this post, make sure you check out TensorFlow Lite and see if it supports the model you want to deploy.
If it doesn’t and you do need TensorFlow Mobile, then instead of the complex compilation described in this post you can simply attach TensorFlow Mobile to your build.gradle in the following manner:
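For reference, a minimal build.gradle dependency along these lines is enough to pull in TensorFlow Mobile (the open-ended version specifier is only for illustration; pin a concrete version in a real project):

```groovy
dependencies {
    // TensorFlow Mobile AAR; replace '+' with a pinned version in production
    implementation 'org.tensorflow:tensorflow-android:+'
}
```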

However, this post still addresses some pitfalls you’re likely to stumble upon and will help you understand the process better. Also, if, like us at JoyTunes, you find yourself needing to tweak something in TensorFlow and compile the full-fledged TensorFlow Mobile yourself, then this post is definitely for you.

There are a lot of tutorials and materials out there about TensorFlow, but information about deploying a TensorFlow model into a mobile app (without a server-side component) is scarce.

At JoyTunes, as part of our constant work on improving the amazing MusicSense™ piano recognition engine, I wanted to create a POC for deploying a model created with TensorFlow in our research environment directly into a production Android app. Since I spent quite a lot of time figuring it out myself and ran into a lot of pitfalls, I decided to dump my experience into a post that others like me might find useful.

Existing guides

First, you will want to look at the Android example on GitHub. It’s the only official example by TensorFlow explaining how to run a model on Android, and it’s a good place to learn from.

Second, there are two very relevant posts by Amit Shekhar, which unfortunately I stumbled upon only a week after starting this onerous journey.

The first is another example of an Android app running a TF model (the same one from the official TensorFlow example), but with a step-by-step explanation of how it was built. It covers exactly what this post talks about, just with a much sunnier scenario than mine.

Android quickstart

To get started with TensorFlow Lite on Android, we recommend exploring the following example.

Read TensorFlow Lite Android image classification for an explanation of the source code.

This example app uses image classification to continuously classify whatever it sees from the device’s rear-facing camera. The application can run either on a device or in an emulator.

Inference is performed using the TensorFlow Lite Java API and the TensorFlow Lite Android Support Library. The demo app classifies frames in real time, displaying the most probable classifications. It lets the user choose between a floating-point or quantized model, select the thread count, and decide whether to run on the CPU, on the GPU, or via NNAPI.

Build in Android Studio

To build the example in Android Studio, follow the instructions in README.md.

Create your own Android app

To get started quickly writing your own Android code, we recommend using our Android image classification example as a starting point.

The following sections contain some useful information for working with TensorFlow Lite on Android.

Use Android Studio ML Model Binding

To import a TensorFlow Lite (TFLite) model:

Right-click on the module in which you would like to use the TFLite model, or click File > New > Other > TensorFlow Lite Model.

Select the location of your TFLite file. Note that the tooling configures the module’s dependencies on your behalf: ML Model Binding is set up and all required dependencies are automatically inserted into your Android module’s build.gradle file.

Optional: Select the second checkbox for importing TensorFlow GPU if you want to use GPU acceleration.

The following screen appears after the import succeeds. To start using the model, select Kotlin or Java, then copy and paste the code under the Sample Code section. You can get back to this screen by double-clicking the TFLite model under the ml directory in Android Studio.
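As a rough sketch of what the generated Kotlin sample looks like (the MyModel wrapper class is hypothetical: Android Studio generates one named after your .tflite file, and the exact output accessors depend on the model’s metadata):

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
// Hypothetical: Android Studio generates a wrapper class under the ml/ directory,
// named after your model file (e.g. my_model.tflite -> MyModel).
import com.example.app.ml.MyModel

fun classify(context: Context, bitmap: Bitmap) {
    val model = MyModel.newInstance(context)            // load the bundled model
    val image = TensorImage.fromBitmap(bitmap)          // wrap the input Bitmap
    val outputs = model.process(image)                  // run inference
    val categories = outputs.probabilityAsCategoryList  // labeled scores (requires model metadata)
    categories.forEach { println("${it.label}: ${it.score}") }
    model.close()                                       // release native resources
}
```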

Use the TensorFlow Lite Task Library

The TensorFlow Lite Task Library contains a set of powerful and easy-to-use task-specific libraries for app developers to create ML experiences with TFLite. It provides optimized out-of-the-box model interfaces for popular machine learning tasks, such as image classification and question answering. The model interfaces are specifically designed for each task to achieve the best performance and usability. The Task Library works cross-platform and is supported in Java, C++, and Swift (coming soon).

To use the Task Library in your Android app, we recommend using the AARs hosted at MavenCentral for the Task Vision and Task Text libraries, respectively.

You can specify this in your build.gradle dependencies as follows:
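For example (the version numbers below are illustrative; check MavenCentral for the latest releases):

```groovy
dependencies {
    // Task Vision library: image classification, object detection, segmentation, ...
    implementation 'org.tensorflow:tensorflow-lite-task-vision:0.4.4'
    // Task Text library: NLClassifier, BertQuestionAnswerer, ...
    implementation 'org.tensorflow:tensorflow-lite-task-text:0.4.4'
}
```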

To use nightly snapshots, make sure that you have added the Sonatype snapshot repository.
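If you do use the nightly builds, the snapshot repository goes into your Gradle repositories, roughly like so:

```groovy
allprojects {
    repositories {
        mavenCentral()
        // Sonatype OSS snapshot repository for nightly TensorFlow Lite builds
        maven {
            name 'ossrh-snapshots'
            url 'https://oss.sonatype.org/content/repositories/snapshots'
        }
    }
}
```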

See the introduction in the TensorFlow Lite Task Library overview for more details.

Use the TensorFlow Lite Android Support Library

The TensorFlow Lite Android Support Library makes it easier to integrate models into your application. It provides high-level APIs that help transform raw input data into the form required by the model, and interpret the model’s output, reducing the amount of boilerplate code required.

It supports common data formats for inputs and outputs, including images and arrays. It also provides pre- and post-processing units that perform tasks such as image resizing and cropping.
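As a small illustration, a pre-processing pipeline built from the Support Library’s image ops might look like this (the input size and normalization constants are assumptions; use whatever your model expects):

```kotlin
import android.graphics.Bitmap
import org.tensorflow.lite.support.common.ops.NormalizeOp
import org.tensorflow.lite.support.image.ImageProcessor
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.support.image.ops.ResizeOp

// Resize to 224x224 and normalize to roughly [-1, 1]; both depend on your model.
val imageProcessor = ImageProcessor.Builder()
    .add(ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
    .add(NormalizeOp(127.5f, 127.5f))
    .build()

fun preprocess(bitmap: Bitmap): TensorImage {
    val tensorImage = TensorImage.fromBitmap(bitmap)  // wrap the Bitmap
    return imageProcessor.process(tensorImage)        // apply resize + normalize
}
```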

To use the Support Library in your Android app, we recommend using the TensorFlow Lite Support Library AAR hosted at MavenCentral.

You can specify this in your build.gradle dependencies as follows:
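A minimal sketch (again, the version is only illustrative):

```groovy
dependencies {
    implementation 'org.tensorflow:tensorflow-lite-support:0.4.4'
}
```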

To use nightly snapshots, make sure that you have added the Sonatype snapshot repository (see the snippet shown earlier).

To get started, follow the instructions in the TensorFlow Lite Android Support Library.

Use the TensorFlow Lite AAR from MavenCentral

To use TensorFlow Lite in your Android app, we recommend using the TensorFlow Lite AAR hosted at MavenCentral.

You can specify this in your build.gradle dependencies as follows:
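Something along these lines (the versions are illustrative; use the latest stable release):

```groovy
dependencies {
    // Core TensorFlow Lite runtime (Java/Kotlin API)
    implementation 'org.tensorflow:tensorflow-lite:2.14.0'
    // Optional: GPU delegate support
    implementation 'org.tensorflow:tensorflow-lite-gpu:2.14.0'
}
```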

To use nightly snapshots, make sure that you have added the Sonatype snapshot repository (see the snippet shown earlier).

This AAR includes binaries for all of the Android ABIs. You can reduce the size of your application’s binary by only including the ABIs you need to support.

We recommend that most developers omit the x86, x86_64, and arm32 ABIs. This can be achieved with the Gradle configuration shown below, which includes only armeabi-v7a and arm64-v8a and should cover most modern Android devices.
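A sketch of that configuration:

```groovy
android {
    defaultConfig {
        ndk {
            // Ship only the ABIs most modern devices need
            abiFilters 'armeabi-v7a', 'arm64-v8a'
        }
    }
}
```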

To learn more about abiFilters , see NdkOptions in the Android Gradle documentation.

Build Android app using C++

There are two ways to use TFLite through C++ if you build your app with the NDK:

Use TFLite C API

This is the recommended approach. Download the TensorFlow Lite AAR hosted at MavenCentral, rename it to tensorflow-lite-*.zip, and unzip it. You must include the four header files from the headers/tensorflow/lite/ and headers/tensorflow/lite/c/ folders, and the relevant libtensorflowlite_jni.so dynamic library from the jni/ folder, in your NDK project.

The c_api.h header file contains basic documentation about using the TFLite C API.
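As a rough sketch of the C API flow (the model path and tensor sizes here are assumptions for illustration):

```c
#include <stdio.h>
#include "tensorflow/lite/c/c_api.h"

int main(void) {
  // Load the model from disk (path is illustrative).
  TfLiteModel* model = TfLiteModelCreateFromFile("/data/local/tmp/model.tflite");
  TfLiteInterpreterOptions* options = TfLiteInterpreterOptionsCreate();
  TfLiteInterpreterOptionsSetNumThreads(options, 2);

  TfLiteInterpreter* interpreter = TfLiteInterpreterCreate(model, options);
  TfLiteInterpreterAllocateTensors(interpreter);

  // Fill the input tensor (assumes a single float32 input of 4 values).
  float input[4] = {0};
  TfLiteTensor* input_tensor = TfLiteInterpreterGetInputTensor(interpreter, 0);
  TfLiteTensorCopyFromBuffer(input_tensor, input, sizeof(input));

  TfLiteInterpreterInvoke(interpreter);

  // Read the output tensor (assumes a single float32 output of 4 values).
  float output[4];
  const TfLiteTensor* output_tensor = TfLiteInterpreterGetOutputTensor(interpreter, 0);
  TfLiteTensorCopyToBuffer(output_tensor, output, sizeof(output));
  printf("first output value: %f\n", output[0]);

  // Clean up in reverse order of creation.
  TfLiteInterpreterDelete(interpreter);
  TfLiteInterpreterOptionsDelete(options);
  TfLiteModelDelete(model);
  return 0;
}
```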

Use TFLite C++ API

If you want to use TFLite through the C++ API, you can build the C++ shared libraries:
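The shared library is typically built with Bazel from a TensorFlow source checkout, roughly like this (the exact target name and config flags can vary between TensorFlow versions):

```sh
# Build the TensorFlow Lite C++ shared library for 64-bit ARM Android
bazel build -c opt --config=android_arm64 //tensorflow/lite:libtensorflowlite.so
```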

Currently, there is no straightforward way to extract all header files needed, so you must include all header files in tensorflow/lite/ from the TensorFlow repository. Additionally, you will need header files from FlatBuffers and Abseil.

Min SDK version of TFLite

Library                        minSdkVersion   Device requirements
tensorflow-lite                19              NNAPI usage requires API 27+
tensorflow-lite-gpu            19              GLES 3.1 or OpenCL (typically only available on API 21+)
tensorflow-lite-hexagon        19
tensorflow-lite-support        19
tensorflow-lite-task-vision    21              android.graphics.Color related APIs require API 26+
tensorflow-lite-task-text      21
tensorflow-lite-task-audio     23
tensorflow-lite-metadata       19

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
