Chapter 10. Using Custom Models in Android

In Chapter 9, you looked at various scenarios for creating custom models using TensorFlow Lite Model Maker, Cloud AutoML Vision Edge, and TensorFlow with transfer learning. In this chapter, you'll explore how to use and integrate these models into your Android app. Unfortunately, it's rarely as simple as dropping a model into an app and having it just "work." There are often complications in handling data, because Android represents things like images and strings differently from TensorFlow, and the output of the model will need to be parsed from its tensor-based form into something more natural to work with in Android. We'll explore this first, then go into some examples of how to use image and language models in Android.
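To make that bridging concrete, here's a minimal sketch of feeding an Android Bitmap to a TensorFlow Lite model through the Interpreter API and reading back its raw output. It assumes a model file named model.tflite in the app's assets, a 224 × 224 RGB float input scaled to the range 0 to 1, and five output classes, roughly matching the flower classifier from Chapter 9; the function names loadModelFile and classify are purely illustrative.

import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Load the .tflite binary from the app's assets folder as a memory-mapped buffer.
fun loadModelFile(context: Context, fileName: String): MappedByteBuffer {
    val fd = context.assets.openFd(fileName)
    FileInputStream(fd.fileDescriptor).use { stream ->
        return stream.channel.map(
            FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength
        )
    }
}

// Convert a Bitmap into the float tensor layout the model expects
// (1 x 224 x 224 x 3, values scaled to 0..1), run inference, and
// return the raw output values.
fun classify(context: Context, bitmap: Bitmap): FloatArray {
    val inputSize = 224
    val scaled = Bitmap.createScaledBitmap(bitmap, inputSize, inputSize, true)

    val input = ByteBuffer.allocateDirect(4 * inputSize * inputSize * 3)
        .order(ByteOrder.nativeOrder())
    val pixels = IntArray(inputSize * inputSize)
    scaled.getPixels(pixels, 0, inputSize, 0, 0, inputSize, inputSize)
    for (pixel in pixels) {
        // Each pixel is packed ARGB; unpack into normalized R, G, B floats.
        input.putFloat((pixel shr 16 and 0xFF) / 255.0f)
        input.putFloat((pixel shr 8 and 0xFF) / 255.0f)
        input.putFloat((pixel and 0xFF) / 255.0f)
    }

    // The flower model from Chapter 9 has five output classes.
    val output = Array(1) { FloatArray(5) }
    val interpreter = Interpreter(loadModelFile(context, "model.tflite"))
    interpreter.run(input, output)
    interpreter.close()
    return output[0]
}

Nothing about this code knows it's looking at a flower: the work of turning a Bitmap into a buffer of floats, and the five anonymous numbers that come back, are exactly the bridging this chapter is about.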

Bridging Models to Android

When creating an app that uses a machine learning model, you'll have a binary blob with the extension .tflite that you incorporate into your app. This binary expects its inputs as tensors (or some emulation of them) and gives you its outputs as tensors. That'll be the first challenge. Additionally, the model is only really useful when there's associated metadata. For example, if you build a flower classifier, as in Chapter 9, the model will give you an output of five probabilities, with each probability corresponding to a particular flower type. However, the model doesn't output a flower type, such as rose. It simply gives you a set of numbers, so you need the associated metadata ...
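As a rough sketch of that last step, and assuming the five-element output array from the previous snippet plus a hand-maintained label list in the same order the model was trained on (the five flower names here are placeholders standing in for the model's real metadata):

// The label order must match the order of the model's output values;
// this hardcoded list is an assumption, not something the model provides.
val labels = listOf("daisy", "dandelion", "rose", "sunflower", "tulip")

// Pick the index with the highest probability and pair it with its label.
fun topResult(probabilities: FloatArray): Pair<String, Float> {
    val best = probabilities.indices.maxByOrNull { probabilities[it] } ?: 0
    return labels[best] to probabilities[best]
}

// Usage: val (label, confidence) = topResult(classify(context, bitmap))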
