TensorFlow lightens up to land on smartmobes, then embed everywhere

Thanks for coming, TensorFlow Mobile, TensorFlow Lite is what the cool kids will code with now

Google's released an Android/iOS version of TensorFlow.

The Chocolate Factory announced the developer preview of TensorFlow Lite in a blog post on Tuesday. The post said the release will initially target smartmobes, with later versions to support embedded devices.

Google first revealed its desire for machine learning everywhere at its I/O conference in May.

Pushing machine learning out to devices makes sense, since running inference locally cuts latency, and Google's not the only company to spot that. Qualcomm, for example, first announced its mobile-specific machine-learning silicon, Zeroth, in 2013.

Google explained that TensorFlow Lite's architecture assumes that the grunt work of model training will happen upstream, as shown in the graphic below.

TensorFlow Lite architecture

Google listed the tool's components thus:

  • TensorFlow Model: A trained TensorFlow model saved on disk.
  • TensorFlow Lite Converter: A program that converts the model to the TensorFlow Lite file format.
  • TensorFlow Lite Model File: A model file format based on FlatBuffers that is optimized for maximum speed and minimum size.
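
The pipeline Google describes can be sketched in a few lines of Python. This is a minimal sketch assuming the current `tf.lite.TFLiteConverter` API (the developer preview shipped the converter as a standalone tool), with a toy Keras model standing in for a real trained one:

```python
# Sketch of the conversion step: trained model in, FlatBuffers-based
# TensorFlow Lite model file out.
import tensorflow as tf

# A toy trained model stands in for a real TensorFlow model saved on disk.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# The converter emits the .tflite FlatBuffer as bytes.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```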

Out on the target smartphone, a C++ API (native on iOS; wrapped in a Java API on Android) loads the TensorFlow Lite model and calls the interpreter.
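
The load-then-invoke pattern those APIs follow can be illustrated with the equivalent Python binding, `tf.lite.Interpreter`; the conversion of a toy model is repeated inline here so the sketch is self-contained:

```python
# Sketch of the on-device inference step: load a TensorFlow Lite model,
# then call the interpreter on an input tensor.
import numpy as np
import tensorflow as tf

# Build and convert a toy model inline so the sketch stands alone.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the model and let the interpreter allocate its tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one input of the shape the model expects and run inference.
x = np.zeros(input_details[0]["shape"], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(output_details[0]["index"])
```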

A fully-loaded interpreter is 300 KB, including all machine learning operators (on its own, the interpreter is just 70 KB). Google notes that the current TensorFlow Mobile is 1.5 MB.

Androids can also offload processing to hardware accelerators if they're available, using the Android Neural Networks API.

Models available to TensorFlow Lite include the MobileNet and Inception v3 vision models, and the Smart Reply conversational model.

For now, TensorFlow Mobile stays on Google's books. Google's announcement stated that it viewed TensorFlow Mobile as the system to support production applications. However: “Going forward, TensorFlow Lite should be seen as the evolution of TensorFlow Mobile, and as it matures it will become the recommended solution for deploying models on mobile and embedded devices”.

Biting the hand that feeds IT © 1998–2017
