Here’s How Google Is Making Android Ready for AI
Alphabet’s Google is embracing AI to keep Android ahead of its Apple counterpart. The clearest indication came at Google I/O 2017, the company’s annual developer conference held in Mountain View, California. Among a vast array of announcements spanning Google’s services, one stood out for Android: TensorFlow Lite will let app developers tap into powerful neural network features in the Android OS. The statement clearly signals the tech giant’s intention to push AI (Artificial Intelligence) and machine learning into Android apps.
Let’s look at how Google is preparing Android for a future expected to be dominated by AI:
Google has already started investing heavily in ML (Machine Learning)-specific hardware, such as its Cloud Tensor Processing Unit (TPU) chips, in anticipation of an ‘AI First’ world. Such hardware accelerates both the training of new ML models and inference with existing ones. At the recent Google I/O 2017 conference, the company also announced TensorFlow Lite, a streamlined version of its machine learning framework that can run on smartphones and other mobile devices.
How the Lite version will pave the way for AI in Android
Dave Burke, VP of Engineering for Android, says the Lite version of TensorFlow will leverage a neural network API to tap into silicon-specific accelerators. In his words: “We think these new capabilities will help power the next generation of on-device speech processing, visual search, augmented reality, and more.” He described TensorFlow Lite as a library for Android apps, designed to be fast and small while still enabling state-of-the-art techniques.
Google plans to make the Lite framework part of its open-source TensorFlow project and to ship the neural network API with the next major Android update in the latter part of 2017. Last year, Google added Android and iOS support to TensorFlow; this year, the company has taken the next step with TensorFlow Lite.
How it works
Paired with AI-focused chips, the Lite version can make it possible for Android phones to handle advanced machine learning computations without using much power. As more and more Android apps use machine learning to offer intelligent experiences to users, this kind of work will become easier to do on the device itself.
Right now, it is difficult to integrate advanced machine learning into apps, especially training models, because the process demands substantial computational power and a lot of time. Robust hardware and strong programming skills are necessary to integrate ML into applications. For these reasons, machine learning has been practically impossible to include in consumer mobile applications; such apps typically offload the work to a massive data center. Processing data in the cloud has its own downsides: users must be willing to transfer their data, such as images and text, to a company’s servers, and they need a reliable connection to keep latency low.
The Lite version of TensorFlow is designed to address these difficulties by serving as a machine-learning-specific counterpart to a Digital Signal Processor (DSP). With it, apps can get smarter and do interesting things more quickly on their own, without depending on data centers. What’s more, such intelligent apps can refine image or speech recognition on the move, giving Android devices more effective computing power.
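A core technique behind running models efficiently on-device is quantization: storing weights as 8-bit integers instead of 32-bit floats, which shrinks models and lets mobile silicon use cheap integer arithmetic. The article does not include code, so the snippet below is an illustrative pure-Python sketch of linear quantization in general, not TensorFlow Lite’s actual implementation; the function names and sample weights are invented for the example.

```python
# Illustrative sketch of 8-bit linear quantization, the kind of
# technique frameworks such as TensorFlow Lite rely on to shrink
# models for mobile hardware. (Hypothetical example, not real
# TensorFlow Lite code.)

def quantize(weights, num_bits=8):
    """Map float weights onto integer codes in [0, 2**num_bits - 1]."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2 ** num_bits - 1)
    codes = [round((w - lo) / scale) for w in weights]
    return codes, scale, lo

def dequantize(codes, scale, lo):
    """Recover approximate float weights from the integer codes."""
    return [c * scale + lo for c in codes]

weights = [-0.51, 0.02, 0.33, 1.27, -1.10, 0.88]
codes, scale, lo = quantize(weights)
recovered = dequantize(codes, scale, lo)

# Each 32-bit float now fits in a single byte, and the round trip
# loses at most half a quantization step of precision per weight.
max_err = max(abs(w - r) for w, r in zip(weights, recovered))
print(codes)                 # integer codes in [0, 255]
print(max_err <= scale / 2)
```

The trade-off is a tiny, bounded loss of precision in exchange for a 4x smaller model and much cheaper arithmetic, which is exactly the balance on-device inference needs.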
We can certainly expect more chips aimed at smart devices that accelerate machine learning training and inference.
In today’s mobile-driven world, Google stands to benefit from integrating AI systems into Android apps, as Apple has yet to bring comparable AI-friendly tools to iOS applications. It is now only a matter of time before Google radically changes the look and feel of Android apps.