
Google releases developer preview of TensorFlow Lite



Developers were pretty psyched by the announcement at Google I/O back in May that a new version of TensorFlow was being built from the ground up for mobile devices. Today, Google has released a developer preview of TensorFlow Lite.

The software library is aimed at providing a more lightweight machine learning solution for smartphones and embedded devices. The company is calling it an evolution of TensorFlow for mobile, and it’s available now for both Android and iOS app developers.

The focus here won’t be on training models but rather on bringing low-latency inference from machine learning models to less powerful devices. In layman’s terms, this means TensorFlow Lite will focus on applying already-trained models to new data rather than learning new capabilities from existing data, something most mobile devices simply don’t have the horsepower to handle.
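To make the inference-only point concrete, here is a minimal sketch of what running a converted model looks like with the TensorFlow Lite Python interpreter. The model path, input array, and helper function name are hypothetical placeholders; this assumes a TensorFlow build with TF Lite support and a `.tflite` model file already produced by the converter.

```python
import numpy as np
import tensorflow as tf  # assumes a TensorFlow build that ships the TF Lite interpreter

def run_inference(model_path: str, input_data: np.ndarray) -> np.ndarray:
    """Load a converted .tflite model and run a single forward pass.

    No training happens here: the interpreter only applies the
    already-trained model to the input it's given.
    """
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()  # must precede set_tensor/invoke
    input_index = interpreter.get_input_details()[0]["index"]
    output_index = interpreter.get_output_details()[0]["index"]
    interpreter.set_tensor(input_index, input_data)
    interpreter.invoke()  # synchronous, low-latency inference
    return interpreter.get_tensor(output_index)
```

A caller would pass something like `run_inference("mobilenet.tflite", image_batch)` and get back the model’s output tensor, e.g. class scores for an image classifier.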

Google detailed that the big priorities in designing TF Lite from scratch were keeping the product lightweight, initializing quickly, and improving model load times across a variety of mobile devices. TensorFlow Lite also supports the Android Neural Networks API.

This isn’t a full release, so there’s still much more to come as the library takes shape and features get added. Right now Google says TensorFlow Lite is tuned and ready for a few vision and natural language processing models, including MobileNet, Inception v3, and Smart Reply.

“With this developer preview, we have intentionally started with a constrained platform to ensure performance on some of the most important common models,” a post authored by the TensorFlow team read. “We plan to prioritize future functional expansion based on the needs of our users. The goals for our continued development are to simplify the developer experience, and enable model deployment for a range of mobile and embedded devices.”

Interested developers can dig into the TF Lite documentation and get to obsessing.


Anith Gopal