TensorFlow Lite targets the hardware accelerators for deep learning built into mobile devices, and "quantizes" neural-network parameter values to 8-bit integers (or similar) before execution. With this quantization, deep-learning inference is said to run about three times faster than with regular TensorFlow. TensorFlow Lite has also been adapted for the embedded Linux distributions used by the Raspberry Pi and similar boards. "Deep learning will become available even on small devices sold for 20 to 30 dollars, and its use will advance in drones and other products."
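The core idea behind 8-bit quantization can be sketched in a few lines. The following is a minimal illustration of affine (scale plus zero-point) quantization, the general style of scheme TensorFlow Lite applies to parameters; the function names are illustrative, not the actual TFLite API.

```python
def quantize(values, num_bits=8):
    """Map float values onto unsigned num_bits integers via a scale and zero point."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    lo, hi = min(lo, 0.0), max(hi, 0.0)   # the range must contain 0.0 exactly
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = round(qmin - lo / scale)
    q = [min(qmax, max(qmin, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-0.8, -0.1, 0.0, 0.4, 1.2]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
```

Each weight is stored in one byte instead of four, and the round-trip error is bounded by the scale, which is why inference quality usually degrades only slightly.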
With TensorFlow.js, users can deploy machine-learning models trained with the regular version of TensorFlow in a web browser and run inference there. The demo video, in which a smartphone camera is used to find real-world objects matching a given emoji, is fun to watch.
Google first announced TensorFlow in November 2015. In the more than two years since, not only the execution environment but also its capabilities as a development tool have been greatly enhanced, and it has steadily evolved into a "platform" for machine learning.
- TensorFlow.js: http://js.tensorflow.org/ You can try four different demos (^ ^)