Google Tensor Changed AI for the Better: Here's How


An AI chip is a processor optimized to run machine learning tasks via ML frameworks such as Google’s TensorFlow and Facebook’s PyTorch.

Google designed its own AI chip, called Tensor, for its new Pixel 6 smartphone. The chip combines a CPU, a GPU, an ISP (image signal processor), a TPU (Tensor Processing Unit), and a dedicated security core. According to a Google executive, Tensor is well over four times faster than the Qualcomm Snapdragon 765G used in the Pixel 5. Google wanted to bring Edge AI to its users better than before, and that required a more powerful AI chip. The company wanted to offer enhanced computational photography, such as removing photobombers and sharpening blurry faces, to help users handle day-to-day tasks with ease, and to deliver better live translation and transcription than before. That was not possible with the older Qualcomm chips.
AI development faces ethical and technical challenges that can fundamentally stall its progress. One ethical challenge is that, under various privacy regulations, we cannot send all user data to the cloud where AI models run; we must find ways to run those models at the user level, which is what Edge AI means. One technical challenge is the shortage of special-purpose hardware for artificial intelligence, which slows the development and adoption of AI products. CPUs are not designed for the specialized computation AI requires, so they are inefficient for large AI models. GPUs are far more efficient than CPUs, but they are still not optimal.
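The privacy argument for Edge AI can be made concrete with a small sketch. This is my own illustration, not Google's actual pipeline: the model (here, a hypothetical tiny scoring function with made-up weights) runs entirely on the device, so the raw input never leaves it.

```python
# Illustrative sketch (not Google's pipeline): Edge AI means inference
# runs where the data lives, so raw data never reaches the cloud.
import math

# Hypothetical tiny model. In practice the weights would be shipped to
# the device once, e.g. as a quantized on-device model file.
WEIGHTS = [0.8, -1.2, 0.5]
BIAS = 0.1

def on_device_inference(features):
    """Score the input locally; the raw features stay on the device."""
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid score in (0, 1)

# Only the result (if anything) would ever be uploaded, not the input.
score = on_device_inference([0.2, 0.4, 0.9])
```

The point of the sketch is architectural, not mathematical: the same computation could run in the cloud, but running it on-chip is what removes the privacy concern.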
In this article, I want to share why I think Tensor is an amazing milestone in AI development to address the above challenges. Hope you find it interesting.

Tensor and TensorFlow: A powerful combo

The Google Brain team developed an advanced AI framework named TensorFlow years ago. Google then designed its own processor, the Tensor Processing Unit (TPU), to run TensorFlow workloads more efficiently. The TPU was a revolution in AI that significantly sped up the training of huge machine learning models with millions (or billions) of parameters. Nevertheless, that technology could not be used in low-power Edge AI devices such as smartphones. Google's entrance into the AI chip market for low-power devices could be the next revolution in this industry. Many companies, such as FogHorn and BlinkAI, are working on Edge AI using the AI chips currently on the market; however, the efficiency Google can achieve by combining TensorFlow and Tensor could be game-changing. Welcome to the club, Google!
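To see why special-purpose hardware like the TPU matters, it helps to look at the operation it is built around. A TPU's speedup comes largely from dedicated matrix-multiply hardware, because the core of most neural-network layers is simply a matrix multiply. A minimal sketch of that operation:

```python
# The operation TPUs are built to accelerate: the matrix multiply at the
# heart of neural-network layers. A TPU performs this in dedicated
# hardware instead of looping instruction-by-instruction on a CPU.

def matmul(a, b):
    """Naive matrix multiply: a is m x k, b is k x n."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

# A dense layer's forward pass is essentially matmul(inputs, weights)
# plus a bias; models repeat this millions of times per inference.
out = matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
```

The software version above does O(m·k·n) scalar multiply-adds one at a time; a TPU's matrix unit performs many of them per clock cycle, which is where the order-of-magnitude efficiency gap comes from.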

Tensor is an AI chip designed by AI!

Isn’t that cool? The story starts with an article published in Nature titled “A graph placement methodology for fast chip design”. A crucial step in designing a processor is “floorplanning”, in which the engineering team must place a large number of components so that a series of physical requirements, including power consumption and performance, are satisfied. I won’t go further into the details, as I am not an expert in hardware engineering. However, when you have a vast space of choices to make under a set of constraints, AI can kick in. You may remember how the AlphaGo project defeated a professional human Go player; Google’s team approached chip floorplanning as exactly the same kind of game. Tensor is a real outcome of this project and a new milestone in the AI industry. Kudos, Google!
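To give a feel for the shape of the problem, here is a toy sketch of floorplanning as search over placements under a cost function. To be clear, this is my own illustration with made-up blocks and connectivity, and it uses simple random local search; Google's Nature paper used reinforcement learning, which scales to the thousands of components a real chip has.

```python
# Toy floorplanning sketch (my own illustration, not Google's RL method):
# place blocks on a grid to minimize total wiring distance.
import random

random.seed(0)

GRID = 4  # hypothetical 4x4 placement grid
BLOCKS = ["cpu", "gpu", "tpu", "isp"]
# Assumed connectivity between blocks, purely for illustration.
WIRES = [("cpu", "gpu"), ("cpu", "tpu"), ("tpu", "isp")]

def wirelength(p):
    """Cost: total Manhattan distance between connected blocks."""
    return sum(abs(p[a][0] - p[b][0]) + abs(p[a][1] - p[b][1])
               for a, b in WIRES)

# Start from a random legal placement (one block per grid cell).
cells = random.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                      len(BLOCKS))
placement = dict(zip(BLOCKS, cells))
cost = wirelength(placement)

# Local search: swap two blocks, keep the swap only if cost improves.
for _ in range(2000):
    a, b = random.sample(BLOCKS, 2)
    placement[a], placement[b] = placement[b], placement[a]
    new_cost = wirelength(placement)
    if new_cost <= cost:
        cost = new_cost
    else:  # revert a worsening swap
        placement[a], placement[b] = placement[b], placement[a]
```

With four blocks this is trivial, but real chips have thousands of components and many competing constraints (timing, power, congestion), which is why a learned policy, rather than brute search, was the breakthrough.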

Tensor helps us build ethical AI

This statement is a double-edged sword. Ethical AI has many aspects, from data privacy to AI for all. Tensor gives many users the opportunity to try the latest AI advances without worrying about their privacy. Why? Because the AI engine runs on the chip, and no data is sent to the cloud for further computation. On the other hand, the more tightly Google binds AI software and hardware, the harder it becomes for other companies to compete. I don’t want to see a day when other companies cannot even compete on AI inference, i.e., on using AI. We have almost lost the game of model training to the giant tech companies; it would be a nightmare to lose the game of AI inference to them as well. That is why I believe “Tensor helps us build ethical AI” is a double-edged sword.