Artificial Intelligence (AI) is becoming ubiquitous. We now live with an array of algorithms that run our devices and services, from smartphones and website chatbots to automated human resource management, virtual assistants, and self-driving cars.
The increasing use of AI technology has driven the creation of vast amounts of data and demand for computing power, which has an outsize environmental impact. Building powerful algorithms requires extensive energy and results in large amounts of carbon emissions: one study found that training a single AI model can emit as much as 284 tonnes of carbon dioxide. Deep learning's dependence on remote, cloud-scale computing also constrains the speed and privacy of AI applications.
Tiny AI is changing that, and marks a step towards green computing.
Now, researchers are shrinking existing deep-learning models without losing their capabilities. Tech giants, too, are developing AI chips that pack more computational power into tighter physical spaces and can train and run AI on far less energy. The distillation methods used to develop tiny AI – compressed algorithms – can shrink a model to a tenth of its existing size.
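At its core, distillation trains a small "student" model to mimic the softened output probabilities of a large "teacher" model. A minimal sketch of the loss that drives this, in plain Python (the logits and temperature below are illustrative assumptions, not any vendor's actual setup):

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to probabilities, softened by a temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between the teacher's softened output distribution
    and the student's: the signal the small student model is trained on."""
    p = softmax(teacher_logits, temperature)   # teacher's soft targets
    q = softmax(student_logits, temperature)   # student's predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A student that mimics the teacher closely incurs a near-zero loss;
# a student that disagrees incurs a larger one.
teacher = [3.2, 1.1, 0.3]
close_student = [3.0, 1.2, 0.4]
far_student = [0.1, 2.9, 1.5]
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

The temperature softens the teacher's distribution so the student also learns how the teacher ranks the wrong answers, which is much of what lets a far smaller model retain the larger one's capabilities.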
Take Google Assistant, for example. Its software was previously around 100 gigabytes in size. Two years ago, Google trimmed it down to roughly half a gigabyte, and it now runs on users' phones without sending requests to a remote server. Apple likewise runs Siri's speech recognition and the QuickType keyboard locally on the iPhone, while IBM and Amazon offer developer platforms for building and deploying tiny AI.
Currently, a regular Internet-connected device sends data to the cloud for processing and, after receiving instructions back, serves the appropriate response. Depending on the connection speed, the wait for that round trip introduces lag, and the transmission itself consumes a significant amount of energy and bandwidth. By bringing advanced computing closer to end devices, tiny AI enables ultra-low latency.
The reduced size lets programs be installed directly on the device itself, with no need to send data to the cloud or a remote server. So, beyond reducing the carbon footprint, tiny AI makes services like voice assistants, autocorrect, and digital cameras faster, and opens up new applications such as self-driving cars with quicker reaction times. Localised AI is also better for privacy, since users' data no longer needs to leave the device to improve a service or feature.
What does it mean for your business?
Sustainability is a growing concern for businesses, yet across industries AI and data remain critical to deriving insights in real time. Tiny AI lets enterprises make advanced computing and analytics greener while developing products and services that run faster and are more secure, and, in turn, boost revenues.
While some researchers are developing ways to make the algorithms themselves smaller, others are building compact hardware capable of running complex algorithms, and many more are finding ways to train deep-learning models on smaller datasets.
Tiny AI is already showing massive potential and is on track to become the largest segment of the edge AI and machine learning market by shipment volume: ABI Research forecasts total shipments of 1.2 billion devices with tiny AI chipsets in 2022. In addition, the proliferation of ultra-low-power ML applications means more brownfield devices will be equipped with ML models for on-device anomaly detection, condition monitoring, and predictive maintenance.