Explained: Federated Learning


In the not-too-distant future, our lives will be interwoven with brilliant threads of Artificial Intelligence. Researchers are distilling the largest, most powerful machine-learning models into lightweight software that can run on “the edge”.

With over five billion mobile device users worldwide, an enormous volume of data is generated that could be used to build intelligent applications. But privacy remains an issue, because AI requires data to learn patterns and make decisions. This is where Federated Learning (FL) comes into play.

First introduced by Google AI in a 2017 post, “Federated Learning: Collaborative Machine Learning without Centralized Training Data”, FL enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on the device, decoupling the ability to do machine learning from the need to store the data in the cloud.

FL allows users to collectively reap the benefits of shared models trained on rich data without the need to store it centrally. Unlike the traditional machine learning pipeline, where data is collected from different sources (e.g. mobile devices) and stored in a central location, FL avoids centralised data collection and centralised model training. Instead, separate models are trained on the mobile devices themselves, and their results are combined into a single model that resides on a server. The user’s data is leveraged to build machine and deep learning models while remaining private.
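To make the idea concrete, here is a minimal sketch of one round of federated averaging, the "train locally, then combine on the server" scheme described above. It uses plain NumPy and simulated client data; the function names, the linear model, and the hyperparameters are illustrative assumptions, not a specific framework's API.

```python
import numpy as np

def client_update(global_weights, local_X, local_y, lr=0.1, epochs=5):
    """Train a simple linear model on the device; raw data never leaves the client."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = local_X @ w
        grad = local_X.T @ (preds - local_y) / len(local_y)
        w -= lr * grad
    return w, len(local_y)  # only weights and a sample count are shared

def federated_average(client_results):
    """Server combines the client models, weighting each by its sample count."""
    total = sum(n for _, n in client_results)
    return sum(w * (n / total) for w, n in client_results)

# Simulate three clients, each holding its own private dataset.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # each loop is one federated round
    results = [client_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(results)

print(global_w)  # approaches true_w, yet no client ever shared its raw data
```

The key design point the sketch illustrates is that only model parameters (and a sample count for weighting) cross the network; the training examples stay on each device.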

“Federated learning is a solution to the problem of building models based on data from many sources (e.g. individual cell phone users) while always keeping it under the control of those sources,” explains Pedro Domingos, a Professor Emeritus of Computer Science and Engineering at the University of Washington.


FL brings a great deal to the table for machine learning in a hyper-connected world. Here are a few of its benefits:

  • It enables devices like mobile phones to collaboratively learn a shared prediction model while keeping the training data on the device instead of storing it in a data centre.
  • It moves model training to the edge, namely devices such as smartphones, tablets and IoT hardware, or even organisations like hospitals that must operate under strict privacy constraints.
  • It makes real-time prediction possible, since prediction happens on the device itself.
  • The prediction process works even when there is no internet connectivity.
  • It reduces the amount of hardware infrastructure required; the hardware available in mobile devices is more than enough to run FL models.

In healthcare, FL has been successful in brain imaging, where it is used to analyse magnetic resonance imaging scans of brain tumour patients and distinguish healthy brain tissue from cancerous regions. Last year, Intel announced that 30 institutions across nine countries would use the FL approach to train a consensus AI model on brain tumour data.


With traditional ML, FinTech businesses face several issues, including obtaining clearance and lawful consent, preserving the data, and the time and cost of collecting and transferring data across networks. By keeping data local, FL allows the use of edge devices and edge computing power: it can analyse credit scores, learn a user’s footprint to prevent fraudulent activity, and support KYC checks without transferring data to the cloud, paving the way to reduce risk.

In the insurance sector, FL could help a company identify its users’ patterns and prevent fraudulent or wrongful activity without violating data clauses: the algorithms train on the data while preserving the insured’s confidentiality.


Apple utilises FL to improve Siri’s voice recognition. In blockchain technology, FL updates the model while preserving the organisation’s privacy and data. FL also plays a vital role in cybersecurity: it keeps the data on the device and only shares the model’s updates across connected networks.
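To illustrate the “only shares the updates” point, here is a rough sketch of the kind of payload a device might transmit after local training: a weight delta and a sample count, never the underlying user records. The payload fields and values are hypothetical, shown only to contrast what leaves the device with what stays on it.

```python
import numpy as np

def make_update(global_weights, locally_trained_weights):
    """What leaves the device: a weight delta, not the user's records."""
    return locally_trained_weights - global_weights

# Hypothetical values after one round of on-device training.
global_w = np.array([0.5, -0.2])
local_w = np.array([0.62, -0.31])   # result of training on private local data
payload = {"delta": make_update(global_w, local_w).tolist(), "num_samples": 48}
print(payload)  # e.g. {'delta': [0.12..., -0.11...], 'num_samples': 48}
```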

Meanwhile, Google, which is attempting a rebrand with a suite of new privacy controls that give people more power over their data, said it was working on its “federated learning” model for determining how to target groups with specific interests rather than individuals.

With more privacy regulations now in effect to protect personal information, many organisations have begun utilising FL. They train their algorithms on various datasets without exchanging data; FL secures the data collected through different channels and keeps vital information local. Without a doubt, FL could be a game changer for various industries.