Firms that embrace a strategy accounting for data gravity will reap business benefits that help them thrive
Organising our “stuff” has always been one of the most challenging tasks in a household. When we reach the limits of what is practical in our homes, we make tough decisions about what to keep. The same goes for businesses trying to manage data, only on a far bigger scale and with far more complexity.
Marie Kondo, who has become a household name in the art of organising, suggests setting clear goals; without them, tidying up loses meaning. The same should be applied to business data. For enterprises, access to a greater volume of data, stored where it is processed, is better for the business, its applications, and decision-making. But storing and managing data brings problems too, including higher costs and lower system performance.
When planning data management strategies, it is essential to address data gravity before it grows to unmanageable levels.
Since Dave McCrory first coined the term “data gravity” a decade ago, it’s become one of the biggest challenges in IT.
The concept of data gravity is that the larger a body of data grows, the more applications, services, and other data it attracts. Considering where an application lives is therefore important to the success of that workload: the closer the application sits to its data, the better the latency and performance.
But as more data accumulates in one place, it becomes difficult or even impossible to move the data and its applications elsewhere to meet the business’s workflow needs. Costs rise, workflows become less effective, and as data gravity grows, it puts increasing strain on IT infrastructure.
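To see why proximity matters, consider a back-of-the-envelope sketch in Python. The query counts and latency figures below are purely illustrative assumptions, but they show how a chatty workload amplifies every millisecond of distance between application and data:

```python
# Minimal sketch (hypothetical numbers): why proximity to data matters.
# A workload that issues many sequential queries pays the round-trip
# latency between application and data on every single call.

def workload_time_seconds(queries: int, round_trip_ms: float, compute_ms: float = 0.2) -> float:
    """Total wall-clock time for a job that runs `queries` sequential data
    calls, each paying one network round trip plus a little compute."""
    return queries * (round_trip_ms + compute_ms) / 1000.0

QUERIES = 50_000  # assumed number of data calls in one batch job

co_located = workload_time_seconds(QUERIES, round_trip_ms=1.0)    # app next to the data
cross_region = workload_time_seconds(QUERIES, round_trip_ms=60.0)  # app far from the data

print(f"Co-located run:   {co_located:,.0f} s (~{co_located / 60:.1f} min)")
print(f"Cross-region run: {cross_region:,.0f} s (~{cross_region / 60:.1f} min)")
```

Under these assumed figures, the same job that finishes in about a minute next to its data takes close to an hour when run far from it.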
Data centres, the meeting points for networks and clouds, are where data gravity takes hold: workloads are magnetised to these digital hotbeds, and customers find it difficult and costly to move information out. According to 451 Research, 34 per cent of enterprises say that cloud storage egress charges have affected their organisation’s use of cloud storage.
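A rough sketch of the arithmetic, using an assumed per-gigabyte egress rate rather than any provider’s actual price list, shows how the cost of moving data out scales with volume:

```python
# Minimal sketch (hypothetical rate): how egress charges grow with data volume.
# The per-GB price below is an assumption for illustration only.

EGRESS_RATE_PER_GB = 0.09  # assumed cloud egress price, USD per GB

def egress_cost_usd(dataset_tb: float, rate_per_gb: float = EGRESS_RATE_PER_GB) -> float:
    """Rough cost of moving a dataset of `dataset_tb` terabytes out of a cloud."""
    return dataset_tb * 1024 * rate_per_gb

for tb in (10, 100, 1000):
    print(f"Moving {tb:>5} TB out once: ~${egress_cost_usd(tb):,.0f}")
```

Even at a modest assumed rate, a petabyte-scale move runs well into five figures, which is why datasets tend to stay where they landed.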
According to experts, AI and the internet of things (IoT) risk creating data gravity if organisations fail to plan for data growth.
How can enterprises tackle the problem?
According to Forrester’s study, a complex environment requires a data gravity strategy. As organisations look to improve data-driven decision-making and make data more accessible across the enterprise, they must modernise, simplify, and automate their data management processes.
However, hybrid cloud environments and the deployment of analytic workloads in specialised environments, rather than on the operational platforms where the data originates, introduce layers of complexity and the need to continuously copy and move data.
Embracing a data gravity strategy could alleviate some of this unwanted complexity and cut down on performance, security, governance, and quality issues.
Some organisations are turning to hyperconverged systems, which combine storage, compute, and networking in a single appliance. Keeping processing and data close together cuts latency.
In the cloud, capacity scales more smoothly, and firms can match data storage more closely to data volumes and ensure cloud storage works with their current analytics applications.
Another option is to place data in a neutral environment with low-latency connectivity to interconnected data centres, bringing it to the nexus of all the clouds and preserving agility for the business’s most valued assets.
If data is an essential asset to businesses in every vertical, then managing ever-increasing data is crucial. Since data gravity cannot be eliminated and needs to be managed, it’s important for firms to ask the right questions while planning data management strategies – what kinds of data are you keeping at a location? What do you expect from it? How much data should be housed at that location? Could some of it be analysed closer to the edge?
Answering these questions and making these decisions in the planning phase will help alleviate data gravity and avoid poor performance.
Data gravity can cause costly data movements for firms using multiple clouds if application and storage architectures are poorly designed, so they need to identify which of the datasets they hold are prone to data gravity.
In data planning, resolving issues after implementation is costly. It not only takes significant time to fix but can also hurt the bottom line and the customer experience.
Furthermore, capacity planning and long-term lifecycle management are vital to planning for the effective collection, use, and storage of data. Otherwise, data will revert to a state of chaos.
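A simple projection makes the point; the starting volume and growth rate below are assumptions for illustration, but the compounding effect is what capacity planning has to anticipate:

```python
# Minimal sketch (assumed figures): projecting storage needs so data gravity
# is planned for rather than discovered after the fact.

def project_capacity_tb(current_tb: float, annual_growth: float, years: int) -> list[float]:
    """Project stored volume per year assuming a constant compound growth rate."""
    return [current_tb * (1 + annual_growth) ** year for year in range(years + 1)]

# Assumptions for illustration: 200 TB today, growing 40% per year.
for year, tb in enumerate(project_capacity_tb(200, 0.40, 5)):
    print(f"Year {year}: ~{tb:,.0f} TB")
```

At an assumed 40 per cent annual growth, today’s 200 TB becomes more than a petabyte within five years, with all the gravity that implies.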
According to Forrester, firms must understand that embracing a strategy that accounts for data gravity will solve technical issues and produce business benefits that will help them thrive.
Data gravity is growing in magnitude and reaching across organisations and industries. To tackle it, IT capacity planning practices are crucial to define goals and create avenues that help protect data processing speed and minimise cost.