Mainframes are known. They’re reliable. They’re the workhorses that have served as an IT foundation for decades. Those very qualities are likely why they remain at the heart of many enterprises: nearly 70 per cent of Fortune 500 companies still rely on mainframes, as do 92 per cent of the top 100 banks.
However, they are also holding businesses back through integration difficulties, rigidity, and a lack of real-time data availability. As a result, data stored in legacy databases such as VSAM, Adabas, and IDMS is overlooked or difficult to access. This data – often referred to as “dark data” – is valuable, even critical, but because it is so hard to reach, it just sits on the mainframe, unused.
As pressure mounts on traditional enterprises to compete with the likes of cloud-native startups, it sets in motion a broader shift towards digital capabilities and agility. Some enterprises may even look to adopt the startups’ offerings themselves as a result. Unfortunately, many organisations can’t seriously consider competing with or adopting new, modern data architectures while they’re still running legacy mainframe applications. Complete modernisation away from monolithic systems and architectures is the only way to prepare for the next wave of innovation – here’s why.
Mainframes can’t keep up with the amount of data today’s enterprises generate
The statistics get more mind-bending every year. In 2009, all the digital storage in the world could hold 487 exabytes of data. By 2025, we’ll produce nearly that amount every day — 463 exabytes.
That includes searches, reports, supply chain data, and more – all of which can hold the keys to cost savings, greater efficiency, and the process changes that yield competitive advantage. But legacy databases aren’t equipped to keep up with today’s business demand for data, let alone the volumes enterprises will generate in just a few short years. Continuing to rely on a mainframe limits your ability to distribute and process data in real time – and, with it, how much you can improve.
Data now moves too fast for mainframes
When mainframes were built, real-time processing wasn’t yet business-critical. Now every second of delay in data delivery creates risk, especially for highly sensitive processes in industries such as healthcare or financial services. Aircraft, self-driving cars, and other connected devices are further examples of high-stress, high-risk environments where data needs to be processed as it’s gathered.
Consistent real-time data processing isn’t a hallmark of the mainframe. And with the volume and complexity of the data being generated, much of a mainframe’s processing capacity is already spoken for.
How to extract the value of your data and evolve beyond the mainframe
It’s not just that mainframes aren’t equipped for the diversity and speed of today’s data demands; it’s that you’re also passing up the opportunity to take advantage of the data hidden in your systems.
There are ways to modernise your data architecture to make use of the data you already have. One insurance company, for example, had years of adjusters’ reports it hadn’t realised were available. The company built an algorithm to analyse those reports and correlate them to instances of fraud, ultimately realising $12 million in subrogation recoveries.
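The insurer’s actual algorithm isn’t described here, but the idea can be sketched simply: treat historical reports as labelled training data and score new reports for fraud risk. The example below is a minimal, hypothetical illustration using scikit-learn; the sample reports, labels, and model choice are all assumptions for illustration, not the company’s method.

```python
# A minimal, hypothetical sketch: flag adjusters' reports that resemble
# past fraud cases. Assumes a labelled history of report texts exists.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Illustrative training data: report text plus whether fraud was confirmed.
reports = [
    "claimant reported vehicle stolen two days after policy renewal",
    "minor rear-end collision, damage consistent with police report",
    "third total-loss claim by the same claimant in eighteen months",
    "hail damage to roof, photos match weather records for the date",
]
fraud_labels = [1, 0, 1, 0]  # 1 = later confirmed as fraud

# Turn free text into features, then fit a simple classifier.
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(reports)
model = LogisticRegression().fit(X, fraud_labels)

# Score a new report; high probabilities go to investigators for review.
new_report = ["vehicle stolen shortly after coverage was increased"]
score = model.predict_proba(vectorizer.transform(new_report))[0, 1]
print(f"fraud likelihood: {score:.2f}")
```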
Accessing the data doesn’t have to involve a complex or expensive migration, either. A UK government bureau synced its IDMS network databases to Oracle data warehouses to gain greater visibility and reporting. The bureau records and stores corporate reporting data, and the move not only made that data more accessible but also reduced costs and streamlined development.
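The mechanics of that kind of sync can be sketched in miniature: pull records changed since a watermark and upsert them into the reporting store. The example below is a simplified, hypothetical illustration of the pattern; it uses sqlite3 as a stand-in for both systems so it runs anywhere, whereas real IDMS-to-Oracle replication would rely on vendor connectivity or CDC tooling.

```python
# Hypothetical sketch of a watermark-based sync. sqlite3 stands in for
# both the legacy source and the warehouse so the example is self-contained.
import sqlite3

source = sqlite3.connect(":memory:")  # stand-in for the legacy store
target = sqlite3.connect(":memory:")  # stand-in for the warehouse

source.execute("CREATE TABLE filings (id INTEGER, body TEXT, updated_at TEXT)")
source.executemany(
    "INSERT INTO filings VALUES (?, ?, ?)",
    [(1, "annual return", "2024-01-05"), (2, "accounts", "2024-02-10")],
)
target.execute(
    "CREATE TABLE filings (id INTEGER PRIMARY KEY, body TEXT, updated_at TEXT)"
)

def sync(last_synced: str) -> str:
    """Pull rows changed since the last sync and upsert them downstream."""
    rows = source.execute(
        "SELECT id, body, updated_at FROM filings WHERE updated_at > ?",
        (last_synced,),
    ).fetchall()
    target.executemany("INSERT OR REPLACE INTO filings VALUES (?, ?, ?)", rows)
    target.commit()
    # Advance the watermark to the newest change seen this pass.
    return max((row[2] for row in rows), default=last_synced)

watermark = sync("1970-01-01")
print(target.execute("SELECT COUNT(*) FROM filings").fetchone()[0])  # 2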
A modern data architecture includes several key features. The first is a cloud data lake, which can better handle the volume and complexity of today’s data and scale as its volume and velocity increase. Next, it’s important to enable self-service for the employees who can extract value from the data, as the government bureau did – a move that saves data scientists countless hours of preparation and frees them for more valuable work like analysis. Finally, you need real-time data processing, which gives you the processing power your critical operations need – even during a spike in data traffic – while also eliminating everyday glitches.
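To make the real-time piece concrete, the toy sketch below processes each event the moment it arrives rather than in a nightly batch. The in-memory queue, event shape, and alert threshold are all assumptions for illustration; a production system would sit on a streaming platform, but the consumer loop takes the same shape.

```python
# Toy sketch of real-time processing: events are handled as they arrive.
# An in-memory queue stands in for a streaming platform.
import queue
import threading
import time

events: queue.Queue = queue.Queue()

def producer() -> None:
    # Simulate transactions arriving continuously.
    for amount in (120.0, 75.5, 9_999.0, 42.0):
        events.put({"amount": amount, "ts": time.time()})
        time.sleep(0.1)
    events.put(None)  # sentinel: end of stream

def consumer() -> None:
    # React to each event immediately: flag anomalies, keep a running total.
    total = 0.0
    while (event := events.get()) is not None:
        total += event["amount"]
        if event["amount"] > 1_000:
            print(f"ALERT: large transaction {event['amount']:.2f}")
    print(f"running total: {total:.2f}")

threading.Thread(target=producer).start()
consumer()
```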
If you’re still relying on a mainframe, you’re likely leaving valuable data and greater efficiencies on the table. Modernising your data architecture doesn’t have to be painful, and it can deliver real rewards for your organisation.
Written by: Tim Jones, Managing Director of Application Modernization at Advanced. Tim has more than 30 years of IT experience. At Advanced, he helps organisations maximise their investment in critical legacy applications through transformation to modern operating environments, ensuring they remain competitive and ready to take advantage of new and emerging technologies.