BANKS’ BIG DATA PROBLEM
Why ‘big data’ is smaller than ‘copy data’, and what it means for banks and financial institutions
By Ash Ashutosh, CEO of Actifio
Finance involves figures, and lots of them. While hardly alone in this respect, the banking and finance industries need to retain and access data on a scale largely unmatched in the business world. The rewards for spotting trends in the patterns of these countless data points are huge, as are the negative consequences of unmet regulatory requirements. “Big data,” a term which has gained currency and gripped the imagination of people in many industries, is very big indeed in the world of Finance. But is it where IT executives should begin?
To start with, the term itself is not always well-defined. ‘Big data’ has been used to describe the analysis of large volumes of various types of data. Big data is also a trend covering multiple new approaches and technologies for storing, processing and analysing data. Such analysis can be useful for businesses looking to understand what people are buying, when, where and how. Technology historian George Dyson put it more bluntly: “Big data is what happened when the cost of keeping information became less than the cost of throwing it away.”
For all the fuss about big data, recent research conducted by 451 Research amongst storage professionals shows that big data accounts for only 3% of the total data storage footprint. If only 3% of data stored is ‘big’, a reasonable person might ask what makes up the rest.
The short answer is copies. It turns out that the real problem – the problem much bigger than “Big Data” – is data proliferation.
We all see this in our home lives. When you take a photo with your phone, you create a 1MB file. But save it to your computer, edit it, post it on Facebook, tweet it, email it to a friend, replicate it to your tablet and back it up, and your 1MB photo might be occupying 10MB of storage on servers spread across your premises and the cloud.
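The arithmetic behind that example is easy to sketch. Below is a minimal back-of-the-envelope tally in Python; every location and copy count is an assumption chosen to illustrate the roughly 10-to-1 multiplication described above, not measured data.

```python
# Back-of-the-envelope tally of the photo example. All locations and
# counts below are illustrative assumptions, not measurements.
PHOTO_MB = 1

copies = {
    "phone original": 1,
    "computer import": 1,
    "edited version": 1,
    "Facebook upload": 1,
    "Twitter upload": 1,
    "email (sent folder + friend's inbox)": 2,
    "tablet replica": 1,
    "backups": 2,
}

total_copies = sum(copies.values())
print(f"{total_copies} copies x {PHOTO_MB} MB = {total_copies * PHOTO_MB} MB")
# -> 10 copies x 1 MB = 10 MB
```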
It’s the same in your business. At work you create new data every time you send or receive an email. Software engineers can make tens or hundreds of database copies to accelerate new application development. A single email shouldn’t gobble up much storage space, but copies of large datasets quickly amass into petabytes inside the modern enterprise. IDC estimates that 60% of what is stored in data centres is actually copy data – multiple copies of the same thing, or outdated versions. The vast majority of stored data is extra copies of production data, created by disparate data protection and management tools for backup, disaster recovery, development and testing, and analytics. According to IDC, global businesses will spend $46 billion to store extra copies of their data in 2014. This ‘copy data’ glut costs businesses money, as they store and protect useless copies of an original.
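The same arithmetic scales brutally in the enterprise. Here is a hypothetical tally for a single production database; the copy counts are assumptions for the sake of illustration (real retention policies vary widely), and the only sourced figure here is IDC’s 60% estimate above.

```python
# Hypothetical footprint of one production dataset and its copies.
# Copy counts are illustrative assumptions, not IDC figures.
PRODUCTION_TB = 5

copy_sources = {
    "backup (weekly fulls, 4-week retention)": 4,
    "disaster recovery replica": 1,
    "dev/test clones": 3,
    "analytics extract": 1,
}

copy_tb = PRODUCTION_TB * sum(copy_sources.values())
total_tb = PRODUCTION_TB + copy_tb
print(f"{copy_tb} TB of copies vs {PRODUCTION_TB} TB of production data: "
      f"copies are {copy_tb / total_tb:.0%} of the footprint")
# -> 45 TB of copies vs 5 TB of production data: copies are 90% of the footprint
```

Even these modest assumptions put copies well above IDC’s 60% average, and every extra clone a developer spins up pushes the ratio higher.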
While many IT providers are focussed on how to deal with the mountains of data that are produced by this intentional and unintentional copying, far fewer are addressing the root cause of copy data. In the same way that prevention is better than cure, reducing this weed-like data proliferation should be a priority for businesses. Actifio’s recent successful $100m+ funding round is testament to some of the sharpest minds in finance recognising this priority.
Like most CIOs, banking and finance IT heads tend to share key strategic priorities: improving resiliency, increasing agility, and moving toward the cloud to make their systems more distributed and scalable. Often they are held back by legacy software and hardware. Copy data virtualisation – freeing organisations’ data from their legacy physical infrastructure just as virtualisation did for servers a decade ago – is likely to be the way forward. If business divisions work from a single physical ‘golden’ copy that can spawn innumerable virtual copies, then exact duplicates of the same data won’t take up server space.
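To see why a golden copy saves space, consider a toy copy-on-write model. This is a hypothetical sketch of the general technique, not a description of Actifio’s product internals: each virtual clone reads shared blocks from the golden copy and stores only the blocks it changes.

```python
# Toy model of copy data virtualisation: one physical "golden" copy,
# many virtual clones that store only what they change (copy-on-write).
# Illustrative sketch only; not any vendor's actual implementation.

class GoldenCopy:
    """The single physical copy, stored once."""
    def __init__(self, blocks):
        self.blocks = blocks                 # block_id -> bytes

class VirtualClone:
    """A writable view over the golden copy."""
    def __init__(self, golden):
        self.golden = golden
        self.deltas = {}                     # only changed blocks live here

    def read(self, block_id):
        return self.deltas.get(block_id, self.golden.blocks[block_id])

    def write(self, block_id, data):
        self.deltas[block_id] = data         # the golden copy is never touched

golden = GoldenCopy({i: b"\x00" * 4096 for i in range(1000)})
clones = [VirtualClone(golden) for _ in range(10)]   # ten "copies", nearly free
clones[0].write(42, b"\xff" * 4096)

# Ten full duplicates would cost 10,000 extra blocks; these clones cost 1.
print(sum(len(c.deltas) for c in clones))            # -> 1
```

Because unchanged blocks are never duplicated, spinning up a development or test copy becomes an operation on metadata rather than on petabytes of disk.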
So how can one quantify the advantages that introducing an effective data management system will bring – what change will you notice? Well, a good example of the benefits of copy data management can be seen at one of our customers – Admiral’s Bank.
“Having Actifio, it’s just an amazing freedom and flexibility to do a lot more with our systems that we could never do in the past,” said Byron Bua, IT vice president at Admiral’s Bank. “To be able to manage all that data, to be able to back it up, to recover that much data, has been really important to us. Our recovery times before the Actifio process were 24-48 hours at a minimum. Putting Actifio in place, we were able to bring those recovery times down to seconds, in some cases. That’s unheard of.”
Integrated copy data management also offers the reductions in complexity and cost that come from collapsing infrastructure. “We’ve been able to get rid of three different pieces of backup software: Veeam, vRanger, and ARCserve tape backup,” said Mr. Bua. “The ROI was just unbelievable. It was a 50% reduction in costs, so we were saving about $750,000 in disk cost over five years.”
“It really handles all of your needs, your backup and recovery needs, your disaster recovery business continuity, and testing and development needs. It just really does enable us to be a much more competitive bank.”
The point is this… “Big Data” is big indeed, and learning to cope with it will be a priority for many years to come across many industries, perhaps none more so than Finance. But preparing for big data starts with getting a handle on “Copy Data,” so you’re not multiplying the big data problem even as you try to solve it.