Just as your laptop over time becomes clogged up with old files, copies, versions, temporary files, downloads, plug-ins, unread emails and other stuff you will likely never need or look at again, enterprises today bob along uncertainly on a sea of data detritus. And, just as with the PC, that situation leads to issues affecting performance, data governance and even productivity because data and files can’t easily be found or recovered.

In the case of the enterprise, what are the sources of this unhappy situation? We could start by thinking of backups, archives, file shares, content management systems, applications no longer in use, object stores, data volumes that have been kept on a just-in-case basis, cloud services old and new, personal content encouraged by ‘bring your own device’ schemes, systems management software and… well, so many more.

The scale of the problem is larger than you think.

The challenge is ubiquitous: companies in full control of their data are as common as sightings of the great auk. Even well-run data centres can still be repositories of lots of junk. Why? Partly because IT always has enough on its plate and rarely has time for a spring clean, and partly because CIOs are ‘too scared to scrub’. IT leaders often fear that deleting data might come back to bite them: a file might need to be located to answer a regulatory probe, might hold the key to something business-critical, or might prove valuable to a discovery tool that inspects log analytics and other digital exhaust fumes.

CIOs hoped that cloud would relieve them of this mess. But instead, companies now have multiple clouds, exacerbating the issue. So, here we are today, running expensive storage subsystems, incurring regulatory and data security risks, seeing performance lag, lacking integration with other systems, overseeing dispirited IT teams that have to chase their admin tail to get things done, and living with flagging service levels that disappoint the business.

Finding that defragmentation solution.

Shine a light on that ‘dark data’, however, and the picture begins to look a lot brighter: better insights become available, and a more orderly approach to data management helps protect against the risks of poor data governance.

Lower costs, less time squandered in ‘keeping the lights on’, a brisker customer and employee experience, increased brand trust, the ability to move faster and greater trust in cloud. These are all possible, and this is the aim of mass data defragmentation: to unravel the spaghetti strands that bind the data centre and allow for far greater consolidation, visibility and accountability. Getting to that state is not easy, of course: it requires tools to converge platforms; a file system that supports deduplication, indexing and search; very high levels of scalability; and intuitive reporting and visualisation.
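To make the deduplication requirement a little more concrete, here is a minimal sketch of content-based deduplication: files are split into fixed-size chunks, each chunk is fingerprinted with a hash, and only previously unseen fingerprints would need to be stored. The chunk size, file paths and fixed-size chunking are illustrative assumptions only; production platforms typically use variable-length chunking and far more sophisticated indexing.

```python
import hashlib
import os

CHUNK_SIZE = 4 * 1024 * 1024  # illustrative 4 MB fixed-size chunks

def dedup_scan(paths):
    """Count how many chunks across the given files duplicate ones already seen."""
    seen = set()              # fingerprints of unique chunks
    total = duplicates = 0
    for path in paths:
        with open(path, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                fingerprint = hashlib.sha256(chunk).hexdigest()
                total += 1
                if fingerprint in seen:
                    duplicates += 1   # this chunk would not need to be stored again
                else:
                    seen.add(fingerprint)
    return total, duplicates

if __name__ == "__main__":
    files = [os.path.join(d, n) for d, _, names in os.walk(".") for n in names]
    total, dupes = dedup_scan(files)
    if total:
        print(f"{dupes}/{total} chunks ({dupes / total:.0%}) are duplicates")
```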

But the rewards at the end of that journey are vast. Manage that process well and two major advantages follow.

First, the data centre no longer has to store and manage so much data, which saves money. How much data falls into the ‘dark’ category? According to various sources, anywhere from 50 to 90 percent of your current store serves no great use, so it makes sense to delete and destroy it safely. That will reduce spend on storage hardware, storage management software and information security, and give storage admins time back to focus where they are needed. Data centre operations are expensive, so any reduction in day-to-day activity is valuable.
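A back-of-envelope calculation shows why that range matters. The figures below are purely hypothetical assumptions, not benchmarks or vendor numbers; swap in your own capacity and per-terabyte cost to get a feel for the potential saving.

```python
# Back-of-envelope dark-data saving estimate (all figures are hypothetical).
stored_tb = 500                 # total capacity under management, in TB
dark_fraction = 0.60            # assumed share of dark data (sources cite 50-90%)
cost_per_tb_per_year = 150.0    # assumed all-in annual cost per TB: hardware, software, admin

dark_tb = stored_tb * dark_fraction
annual_saving = dark_tb * cost_per_tb_per_year
print(f"Removing {dark_tb:.0f} TB of dark data could free roughly "
      f"${annual_saving:,.0f} per year")
```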

Second, freed of all that valueless data, the remaining store becomes much easier to wrangle, interrogate and extract value from, so the exercise is a win-win.

All too often, companies wait for a crisis to strike before acting to rationalise data: a regulatory query or a legal change such as GDPR comes along and, all of a sudden, IT becomes an ‘action stations’ zone where budget and time are allocated to the big new challenge because it has finally appeared on the radar screens of C-suite executives. While this all-too-human sense of urgency is understandable to some degree, it’s a short-term perspective. Look at the positive disruption that new market entrants have created by taking a ‘data-driven’ approach to strategy and operations, and it becomes clear that becoming data-centric and making data more manageable and powerful should be on the agenda of every organisation.

There is more trouble in the offing, though: with the next EU regulations around ePrivacy approaching in 2023, there’s yet another catalyst in sight. Far better to bake smarter data management into everyday operations now and start translating your data detritus into data riches, rather than fall foul of the rules and spend the next decade playing catch-up.