
Control of total cost of ownership (TCO) can be realised by automatically moving data objects to appropriate media, as criticality or value changes, and ensuring that such data is easily located and used regardless of where it physically resides.

The core of this practice is a set of policies that govern how diverse data objects are handled from the time they are created until they are ready to be deleted. This also requires the ability to handle a wide range of data types, operating systems, storage devices and other infrastructure technology.
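To make this concrete, the sketch below (in Python, using made-up tier names and thresholds rather than any particular product's interface) shows how such a policy might map a data object to a storage tier as its age, usage and criticality change.

    # A minimal, illustrative lifecycle policy: hypothetical tiers and
    # thresholds, not any vendor's actual API.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class DataObject:
        name: str
        created: datetime
        last_accessed: datetime
        business_critical: bool

    def choose_tier(obj: DataObject, now: datetime) -> str:
        """Map a data object to a storage tier as its value changes over time."""
        age = now - obj.created
        idle = now - obj.last_accessed
        if obj.business_critical or idle < timedelta(days=30):
            return "online"        # fast, expensive primary storage
        if age < timedelta(days=365):
            return "near-line"     # cheaper disk, still readily accessible
        return "offline"           # tape or other archive media

    if __name__ == "__main__":
        now = datetime(2006, 1, 1)
        report = DataObject("q3-report.doc", now - timedelta(days=400),
                            now - timedelta(days=200), business_critical=False)
        print(choose_tier(report, now))   # -> "offline"

In practice the rules are far richer, but the principle is the same: the policy, not the user, decides where each object lives at each point in its life.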

Data lifecycles

Managing data according to lifecycles allows organisations to address a number of data management dilemmas, such as reducing costs by directing data growth towards more cost-effective, scalable storage, and addressing regulatory compliance and corporate governance requirements.

It also helps improve availability and performance by keeping seldom-accessed data off high-end resources, delivers appropriate levels of data protection and disaster recovery, reduces the recovery delays caused by non-critical data, and frees up space for users and applications.

Over the life of a data object, a number of factors will influence how it is handled. Government regulations, user and application service levels, available media and internal best practices all operate in different and sometimes conflicting ways.

Take the example of e-mail. The government may require e-mails to be kept in an immutable format for seven years, after which they may legally be erased. Users, of course, want to keep e-mail in their mailboxes forever; but IT cannot support infinite mailboxes and still meet budget and service-level requirements. IT may prefer to move anything older than 30 days off the server.
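One way to picture how these competing demands are reconciled is as a single retention rule. The sketch below is purely illustrative, with hypothetical thresholds taken from the example above: off the mail server after 30 days, an immutable archive copy for seven years, deletion allowed only after that.

    # Hypothetical e-mail retention rule; thresholds are from the example
    # above, not from any specific regulation or product.
    from datetime import datetime, timedelta

    MOVE_OFF_SERVER_AFTER = timedelta(days=30)
    RETENTION_PERIOD = timedelta(days=7 * 365)

    def email_action(received: datetime, now: datetime) -> str:
        age = now - received
        if age < MOVE_OFF_SERVER_AFTER:
            return "keep in mailbox"
        if age < RETENTION_PERIOD:
            return "move to immutable archive"
        return "eligible for deletion"

    if __name__ == "__main__":
        now = datetime(2006, 1, 1)
        for days in (10, 90, 7 * 365 + 1):
            print(days, email_action(now - timedelta(days=days), now))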

A more obvious factor impacting on data management is the available hardware. Many organisations have a wide range of online, near-line and offline devices. Understanding these resources is key to an ILM practice, making storage resource management (SRM) tools even more valuable. These tools help make sense of available resources and can help organisations understand what capacities they have at what level of protection, availability and performance.
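By way of illustration, the kind of answer an SRM tool provides can be boiled down to something like the sketch below, which aggregates free capacity by tier and protection level over an invented device inventory; real SRM reports cover far more attributes.

    # Toy SRM-style capacity summary over a made-up inventory.
    from collections import defaultdict

    inventory = [
        {"device": "array-01", "tier": "online",    "protection": "RAID-10", "free_gb": 120},
        {"device": "array-02", "tier": "near-line", "protection": "RAID-5",  "free_gb": 800},
        {"device": "lib-01",   "tier": "offline",   "protection": "tape",    "free_gb": 5000},
    ]

    free_by_tier = defaultdict(int)
    for dev in inventory:
        free_by_tier[(dev["tier"], dev["protection"])] += dev["free_gb"]

    for (tier, protection), free in sorted(free_by_tier.items()):
        print(f"{tier:<10} {protection:<8} {free:>6} GB free")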

Weighing up priorities

The results of the data and hardware analyses can then be reconciled with business priorities into policies that will govern how subsets of data will be treated over time. It is this analysis of influences and weighing of priorities that is often the most difficult and time-consuming part of an ILM practice.

The complex landscape of compliance means that the tools and practices used to address this issue must be flexible enough to handle a range of data and storage types, and open-ended enough to accommodate future rule changes or new technology. The ideal ILM practice will offer a set of features native to the solution, as well as support APIs or configurations that facilitate interoperability with other applications, platforms and devices.

Content is key

Detailed, documented processes facilitate compliance, but many of the current regulations were put in place to ensure that the content of the data itself is available as required. Simply turning over everything within a given date range could result in the discovery of unrequested, but nevertheless incriminating, data.

By incorporating content indexing into ILM practices, companies can provide for enterprise-wide searches that span a variety of data types. This also allows reviewers to focus on highly relevant data and not waste time reviewing files or messages with no connection to the issue. This helps ensure that searches are thorough, detailed and accurate and reduces the time lost and money spent on the discovery process.
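The sketch below gives a toy flavour of the idea: a simple inverted index built over a handful of mixed documents, queried by keyword. Real ILM suites rely on far more capable indexing and search engines; the file names and contents here are invented purely for illustration.

    # Toy content index for discovery: map each word to the documents
    # containing it, so reviewers search once across data types.
    from collections import defaultdict

    documents = {
        "mail/2004/offer.msg":   "revised offer terms for the acquisition",
        "files/minutes-q2.doc":  "board minutes discussing the acquisition timetable",
        "files/picnic-plan.txt": "summer picnic planning and catering",
    }

    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)

    print(sorted(index["acquisition"]))   # only the two relevant files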