The growing tide of unstructured data—known as big data—is placing new demands on traditional IT infrastructures.
In this article, Accenture explores the issues involved and proposes some approaches to help companies capitalize on the hidden value in this torrent of information.
The ability to take advantage of big data has clear and significant benefits. But big data also poses challenges to the data center in the form of volume, variety and the need for speed. Accenture’s point of view offers IT teams a spectrum of big data solution patterns, ranging from clusters built on commodity components to pre-engineered, enterprise-class solutions. We also review Oracle’s Big Data Appliance, which is designed to provide the high levels of performance, availability and security required for enterprise systems. Oracle’s solution includes a complete set of big data software, such as Hadoop, NoSQL and connectors to the Oracle database that ease integration with the overall analytics ecosystem.
For years, companies have been contending with a rapidly rising tide of data, which now includes information from a variety of sources such as social media, sensors, machines and individual employees. Companies are rapidly exploring technologies for analyzing this kind of data to gain competitive advantage.
For many, unstructured data represents a powerful untapped resource, one that has the potential to provide deeper insights into customers and operations and ultimately help drive competitive advantage. But this data cannot easily be managed with traditional relational databases and business intelligence tools.
All of this has led to the development of new distributed computing paradigms known collectively as big data, and analytics technologies such as Hadoop, NoSQL and others that handle unstructured data in its native state. These technologies give companies ways to increase the efficiency and flexibility of their underlying IT infrastructures while taming total cost of ownership.
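To make "handling unstructured data in its native state" concrete, the map/reduce model popularized by Hadoop can be sketched in a few lines of plain Python: free-text records are mapped to key/value pairs and then reduced to aggregates, with no relational schema imposed up front. This is an illustrative toy, not Hadoop itself; the log lines and field positions are invented for the example.

```python
from collections import defaultdict

# Toy illustration of the map/reduce pattern popularized by Hadoop:
# unstructured log lines are processed in their native text form,
# with no relational schema imposed up front.
log_lines = [
    "2012-12-18 login user=alice",
    "2012-12-18 click user=bob page=/home",
    "2012-12-18 login user=bob",
]

def map_phase(line):
    # Emit an (event_type, 1) pair for each raw record.
    yield (line.split()[1], 1)

def reduce_phase(pairs):
    # Sum the counts for each key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

pairs = [pair for line in log_lines for pair in map_phase(line)]
print(reduce_phase(pairs))  # {'login': 2, 'click': 1}
```

In a real Hadoop deployment the map and reduce phases run in parallel across many commodity nodes, which is what makes the scale-out economics described here possible.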
In reality, the advent of big data is bringing new, unprecedented workloads to the data center. Handling those workloads will require a distinct infrastructure, and IT will need to find ways to manage both the old and the new simultaneously—and ultimately bring the two together.
The benefits of being able to take advantage of big data are clear and significant. But so too are the challenges that it poses for the data center. Big data will require its own, more cost-effective approach to infrastructure—and in many cases, that approach will represent a shift from past practices. In fact, Accenture believes that big data will often require a more decentralized model than most companies use today.
This decentralized approach has several advantages. For example, it provides cost-effective flexibility with the ability to scale out quickly to include thousands of relatively inexpensive servers.
Accenture foresees a rebalancing of the database landscape as data architects embrace the fact that relational databases are no longer the only tool in the toolkit. “Hybrid solution architectures” will mix old and new database forms, while advances pioneered in the new infrastructure will be applied to invigorate the older infrastructure. In short, tomorrow’s conversations about data architectures will center on rebalancing, coexistence and cross-pollination between the two infrastructures.
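One minimal sketch of such a hybrid architecture, assuming raw events arrive as schemaless JSON and only reporting aggregates land in the relational store: the "new" side processes the feed as-is, while the "old" side persists a summary in SQL. All event names and table names here are hypothetical, and SQLite stands in for any relational database.

```python
import json
import sqlite3
from collections import Counter

# Hybrid sketch: raw events stay schemaless (JSON documents), while
# only the aggregates needed for reporting land in a relational table.
raw_events = [
    '{"user": "alice", "action": "login"}',
    '{"user": "bob", "action": "login", "device": "mobile"}',
    '{"user": "alice", "action": "purchase", "amount": 19.99}',
]

# "New" infrastructure side: parse and aggregate the feed in its native form.
counts = Counter(json.loads(e)["action"] for e in raw_events)

# "Old" infrastructure side: persist the summary relationally for BI tools.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE action_summary (action TEXT PRIMARY KEY, n INTEGER)")
db.executemany("INSERT INTO action_summary VALUES (?, ?)", counts.items())

for row in db.execute("SELECT action, n FROM action_summary ORDER BY action"):
    print(row)  # ('login', 2) then ('purchase', 1)
```

The design choice this illustrates is the "rebalancing" described above: neither side replaces the other; each stores the data it is best suited to.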
Infrastructure professionals will have to address a number of new challenges. They include:
Managing large-scale big data platforms.
Accommodating new demands on network infrastructure.
Finding space in the data center for large numbers of commodity servers.
Provisioning multi-petabyte storage.
Protecting this valuable data.
Developing and implementing IT governance for big data.
Integrating the big data platform with the rest of the IT infrastructure.
IT groups will need to take a comprehensive, multidisciplinary approach to create big data platforms.
IT infrastructure teams should work with other IT professionals who can provide perspectives on analytics, risk and compliance, business applications and IT governance. These varied perspectives can help ensure that data center services are reengineered for the volume, velocity and complexity of big data, and that there is a path to bring the big data and traditional architectures together—with an ongoing focus on the economics involved. A “data-centric” design is more important than it has ever been.
It is also important to recognize that there is no single “one size fits all” approach to the big data platform. Each company’s situation will be different, which makes careful upfront planning critical. Infrastructure teams need to fully understand the impact that big data will have on the data center. In some cases, clusters built on commodity components will be the most cost-effective option.
In other cases, packaged, engineered systems might be appropriate, particularly when time-to-implement is critical. These solutions may involve more upfront hard costs than clusters of commodity servers. But because they bundle hardware and software, they can be put in place much more quickly, avoiding the complexities (and additional costs) of implementing Hadoop and connecting it to the underlying hardware, which can be significant.
For example, the Oracle Big Data Appliance can streamline such integration for companies that are already using Oracle databases and business intelligence tools, enabling them to handle both structured and unstructured data with a single vendor.
As is the case in all IT projects, big data projects must be aligned with business strategy, looking beyond cost to support business agility and growth.
December 18, 2012