March 18, 2013
Big Data’s Big Gap: The Need for Speed
By: Michael Biltz

Business leaders have bought into the concept that their data contains a treasure trove of powerful insights that can help their organizations make more money. They’re also getting used to the idea that “data” includes everything from information in corporate data centers to tweets, blogs, and GPS data from mobile phones.

But there’s another aspect of data that business leaders have yet to fully appreciate: data velocity. The concept itself is not new; “velocity” has been part of the “three Vs” construct (together with “variety” and “volume”) for talking about data since 2001, long before “big data” was popularized as a hot technology trend.

Until now, however, the notion of data velocity has largely been eclipsed by the many recent advances in emerging technologies that have unlocked significant increases in both available volume (petabyte upon petabyte) and variety (spanning everything from unstructured data such as pins on Pinterest to structured records of supply logistics and customers’ purchase histories).

Today, in an environment where instability and market turbulence have become the norm, it’s increasingly important to match the speed of an organization’s actions to its opportunities. If too much time elapses between acquiring the data, using it to generate actionable insights, and actually taking action, a business will lose out to more responsive competitors. More worrisome, if the organization hasn’t already begun using data-driven insight to detect and evaluate opportunities in the first place, it runs even greater risks of falling behind.

That’s why data velocity is one of the IT trends in our recently released Accenture Technology Vision 2013 report, which outlines our predictions on which technology innovations will have a significant impact on organizations—for both their IT departments and their businesses overall—in the next few years.

Companies such as Procter & Gamble are acutely aware of what’s at stake. The consumer goods giant is investing in virtual, “instant on” war rooms where professionals meet in person or over video around continuous streams of fresh and relevant data, inviting the right experts as soon as a problem surfaces. P&G’s objective, CIO Filippo Passerini told InformationWeek, is to give these decision-making forums access to data as soon as possible after it has been collected.

Putting data on skates
A surge of new technologies, including in-memory databases, real-time enhancements to big data technologies, and advanced visualization tools, is edging us closer to the promise of real-time computing and creating faster “time to insight.” But even with these rapid advances in technology, it remains crucial for IT groups to rely on non-real-time data where possible, blending fast and slow to solve problems cost-effectively. Applying this type of “hybrid insight” calls not only for changes in architecture but for changes in skills as well. Software-engineering leaders will need to seek out and reward developers who demonstrate a definite “speed mindset.”
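To make the “hybrid insight” idea concrete, here is a minimal Python sketch of one common pattern: a slow, periodically refreshed batch aggregate blended with a fast, in-memory stream of recent events. All names here (the class, its methods, the sales scenario) are illustrative assumptions, not a reference to any specific product or to P&G’s systems.

```python
# Hypothetical "hybrid insight" pattern: blend a precomputed batch total
# (the slow, cheap path) with recent in-memory events (the fast path),
# so decisions can use up-to-the-moment numbers without reprocessing
# everything in real time.
from collections import deque


class HybridSalesView:
    """Blends a nightly batch total with intraday events held in memory."""

    def __init__(self, batch_total: float) -> None:
        self.batch_total = batch_total  # refreshed by the slow path (e.g. nightly job)
        self.recent = deque()           # fast path: events since the last batch run

    def record_event(self, amount: float) -> None:
        # Low-latency ingest: just append in memory, no database round trip.
        self.recent.append(amount)

    def refresh_batch(self, batch_total: float) -> None:
        # The slow path has caught up, so the in-memory delta is discarded.
        self.batch_total = batch_total
        self.recent.clear()

    def current_total(self) -> float:
        # "Time to insight": the blended figure is available immediately.
        return self.batch_total + sum(self.recent)


view = HybridSalesView(batch_total=1000.0)
view.record_event(25.0)
view.record_event(17.5)
print(view.current_total())  # 1042.5
```

The design choice the sketch illustrates is the cost trade-off in the paragraph above: the batch path stays cheap and authoritative, while only the small, recent slice of data is kept on the expensive fast path.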

Going forward, better decision-making will no longer be about the size of your data—it will be about matching the speed of your insights to how fast your business processes can act on them. Companies that see competitive advantage in “time to insight” are investing not only in tools that can help them accelerate their data cycles, but also in the capabilities that reflect a “need for speed.”

For many organizations, increasing data velocity is no longer just an abstraction or an obscure objective for IT professionals; it is a business necessity that gives them a chance to open up a big lead on their competitors.

To learn more about data velocity and other 2013 technology trends and innovations, download the Accenture Technology Vision report.
