Accenture Technology Labs Blog
Bold thinking, commentary and application of new technologies that address many of the key business challenges facing organizations today.
Who does LeBron James assist the most? Let’s pretend for a second that you know nothing about the NBA. Not that hard for some of us. How would you have answered this question? Try googling it - and I’m sure you’ll eventually get to the answer, but it’s difficult to find. Business leaders face this challenge in answering their questions every day. For example, when do my east and west coast offices collaborate well and how can I encourage more collaboration?
(The answer to the first question is Chris Bosh, by the way.)
At Accenture Tech Labs we set aside time for our researchers to explore new technologies via their own interests and, sometimes, those explorations grow into projects. Recently, one of our researchers wanted to showcase how fast you can turn data into insights with the right team of data scientists, developers, and data artists. But what data set to use? For that, we returned to our basketball question above and chose the 2012-2013 NBA regular season.
Today, we’re proud to announce that one of these bottom-up projects was implemented last month and we want to share it with you. You can explore the results yourself at http://hotshotcharts.com. We also have a video walkthrough.
The Basketball Data Insights web app is a data exploration tool built on open source technology that we like to refer to as Hotshot Charts. It allows you to easily explore whom your favorite NBA player assists and where he shoots from, with what accuracy.
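To give a sense of the kind of aggregation that sits behind a tool like this, here is a minimal sketch. The play-by-play records are hand-written and hypothetical (real NBA data would come from a stats feed), and `top_assist_target` is an invented helper name, not part of Hotshot Charts:

```python
from collections import Counter

# Hypothetical play-by-play assist records -- illustrative only,
# not actual 2012-2013 season data.
assists = [
    {"passer": "LeBron James", "scorer": "Chris Bosh"},
    {"passer": "LeBron James", "scorer": "Dwyane Wade"},
    {"passer": "LeBron James", "scorer": "Chris Bosh"},
    {"passer": "Mario Chalmers", "scorer": "LeBron James"},
]

def top_assist_target(plays, passer):
    """Return the (teammate, count) a given passer assists most often."""
    targets = Counter(p["scorer"] for p in plays if p["passer"] == passer)
    return targets.most_common(1)[0] if targets else None

print(top_assist_target(assists, "LeBron James"))  # ('Chris Bosh', 2)
```

A few lines of grouping and counting answer the question directly; the web app's job is to put that same question a click away for people who don't write code.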
Sports fans (fanatics?) aren’t the only ones who are looking to do more with data. Technology is moving analytical capability and data visualization closer and closer to business users who are asking their IT departments to help derive more and more value from their data. It is important for IT professionals to understand all of the options in their toolkit. Whether it’s custom development, via existing tools such as Tableau, QlikView, Spotfire, or emerging tools like Platfora, our Data Insights R&D team believes that businesses that enable conversations with data will have a competitive edge.
We invite you to have that competitive edge over your friends in debates about who will come away with the 2013 NBA Championship via our Hotshot Charts.
The question is no longer "How do we store big data?" but "What do we do with big data?" The simple truth is big data is useless to businesses unless they can effectively explore their data and draw actionable results. This is where data visualization plays its part. Successful visualizations optimize a person's visual perception for faster understanding of data and discovery of insight.
In exploratory data visualization, I see the model moving toward "conversations with data." This means a highly iterative exploration, where users can ask questions of their data and receive answers fast enough that it mimics a back-and-forth exchange. With advances in real-time solutions, we no longer have to wait a day, week, or month for data exploration. It can be done in seconds.
Below I provide a meta example of this conversation with data. Using Google Trends I was able to see the progression of "big data" into a trend over the last eight years.
The first point I want to emphasize is the speed with which I could see the information. My first question was "Just how big a trend is 'big data'?" In mere seconds I could see that between 2011 and 2013 the search frequency had increased by a factor of 10.
The next question was "How does 'big data' compare to other terms in the field such as 'hadoop'?". I was able to visualize this comparison in real time as my questions arose. In this case I was surprised to see that, historically, "hadoop" was actually searched more than "big data".
An important point: the data was presented using visualizations instead of a table of numbers. The comparison of the lines gives me a very fast sense of "hadoop" being the most searched, with "big data" catching up to "hadoop" right at the beginning of 2013. If I had to rely solely on numbers, it would have taken me much longer to discover those comparisons. The final graph extends the question "How does 'big data' compare to other terms in the field?" with the addition of "nosql" and "infographic". In my mind "nosql" goes hand in hand with "big data", so I was very surprised to see that "nosql" was searched so much less than "big data" and that it plateaued instead of following the upward trend of the other search terms.
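The same back-and-forth can be sketched in a few lines of code. The numbers below are synthetic, illustrative search-interest indices, not actual Google Trends values:

```python
# Synthetic search-interest indices (0-100 scale), invented to mimic
# the shape of the comparison described above.
years    = [2008, 2009, 2010, 2011, 2012, 2013]
big_data = [2, 3, 5, 8, 40, 80]
hadoop   = [10, 20, 30, 45, 60, 78]

# Question 1: "Just how big a trend is 'big data'?" -- growth 2011 to 2013
growth = big_data[years.index(2013)] / big_data[years.index(2011)]
print(f"'big data' grew {growth:.0f}x between 2011 and 2013")

# Question 2: "When does 'big data' catch 'hadoop'?" -- first year the
# gap closes
catch_up = next((y for y, b, h in zip(years, big_data, hadoop) if b >= h), None)
print("'big data' catches 'hadoop' in", catch_up)
```

Each follow-up question is one more line, answered in seconds; that is the iterative rhythm a "conversation with data" should have.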
Even this simple example can demonstrate that being able to visually explore data through conversations is a very powerful tool. Imagine how your business would change if you could have conversations with your data. What would you ask?
Sunny Webb, Digital Experiences
As the manager of a project that utilizes 3D printing, I receive a lot of questions about the prosumer 3D printer sitting in my office. Generally there are two types of inquiries: “Can you print me?” and, “Is this going to change the world?”
To address the first question: Hmmm, well technically, that’s not currently within project scope, but I aim to please, so here’s how you can do it.
And while the latter question stems from headline-friendly stories like individuals who want to print weapons or consider medical applications – my answer is yes, 3D printing will change the world we live in – but probably not within those extremes of the possibility spectrum.
Let me explain.
At one point along my college journey, I found myself enamored with the study of film. So enamored that I talked a professor into helping me outfit and operate a full production studio. I picked up several life-lessons:
- You can spend a lot of time doing something that doesn’t pay you,
- In life, it is generally wiser to be behind a camera than in front of it,
- Video equipment used to be expensive, thus limiting content contributors.
I say used to be, because thanks to Moore’s law, we can now purchase a video-supporting Digital SLR for a fraction of the cost of those high-quality digital video cameras that I purchased in college.
This is relevant to 3D printing because 3D printing will be to manufacturing what low-cost video hardware and editing software were to the film industry. To expand on that thought, consider how the definition of “content creator” within the film industry was initially limited to production staff. In the days of Orson Welles’s Citizen Kane, Welles was contractually tied to act as a content creator for the RKO production company. Fast-forward to today, and we have video recording hardware priced for the average consumer. Lowering this entry barrier gave non-contracted individuals the opportunity to act as content creators of new material (think Sundance Festival), whose work is then acquired by production companies for further distribution (think Napoleon Dynamite, Little Miss Sunshine, and Clerks).
This is an interesting shift, and it’s about to occur within manufacturing for the same reasons it did film: the price barrier to participate has been lowered. For those visual learners out there, I drew you a nice illustration:
The Shift of Content Creators
Traditionally, in manufacturing, new products are created by in-house R&D teams and then distributed to the general public by large manufacturing operations. While large manufacturing operations are and will remain a critical component of the consumer supply chain, lower-cost 3D printers will inject a fresh stream of “content creators” with innovative product concepts. Soon-to-be-gone are the days of relying on consumers to tell manufacturers what they want – instead, they will show manufacturers what they want. At that point, it will be up to the manufacturer to ingest the proposed product modifications, or to ignore them.
For example, suppose I suddenly realize my office chair could be improved by adding a small indentation in the armrest where I rest my elbow. Inspired by enhanced posture opportunities, I set forth to design the perfect office chair armrest. Now suppose that, after designing it, I have the means to print it with a 3D printer, and I show it to all of my co-workers, who erupt in armrest-jealousy. As an individual, it does not make sense for me to print these armrests at volume – rather, I bring my new concept to the chair manufacturer and suggest an enhancement option for their chair. This example is essentially Henry Chesbrough’s open innovation model applied to product design.
Of course, this line of thought raises questions of intellectual property rights and other debatable topics – however in this specific blog entry, I’m talking about the opportunity for the design shift to occur – not what happens when it does.
The key thought of this upcoming shift is not whether 3D printers can replace manufacturing (they can’t) – but how 3D printers create collaborative opportunities for new product concepts between the individual and the manufacturer. Manufacturers who embrace this upcoming consumer-driven design shift will surpass their competition because they will be proactively positioned to survive a trend we’ve witnessed in multiple industries.
Enterprises are always looking for ways to help employees communicate with each other more effectively. The reasoning is simple: better communication leads to faster and higher-quality work, which, in turn, drives increased productivity.
The rise in social networking has breathed new life into efforts to improve internal collaboration. Social technology has changed the way consumers interact, and enterprises naturally want to harness that proclivity toward better communication and collaboration within the enterprise. That’s why seamless collaboration is one of the emerging IT trends in our recently released Accenture Technology Vision 2013 report, which outlines our predictions on which technologies will have a significant impact on organizations—for both their IT departments and their businesses overall—in the next few years.
However, many enterprises are viewing social collaboration trends through the wrong lens. Consumers widely use Twitter, but deploying Twitter to employees won’t solve the communication challenges a company faces. Facebook’s e-mail and document-sharing features are not enough to make the wildly popular social network appropriate for the corporate world. At work, people are motivated to get their job done as quickly and effectively as possible. Using social tools as designed today, to follow coworkers en masse, often becomes more of a time sink than a time saver.
But social technologies can and do work for enterprises. A 2012 study by Nucleus Research found that adding social capabilities to CRM drives an average increase in sales staff productivity of 11.8 percent. The key to capturing the benefits of a highly collaborative, social workforce lies in integrating social technologies into the systems and processes that employees use every day.
For example, adding the ability to comment, instant message, or follow a product through its activity stream within an order fulfillment application promotes a free-flowing exchange of ideas otherwise absent within a distributed supply chain. It facilitates dialogue and education, enabling colleagues and business partners to easily share knowledge and learning.
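As a rough sketch of what "embedding" an activity stream in an order-fulfillment application means structurally, consider the toy model below. The class and field names (`Order`, `Activity`, `post`) are invented for illustration, not a real product API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative model: every order carries its own activity stream and a
# set of followers to notify, so collaboration lives inside the process.
@dataclass
class Activity:
    actor: str
    verb: str           # e.g. "commented", "shipped", "flagged"
    detail: str
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Order:
    order_id: str
    stream: list = field(default_factory=list)
    followers: set = field(default_factory=set)

    def post(self, actor, verb, detail):
        """Append an event to this order's stream; return who to notify."""
        self.stream.append(Activity(actor, verb, detail))
        return self.followers

order = Order("PO-1042")
order.followers.add("supply-planner@example.com")
notify = order.post("warehouse", "flagged", "carton damaged on arrival")
print(len(order.stream), sorted(notify))
```

The point of the design is that the comment or flag is attached to the order itself, in context, rather than living in a separate, general-purpose social feed.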
Embedding social collaboration into business processes
Some packaged-software developers are already adding these new capabilities to their applications. SAP, with its Jam tool, and Oracle, with its Social Relationship Management platform, now allow companies to connect collaboration tools into their ERP and CRM packages. In addition, Salesforce.com has integrated its Chatter collaboration tool into its PaaS and SaaS applications. Collaboration platform Jive allows companies to layer collaboration on top of specific tasks, such as software development. The ultimate result: embedded channels combining search, knowledge management, workflow, and collaboration, which deliver the prized ability to help users more easily and effectively do their jobs.
By tying the integrated collaboration experience to business processes, disparate channels evolve from separate applications into a single user experience. For example, startup Clearspire is using social technologies to reimagine how a law firm is built. It has created its own cloud-based platform that embeds social and collaboration efforts between lawyers and their clients into the processes. Matter diaries, budgets, and task applications are shared seamlessly with those who have access, allowing lawyers and clients to work on their cases collaboratively from any location. The collaboration technology isn’t layered onto the process; it has become the process.
Such integrated systems deliver a richer experience for individual users. But just making a tool available doesn’t mean employees will adopt it. If the usefulness of a new tool isn’t obvious, businesses will never see any ROI. Embedding collaboration requires a cultural shift within the enterprise to change the way it looks at both its workers and its business processes.
What seamless collaboration will look like
The new face of collaboration will show up first as social interactions are integrated into business processes. When employees are able to chat, share information, identify specialists, get recommendations, and find the right answers to their questions directly within the context of their work, they’ll quickly become smarter, more responsive, and more productive. It will be clear who’s participating and contributing, just as it is on social-media sites today, and it will be easy for employees to reach out for information.
But that’s just the start of what’s possible. As part of the broader movement to consolidate siloed IT capabilities into business processes, we expect to see deeper convergence of search and knowledge-management activities that complement collaboration. The underlying challenge is to create a user experience that helps employees get the information they need when they need it.
Enterprises that take advantage of converged collaboration have an opportunity to see significant productivity gains. By enabling employees to work smarter, they are more aware of important context for their decisions and actions. Workers will be more likely to identify problems sooner, reliably find the fixes they need, and share the solutions with the right people.
Further, as enterprises quantify their collaboration efforts, they will reveal a more complete picture of how their employees and their business processes actually work—and they’ll be able to make them even more efficient going forward.
To learn more about seamless collaboration and other 2013 IT trends, download the Accenture Technology Vision report.
Business leaders have bought into the concept that their data contains a treasure trove of powerful insights that can help their organizations make more money. They’re also getting used to the idea that “data” includes everything from information in corporate data centers to tweets, blogs, and GPS data from mobile phones.
But there’s another aspect of data that business leaders have yet to fully appreciate: data velocity. The concept itself is not new; “velocity” has been part of the “three Vs” construct (together with “variety” and “volume”) for talking about data since 2001, long before “big data” was popularized as a hot technology trend.
Until now, however, the notion of data velocity has been largely eclipsed by the many recent advances in emerging technologies that have unlocked significant increases in both available volume (zettabyte upon petabyte) and variety (spanning everything from unstructured pins on Pinterest to structured records of supply logistics and customers’ purchase histories).
Today, in an environment where instability and market turbulence have become the norm, it’s increasingly important to match the speed of an organization’s actions to its opportunities. If too much time elapses between acquiring the data, using it to generate actionable insights, and actually taking action, a business will lose out to more responsive competitors. More worrisome, if the organization hasn’t already begun using data-driven insight to detect and evaluate opportunities in the first place, it runs even greater risks of falling behind.
That’s why data velocity is one of the IT trends in our recently released Accenture Technology Vision 2013 report, which outlines our predictions on which technology innovations will have a significant impact on organizations—for both their IT departments and their businesses overall—in the next few years.
Companies such as Procter & Gamble are acutely aware of what’s at stake. The consumer goods giant is investing in virtual, “instant on” war rooms where professionals meet in person or over video around continuous streams of fresh and relevant data, inviting the right experts as soon as a problem surfaces. P&G’s objective, CIO Filippo Passerini told InformationWeek, is to give these decision-making forums access to data as soon as possible after it has been collected.
Putting data on skates
A surge of new technologies, including in-memory databases, real-time enhancements to big data technologies, and advanced visualization tools, is edging us closer to the promise of real-time computing and creating faster “time to insight.” But even with these rapid advances in technology, it remains crucial for IT groups to rely on non-real-time data where possible, blending fast and slow to solve problems cost-effectively. Applying this type of “hybrid insight” calls not only for changes in architecture but for changes in skills as well. Software-engineering leaders will need to seek out and reward developers who demonstrate a definite “speed mindset.”
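One common shape this "hybrid insight" blending takes is serving a query from a cheap, precomputed batch aggregate plus a small real-time delta, rather than recomputing everything on the fast path. A minimal sketch with invented numbers and names:

```python
# Slow path: a batch aggregate computed overnight (cheap to serve).
batch_sales_through_yesterday = {"widget": 10_000, "gadget": 4_200}

# Fast path: only today's events arrive in real time.
todays_events = [("widget", 12), ("gadget", 5), ("widget", 3)]

def hybrid_total(product):
    """Blend the precomputed aggregate with today's streaming delta."""
    slow = batch_sales_through_yesterday.get(product, 0)
    fast = sum(qty for name, qty in todays_events if name == product)
    return slow + fast

print(hybrid_total("widget"))  # 10015
```

The real-time system only ever touches the small, fresh slice of data, which is what keeps "time to insight" short without making every query expensive.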
Going forward, better decision-making will no longer be about the size of your data—it will be about matching the speed of your insights to how fast your business processes can act on them. Companies that see competitive advantage in “time to insight” are investing not only in tools that can help them accelerate their data cycles, but also in the capabilities that reflect a “need for speed.”
For many organizations, increasing data velocity is no longer just an abstraction or an obscure objective for IT professionals; it is a business necessity that gives them a chance to open up a big lead on their competitors.
To learn more about data velocity and other 2013 technology trends and innovations, download the Accenture Technology Vision report.
Ian P. Blitz
Business success has always been built on relationships. Just a few generations ago, consumers were often friends, or at least neighbors, of the local grocer or pharmacist. But that model changed, first with large-scale industrialization and then with the introduction of IT. Over the last few decades, consumers in general have been treated with greater indifference and far less personal attention.
But the pendulum is swinging back. Technology is finally at a point where buyers can be treated like individuals again. Consumers are more than faceless transactions, more than a cookie file or a demographic profile; they’re real people with real differences. Digital transactions are giving way to digital relationships.
The ability to build, maintain and scale digital customer relationships is one of the emerging IT trends in our recently released Accenture Technology Vision 2013 report, which outlines our predictions on which technology trends and innovations will have a significant impact on organizations—for both their IT departments and their businesses overall—in the next few years.
Digital relationships are at the top of the list of emerging IT trends for good reason. Technology innovation has given companies ways to communicate with consumers in a much more personal way through mobile, social media, and context-based services. Individually, these IT innovations represent new types of user experiences, even new sets of sales channels—but that’s not the real opportunity.
Taken in aggregate, digital relationships represent a key new approach to consumer engagement and loyalty, because they enable companies to manage relationships with consumers at scale. The goal is “mass personalization”—and no, that’s no longer an oxymoron. Mass personalization is about using what you know about a consumer from the communications channels he uses to better understand his behavior and needs. Think of it as providing resort service at a motel cost.
Digital consumers have ratcheted up their expectations about how businesses will communicate and respond. Companies that hope to remain competitive need to find new ways to use these technology innovations to make the consumer feel special as never before, to increase engagement and develop intimacy.
Mass personalization can deliver a variety of new benefits. Because shoppers can move—quickly and entirely digitally—from awareness to recommendation to purchase after interacting on blogs, Twitter, Yelp, and other social sources, it’s actually possible to compress the sales cycle. Offering on-the-spot promotions through digital channels also potentially increases impulse purchases. Providers that are better at controlling that experience will benefit by lowering the costs of sales and marketing and generating greater sales volumes.
Personalization also creates a virtuous loop. The more you personalize the experience, the richer the data you’re collecting becomes. Companies can boost the quality of data in much the same way that night-vision goggles amplify available light: shining a light on data and behaviors that were already present but previously undetected.
A new level of intimacy with consumers is now possible. But effectively scaling meaningful digital relationships represents a real change in the way companies need to approach their consumer strategies. This shift is being enabled by technology; however, implementing it will require a new, unified approach across IT and the business.
Now is the time to act. The customers are out there; it’s time for businesses to get to know them better than they ever have before.
To learn more about digital relationships and other 2013 tech trends, download the Accenture Technology Vision report.
Nathan Shetterley, R&D Manager – Accenture Technology Labs
Many around the world just celebrated the Lunar New Year, marking a time of renewal and for many a time to reset on what’s important. For us in the Accenture Technology Labs, it’s a time when we renew our annual Technology Vision, which outlines some predictions on which technologies will have a significant impact on organizations – for both their IT departments and their businesses overall – in the next few years. We do this annual report on the future of IT because technology has become pervasive, and is pushing the boundaries of what’s possible in every industry, every market and every business. In fact, we believe that Every Business is a Digital Business – technology innovations now represent trends in both business and technology.
Our premise for the Accenture Technology Vision 2013 is pretty simple: if you don’t know what’s going on, you can’t prepare for it, and you certainly can’t take advantage of it. Within Accenture, we use the Vision as an input to guide our technology R&D investments; externally, we use the Vision to help our clients not just identify and understand key emerging technologies, but also use them to make their business performance even better – and stand out from the competition.
Authored by Paul Daugherty, Accenture’s Chief Technology Officer; Mike Biltz, Director – Accenture Technology Vision; and Scott Kurth, Director – Accenture Technology Vision, this year’s Technology Vision focuses on the following IT trends and innovations:
- Digital Relationships at Scale
- Design for Analytics
- Data Velocity
- Seamless Collaboration
- Active Defense
- Beyond the Cloud
But the Technology Vision is just a starting point. Yes, it provides a lens for us to focus in on the technology landscape and shows us where to turn next, but it is only useful if we can translate the Vision into real solutions, addressing real problems in real industries. That’s why this year’s Vision presents 100- and 365-day plans for each technology trend so that organizations can take the insights and act upon them. And this is also where the Accenture Technology Labs comes in – as an innovation engine within Accenture, we work closely with start-ups and technology companies to craft inventive concepts that can help enterprises use emerging technologies to measurably improve business performance.
Stay tuned to this blog for more detail on each of the Accenture Technology Vision trends as well as a peek into the research and activities the Technology Labs conducts. Our experts will highlight not only the technologies disrupting the business world, but also how those disruptions present opportunities for companies ready to take advantage of them.
I can be paranoid about many things. Some of us worry whether the sky is falling or the world is going to end. Will it end on December 21st, 2012, as many believe the Ancient Mayans predicted? We all have fears in life, and if some recent news is any indication (here is a reference to some disturbing news that should get you thinking), then I should probably be wearing a foil cap, lead-lined underwear, and a personal cooling system.

Whenever I am at a restaurant, I always moan and groan about those in front of me in the buffet line. One has to ponder many questions that validate the desire to consume said food. First, who touched the buffalo wings? Second and third, when did they touch them, and how did they do it without mixing utensils from other food trays? Fourth, why did they have to touch my beloved buffalo wings that I have been salivating over since joining the buffet line? Most importantly, why is there a weird green, partially fuzzy, semi-gelatinous blob on the same tray as the buffalo wings? (No thank you, I will pass on the buffalo hot wings today and instead take the greasy, cheesy goodness known as macaroni.) Pondering these basic questions makes my mind race with uncertainty; I begin to distrust what I see before me and to think of follow-on questions that eat away at me. Why did the disheveled person in front of me in line not wash their hands before exiting the restroom (sadly, a true story that many have had the horror of witnessing)? Maybe it is my imagination, but I find that many people are not very sanitary, and kids are worse with their perpetually sticky hands. You have to wonder where they have been, what they have been doing, and how they arrived at their current state of affairs.
In addition to who is around you, you also have to start asking questions about what happened to the food before you got to the buffet line, and what happened before the food itself arrived in its little heated stainless-steel tray. Did the staff play hockey puck with the fish patties? Is there MSG in any of this food? How many food violations has the kitchen had in total? And so on. Every time one question arises and closes, many more can pop up. The same types of worries and concerns permeate many aspects of the enterprise environment. During the process of data retrieval, many of the same questions arise. Tracking the origin of information is the domain of data lineage, and if we could only do this for the food items I like to eat, I would probably NEVER go out to eat again. What is this foreign concept of data lineage that I speak of, why is it so important, and what can we do about it?
Comparing enterprise data lineage to the local hole-in-the-wall buffet sounds harebrained, but the very same questions ring true. Just as one might ask about the greasy buffalo hot wings, so must one ask about data: what is its lineage, and what are the implications? Information and data flow fluidly from all points within a company; they pass through systems and subsystems, are consumed and exported by applications, may or may not be modified, and can become an aggregate of several other data points. Data may eventually be stored in one or more locations and can reside in databases, documents, spreadsheets, or even emails. Along the way, the origination point of the data, its lineage information (who, what, where, when, why, and how), becomes obscured, may contain gaps, or may be lost. Capturing the changes of data over time involves tracking its lineage as applications consume and interact with it. Data lineage is metadata that captures the history and provenance of data, which is critical to answering key business questions such as:
· Who created or modified the data?
· What operations were performed on the data?
· Were there any elements of uncertainty introduced or injected into the data?
· When were modifications or changes introduced to data?
· Which business processes and/or systems touched and processed the data?
· What were the previous values of the data throughout its life thus far?
· From which sources does the data originate?
· What is the reasoning for any modifications to the data?
The next question to ask is: why is data lineage important? The lineage of data can be imperative for businesses such as financial institutions that must abide by governmental regulations, such as Basel II and the Sarbanes-Oxley Act as enforced by the Federal Reserve and the U.S. Securities and Exchange Commission (SEC). Such regulations require knowledge of the life and timeline of data. In such instances, institutions must provide proof of the veracity and authenticity of information in a timely manner after a request (e.g., Section 409 of the Sarbanes-Oxley Act). Acquiring the entirety of the lineage for data can be time consuming and error prone due to the various interactions of people and data retrieval. The complications of tracking data lineage increase when information branches and various departments or people hold different versions of the data within an organization. Matters are complicated further when branched data merges again at later steps in its lifecycle. Utilizing data lineage in an enterprise can begin to tackle several of these business needs:
· Shorter business decision-making cycles
· More efficient and cost-effective compliance and audits
· Enhanced data loss prevention
o Especially in data aggregation situations
o Allowing for finer-grained access control
· In-depth data analytics
Decision-making occurs in all industries, but the effort varies between them. Pharmaceutical companies, for example, spend significant effort to determine the lineage of clinical trial data. This can prolong the decision making process of whether to advance or kill a drug.
Compliance and auditing are a necessity in financial services, and many financial-services companies spend significant effort locating and preparing information for audits, much of which is lineage related. Lineage also plays an important role in determining how reliable the exposure risks that banks report to regulators are, as required by compliance initiatives such as Basel III.
Without proper lineage information, combining data from multiple sources can result in information-release and data-loss problems. Once data has left an enterprise or entity, it can be very difficult to control its exposure and next to impossible to retract. Additionally, data loss carries its own risks.
The lack of proper data lineage information can also degrade the impact and value proposition of data analytics. For example, insights have less value if the sources of the data and information are not trusted or are unknown.
Now, imagine opening an Excel spreadsheet or Microsoft Word document and distressing over the numbers that appear before you. The document is a quarterly earnings statement, but something is amiss: the numbers do not appear to add up. Normally when this happens, you would go back and check your numbers. If tracking where the numbers come from is part of your job, you may end up talking to people, looking at logs, finding out who calculated the numbers, and determining from what piece or pieces of information they derive. This could take hours to days depending on the complexity. If you are in a hurry or have a meeting, it is next to impossible to get this information in a timely manner. A data lineage tracking tool that is integrated with common office tools puts the answer a mouse click away and can ease the anxiety. Using our data lineage tool, right-clicking on the questionable numbers and selecting “Get Data Lineage” produces a quick report of changes (the who, what, where, when, and why of data lineage) and a graph depicting the data flow and how the value came to be what it is. Armed with this information, you can feel confident that you will be able to answer any questions that arise. With this in mind, we decided to see whether it was possible to create a data lineage tool that could tackle some of the issues described above.
One of the challenges in building a data lineage solution is supporting both existing assets and new assets in an enterprise environment. Existing assets and their owners may be unwilling or unable to modify their systems to utilize a lineage system. To accommodate both new and existing assets, a minimally invasive, multi-modal architecture is required. To meet this need, we designed the system to operate in a mediation mode as well as a monitor mode.
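The two modes can be pictured with a short sketch. This is purely illustrative Python, not the tool's actual code: the class names, the `backend-db` label, and the pipe-delimited log format are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LineageEvent:
    who: str
    what: str
    where: str
    when: str
    why: str

@dataclass
class LineageStore:
    events: List[LineageEvent] = field(default_factory=list)

    def record(self, who, what, where, why):
        stamp = datetime.now(timezone.utc).isoformat()
        self.events.append(LineageEvent(who, what, where, stamp, why))

class MediationProxy:
    """Mediation mode: new assets write *through* the proxy, so lineage
    is captured inline with every operation."""
    def __init__(self, store, backend):
        self.store, self.backend = store, backend

    def write(self, user, key, value, reason):
        self.backend[key] = value
        self.store.record(user, f"write {key}={value}", "backend-db", reason)

class LogMonitor:
    """Monitor mode: existing assets stay untouched; their audit logs are
    parsed after the fact and replayed into the same lineage store."""
    def __init__(self, store):
        self.store = store

    def ingest(self, log_line):
        # assumed log format: "user|operation|system|reason"
        who, what, where, why = log_line.split("|")
        self.store.record(who, what, where, why)
```

The point of the split is that mediation creates the lineage record at write time, while monitoring recovers it later from logs, which is what lets unmodified legacy assets participate.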
The data lineage tool integrates with a myriad of external tools using web service calls. The real magic comes into play when the data lineage tool is integrated with Microsoft Office applications such as Excel and Word. The integration with the Microsoft Office suite also allows “copy & paste” operations to persist between applications. The lineage information stays embedded with a value when it is copied and pasted within the same document, as well as across different documents and spreadsheets, whether or not the host system has the data lineage add-in installed. In Excel, a user simply selects the desired asset, resource, and table combination and then retrieves the data lineage. Using the Microsoft Excel ribbon, the data lineage and data flow may also be discovered after highlighting an Excel cell of interest (see figure below).
A unique feature of the integration is the ability to monitor and detect anomalies. When a value is determined to differ from previous values by more than a certain degree, red highlighting draws attention to the detected information. This gives an individual a starting point for investigating problems that may exist in a system.
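As an illustration of this kind of check (the tool's actual detection criteria are not described here), a value might be flagged when it deviates from the average of the preceding values by more than a chosen threshold. The function name and parameters below are assumptions for the sketch:

```python
def flag_anomalies(values, threshold=0.5, history=3):
    """Flag the index of any value that deviates from the mean of the
    previous `history` values by more than `threshold` (a fraction).
    Flagged cells would be the ones highlighted in red."""
    flagged = []
    for i in range(history, len(values)):
        baseline = sum(values[i - history:i]) / history
        if baseline and abs(values[i] - baseline) / abs(baseline) > threshold:
            flagged.append(i)
    return flagged
```

For example, `flag_anomalies([100, 102, 98, 300, 101])` flags index 3, the value that breaks the pattern of its neighbors.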
The data lineage itself provides only limited information about a value at the table level; the data flow, however, enriches the experience by showing how the data arrived at its current value as it passed from one database to another through applications. The tool is able to track data flow not only within a database but also across databases, and it displays the information as a directed graph with a clear delineation between applications and resource stores (see figure below).
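Conceptually, such a data flow can be represented as a small directed graph whose nodes are tagged as applications or resource stores. The node names and structure below are invented for illustration, not taken from the tool:

```python
# Each node is an application ("app") or a resource store ("store"),
# mirroring the delineation drawn in the tool's graph view.
flow = {
    "crm_db":         {"type": "store", "feeds": ["etl_job"]},
    "etl_job":        {"type": "app",   "feeds": ["warehouse"]},
    "warehouse":      {"type": "store", "feeds": ["report_app"]},
    "report_app":     {"type": "app",   "feeds": ["quarterly.xlsx"]},
    "quarterly.xlsx": {"type": "store", "feeds": []},
}

def trace(flow, start):
    """Walk the directed graph from a source node, collecting each hop
    until the data reaches a node with no outgoing edges."""
    path, node = [start], start
    while flow[node]["feeds"]:
        node = flow[node]["feeds"][0]  # single-successor chain for simplicity
        path.append(node)
    return path
```

Tracing from `crm_db` yields the full chain down to the spreadsheet, alternating between stores and the applications that move data between them.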
Data lineage has a broad range of applicability. There are many ways of tackling the problem of tracking down the lifecycle of data, each with its own pros and cons. We have created a method that is minimally intrusive on an enterprise environment by building a tool that works with both new and old assets, using different access methods to acquire the data lineage. Tracking data lineage is becoming increasingly important for many companies and can aid many business processes. While we can’t help much concerning those juicy buffalo wings I mentioned earlier, we do have a way of finding out how data is manipulated throughout its lifetime. As for the buffet line, nothing short of a QR code cooked into my buffalo wings with a link to an online food database or lineage trail may help me decide whether I can trust the food or not…
1. “Laptops damage sperm? What wi-fi study shows”, http://www.cbsnews.com/8301-504763_162-57332822-10391704/laptops-damage-sperm-what-wi-fi-study-shows/
2. “Laptops May Hurt Male Fertility, Study Suggests”, http://www.cbsnews.com/2100-500165_162-7044716.html
3. “Laptop WiFi May Damage Sperm, Study Suggests”, http://www.huffingtonpost.com/2011/11/29/laptop-wifi-sperm-damage-electromagnetic-radiation_n_1118726.html
4. Basel Committee on Banking Supervision, http://www.bis.org/bcbs/index.htm
5. Sarbanes-Oxley Act of 2002, http://www.sec.gov/about/laws/soa2002.pdf
6. The Federal Reserve Board, http://www.federalreserve.gov
7. United States Securities and Exchange Commission, http://www.sec.gov
8. Microsoft Office, http://office.microsoft.com
9. Microsoft Excel, http://office.microsoft.com/en-us/excel
10. Microsoft Word, http://office.microsoft.com/en-us/word
A blood-curdling scream and gasps can be heard from within the halls of the workspace. Hearing this, you jump into the nearest broom closet only to emerge dressed in a suit made of spandex, a bright red cape, and the letters DQ, short for “Data Quality,” emblazoned on your chest. Running down the hall, you find the commotion, quickly administer justice, and vanquish the evil dirty data that is causing people to collapse and ball up into the fetal position. Taking a few glances at the data, you run it through the Accenture Data Quality Rule Accelerator and POOF! Once again, you are the hero!
Okay, so maybe that’s a bit dramatic. In all likelihood, you wouldn’t be wearing a cape; too often, those things get caught in pesky and annoying doors or cause a face plant into the ground. I do encourage you to wear a facemask though!
Data Quality initiatives aren’t quite as glamorous. In reality, you won’t be leaping tall buildings in a single stride, but rather copious amounts of data and information. You won’t be emerging from a phone booth with a gleaming suit on, though you may run to the broom closet in fear. What I can tell you is that data is often dirty, and the amount of information organizations hold can be so overwhelming that tackling data quality issues can be difficult, costly, headache inducing, and may leave you with the desire to jump to other projects.
A recent survey of organizations found that most have yet to calculate the ramifications of poor data quality. So what does this mean? It means that most organizations don’t know what to do about the data that they have. Those who have worked a data quality engagement know how tedious, time-consuming, and frustrating it can be to sort through the mounds of information. This only compounds the problem, causing headaches and a desire in some to avoid the issue altogether for fear of spending even more money on what they see as a money pit.
Furthermore, many data quality problems are no longer limited to names and addresses. Dirty data is frequently encountered on engagements, it is pervasive, and data can be of such poor quality that 30%-80% of the effort in a data integration initiative is spent on data cleanup and understanding. That process involves interviewing subject matter experts (if they are still around), finding and reading documentation, and manually discovering and creating the data quality rules that guide the scoring of the cleanliness of data.
For this reason, I want to introduce the Accenture Data Quality Rules Accelerator (ADQRA). The ADQRA tool, currently in beta, is part of a larger R&D initiative surrounding data quality and was created in our Technology Labs with the support of AIMS. The accelerator can compress what would normally consume a significant amount of time and seed data quality efforts. It takes a data set and automatically returns a set of data quality rules that can be used to pinpoint which data is dirty and which isn’t, how dirty the data is, and how much cleanup effort is needed. The current version of the tool essentially detects inconsistencies in a given dataset, one of the six dimensions of data quality. The discovered data quality rules can then be used to enforce proper data entry or even discover interesting patterns. The ADQRA can do this in a short amount of time because it continually checks for the stability of a rule as it encounters data, which means it doesn’t need to examine the entire dataset to settle on a data quality rule. Results are returned relatively quickly, and the process is tolerant of dirty data.
Using the Data Accelerator is a three-step process: load your data, discover data quality rules, and then browse the rules.
In the current version of the Data Accelerator, data is selected and uploaded.
Once the data is uploaded, the user can tweak the results by manipulating several parameters:
- Maximum number of rules – specifies that the ADQRA should return no more than this number of rules; once that many rules have been discovered, it stops and returns the results
- Maximum number of conditions – specifies the total number of conditions in a data quality rule, where a rule’s conditions are the values on the left-hand side of an “if-then” rule
- Maximum number of seeds – specifies the number of condition combinations
- Coverage – specifies the minimum amount of data a rule should cover for it to be considered interesting
- Error rate – specifies the expected error rate in the data
- Frequency – specifies how frequently the ADQRA checks for rule stability
- Window size – specifies the number of tuples to consider at a given time
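To make these parameters concrete, here is a minimal sketch of how windowed, stability-checked rule discovery could work. The algorithm below is an illustration written for this post, not the ADQRA’s actual implementation; it proposes simple one-condition “if-then” rules and stops early once the rule set stays unchanged across consecutive windows. The parameter names mirror the settings listed above.

```python
from collections import Counter, defaultdict

def discover_rules(rows, lhs, rhs, coverage=0.2, error_rate=0.1,
                   window_size=100, frequency=2, max_rules=10):
    """For each value v of column `lhs`, propose the rule
    'if lhs == v then rhs == most common co-occurring value'.
    Keep the rule if it covers at least `coverage` of the rows seen and
    its error rate is at most `error_rate`. Every `window_size` rows the
    rule set is re-extracted; after `frequency` consecutive identical
    extractions the rules are deemed stable and scanning stops early,
    so the full dataset need not be examined."""
    pairs = defaultdict(Counter)
    stable_checks, last_rules, n = 0, None, 0
    for row in rows:
        pairs[row[lhs]][row[rhs]] += 1
        n += 1
        if n % window_size == 0:
            rules = _extract(pairs, n, coverage, error_rate, max_rules)
            stable_checks = stable_checks + 1 if rules == last_rules else 0
            last_rules = rules
            if stable_checks >= frequency:  # rules unchanged: stop early
                return rules
    return _extract(pairs, n, coverage, error_rate, max_rules)

def _extract(pairs, n, coverage, error_rate, max_rules):
    rules = []
    for v, counts in pairs.items():
        support = sum(counts.values())
        best, best_count = counts.most_common(1)[0]
        # a few dirty rows don't kill a rule as long as the error
        # rate stays within the expected bound
        if support / n >= coverage and 1 - best_count / support <= error_rate:
            rules.append((v, best))
    return sorted(rules)[:max_rules]
```

Note how tolerating an expected error rate lets a rule like “if state == CA then country == US” survive a handful of dirty rows instead of being discarded.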
Once the rules are discovered, they can be browsed, edited, deleted, approved, and then exported to a format suitable for use with Informatica.
If any of this is remotely interesting to you, then I encourage you to contact Accenture and Accenture Technology Labs about the Accenture Data Quality Rules Accelerator. Take it for a test drive and see how it can help you. With the ADQRA you too might become a data quality superhero (but please, leave the spandex at home).
Alex Kass and Manish Mehta
What if work could be more like our favorite games? This question has been inspiring growing interest in bringing “gamification” to the workplace, which means leveraging games and game mechanics to help shape workplace behavior.
The idea that businesses might leverage the enjoyable – even addictive – power of games to engage and influence both consumers and employees has a powerful appeal, as the growing number of blogs (e.g., Gamification.co and ZDNet), books, and conferences demonstrates. This appeal has been further enhanced by the emergence of gamification software vendors, such as Badgeville and Bunchball, who make it relatively easy for an enterprise to get started with gamification. Technologies now exist to help “gamify” existing applications, processes, and interfaces by weaving in game mechanics such as reward points, leaderboards, badges, and the ability to ‘level up’. Enticed in part by the onramp these vendors have created, many companies are exploring gamification, and some, such as LiveOps, have made gamification of work a core part of their service delivery model.
The same techniques that keep Farmville players working their plot seem to have real application in the workplace as well, but how broad and deep are the potential impacts? Is it limited to providing employees with small nudges to perform chores like turning in surveys, filling out expense reports on time, perhaps even moving customers through a check-out line a bit faster? Or is gamification something that can address deep-seated and complex behavior change? Can it be used to help change beliefs and priorities? Can it be used to promote behaviors which will take intensive effort to learn, or which employees don’t even realize they need? And can games be used to create change that lasts even after the game is out of the picture?
The answer to all these questions may well be yes, but it will take an expanded repertoire of gamification techniques, specifically designed to address a broader swath of the behavior-change lifecycle. One key insight from the literature on persuasion and behavior modification, on topics ranging from cancer to computing, is that behavior change often follows a stereotyped pattern of stages, each posing different kinds of behavior-change challenges that must be understood if we’re to create game-based techniques that address all the stages.
Here’s a 5-stage version of the behavior-change lifecycle, which we have adapted from the literature:
- Raising awareness: Understanding exactly what the as-is behavior patterns are, and recognizing that there is opportunity for improvement.
- Building buy-in: Committing the time, energy, and resources needed to execute the change.
- Learning how: Understanding the mechanisms and techniques that underlie the target behaviors.
- Initial adoption: Trying out the target behaviors, getting used to actually executing them.
- Maintaining and refining: Perfecting the new behaviors through extended practice so that they eventually become self-sustaining.
Most gamification we have seen focuses on stage 4 and, to a lesser extent, stage 5: promoting the initial adoption, and then the maintenance, of target behavior patterns. Stages 1-3 are often all but ignored. This is fine in situations where raising awareness, building buy-in, and understanding the basic mechanism of the target behaviors are not crucial issues. In some cases, merely motivating initial adoption is sufficient to generate buy-in because the initial effort is not too great and the advantages become self-evident once the behavior is adopted. For example, if you can motivate someone to exercise for several weeks, he might start feeling good about himself, start enjoying the activity, and thereby start exercising on a regular basis.
But in many more challenging behavior-change scenarios, stages 1-3 play key roles in an effective behavior-change program. Consider some examples: An employee who uses a condescending tone with customers may not even realize they are doing so, which would mean that raising awareness of the problem is a first, crucial step toward sustainable behavior change; an employee who is too blunt or cursory with colleagues may not buy into the need to provide more tender-loving care, because there is no explicit connection made between that behavior and the morale or retention problems that it causes; an employee who does not understand the mechanisms for carrying out a new business process will be unable to respond to incentives to execute the new process – regardless of how well game mechanics are used to provide that incentive.
One game mechanic that can be effectively used to achieve buy-in is cause-and-effect simulation. These simulations can help raise awareness of the impact of the user’s existing behavior patterns and the need for change. A simple example is the Stone City game commissioned by Cold Stone Creamery, in which employees learn to scoop the right portion size of ice cream. An aspect that we see as critical to achieving buy-in – and thus to sustaining motivation beyond the confines of the game – is that the game illustrates the long-term repercussions of incorrect portioning on the profitability of the company. Simulations can make the long-term consequences that motivate change visible in a compressed timeframe. Outside the enterprise, games such as World of Warcraft motivate hours of detailed work, planning, and skill building by making clear connections between that work and a big mission that players find meaningful.
The concept of gamification is currently enjoying a successful stint as a kind of ‘child star’, but now it is time to see whether it can transition to equally successful work in adult roles. The true potential is not fully known, but we expect that as more organizations recognize the need for a more extensive behavior-change toolkit, exploration of more advanced gamification will produce a range of effective – and affordable – techniques for complex and sustained behavior change.