The "soft side" of IT doesn’t only refer to software anymore. Today it’s also about ethics. Successful innovation requires soft skills like empathy and cultural intelligence alongside the quantifiable skills like engineering and data science.
In many technology firms, those could be fighting words, but Accenture has a long history and core values that celebrate the role soft skills play in successful business. That’s one of the reasons we’ve been doing research in data ethics since 2013 and have been publishing in the space since 2014.
In this half-decade of exploration, systemic opportunities for technical innovation in the ethics space have risen to the surface. First, we want to ensure that risks associated with digital investments are identified as close to their origin as possible. This prevents risks from ballooning as solutions are applied at scale. Second, we want to make the “right” decisions easier for companies to make. And third, we want to give both consumers and organizations more agency over the data they’ve shared.
We know that it’s critical for companies to be transparent about the values-based decisions that make it into products and services. For tech firms, this means software developers and data scientists must be able to identify the values-based decisions they’re making in their code. That’s no small feat. That’s why Accenture Labs is partnering with Fjord (design and innovation from Accenture Interactive) to research how best to engage developers and data scientists in identifying these values-based decisions, documenting them quickly, and submitting that documentation to an aggregation point. We want to find instances where these decisions create outsized risk, catch that risk early, and mitigate it.
Mitigation could be as simple as new tasks added to a developer’s agile queue, or it could be coaching from a domain expert. The goal here is twofold: First, we want software developers and data scientists to see this form of “ethics triage” as something that makes them write better code, and thus something they’ll work into their regular practice; and second, we want to minimize risk at its source.
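To make the triage flow above concrete, here is a minimal sketch of what documenting a values-based decision and submitting it to an aggregation point might look like. All names and structures here are hypothetical illustrations of the idea, not Accenture’s actual tooling:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical sketch: class and field names are illustrative only.

@dataclass
class ValuesDecision:
    """One values-based decision a developer documents in the flow of work."""
    author: str
    component: str      # where in the codebase the decision lives
    description: str    # what was decided, and the trade-off involved
    risk_level: str     # e.g. "low" | "medium" | "high"
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AggregationPoint:
    """Stand-in for a central service that collects decision records."""
    def __init__(self) -> None:
        self._records: List[ValuesDecision] = []

    def submit(self, record: ValuesDecision) -> None:
        self._records.append(record)

    def high_risk(self) -> List[ValuesDecision]:
        """Surface decisions that warrant early mitigation or expert coaching."""
        return [r for r in self._records if r.risk_level == "high"]
```

The point of keeping the record this lightweight is that documentation happens in seconds, inside the developer’s normal workflow, while the aggregation point makes outsized risks visible early enough to mitigate at the source.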
We also want to establish technology solutions that make it easier for organizations to choose a more ethically sustainable path. Our integrated design and innovation team in Dublin is developing technology that will help data scientists discover bias in datasets and offer options for remediation. We’re also collaborating with organizations such as the Carnegie Endowment for International Peace to assemble a stakeholder community around deep fakes (multimedia created or altered algorithmically with deceptive intent), with the goal of giving organizations the tools they need within their data supply chains to combat misinformation and deceptive actors.
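One common way to surface dataset bias of the kind mentioned above is a demographic-parity check: compare the rate of positive outcomes across groups and flag a large gap. The sketch below illustrates that idea on rows of (group, outcome) pairs; it is a generic example, not the Dublin team’s actual tool:

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Illustrative bias check only; real remediation tooling would go much further.

def positive_rates(rows: List[Tuple[str, int]]) -> Dict[str, float]:
    """Rate of positive outcomes (1) per group, from (group, outcome) rows."""
    totals: Dict[str, int] = defaultdict(int)
    positives: Dict[str, int] = defaultdict(int)
    for group, outcome in rows:
        totals[group] += 1
        positives[group] += outcome
    return {g: positives[g] / totals[g] for g in totals}

def parity_gap(rows: List[Tuple[str, int]]) -> float:
    """Spread between the best- and worst-treated groups; large gaps flag bias."""
    rates = positive_rates(rows)
    return max(rates.values()) - min(rates.values())
```

A data scientist might run this before training, then choose a remediation such as reweighting or resampling when the gap exceeds a threshold appropriate to the domain.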
Another tool in demand across industries is a privacy-preserving, trusted data exchange. Accenture Labs has filed a half-dozen patents on just such a system (the first of which has been issued). Such blockchain-based systems will give both consumers and companies more control over data they’ve shared, and could serve as a system of record for everything from IoT-generated data to personally identifiable information. Data disclosers can share and curate datasets, set prices for sharing the data they have disclosed, specify who can and cannot have access to the data, decide what kinds of research and insights their disclosed data can be used for, and irrefutably audit where the data has gone, what it’s been used for, and how much they were paid as a result.
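The discloser controls listed above amount to a policy-and-audit layer: every access is checked against the discloser’s terms and recorded. Here is a hypothetical sketch of that layer in miniature; the actual patented, blockchain-based system is not reproduced here, and all names are assumptions for illustration:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

# Hypothetical policy-and-audit sketch; a real exchange would anchor the
# audit log on a blockchain to make entries irrefutable.

@dataclass
class Disclosure:
    """A disclosed dataset together with the discloser's sharing terms."""
    discloser: str
    dataset_id: str
    price: float
    allowed_consumers: Set[str]
    permitted_purposes: Set[str]

@dataclass
class Exchange:
    disclosures: Dict[str, Disclosure] = field(default_factory=dict)
    audit_log: List[dict] = field(default_factory=list)

    def register(self, d: Disclosure) -> None:
        self.disclosures[d.dataset_id] = d

    def access(self, consumer: str, dataset_id: str, purpose: str) -> float:
        """Enforce the discloser's terms, record an audit entry, return price paid."""
        d = self.disclosures[dataset_id]
        if consumer not in d.allowed_consumers:
            raise PermissionError(f"{consumer} may not access {dataset_id}")
        if purpose not in d.permitted_purposes:
            raise PermissionError(f"purpose '{purpose}' not permitted")
        self.audit_log.append({"consumer": consumer, "dataset": dataset_id,
                               "purpose": purpose, "paid": d.price})
        return d.price
```

Because every successful access appends to the audit log, the discloser can later answer exactly the questions the text raises: where the data went, what it was used for, and what was paid.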
In all, these technologies address critical infrastructure needs that will help build and maintain trust in digital products and services. It’s critical that we, as an industry, continue to develop the tools necessary to support innovation at scale that is both ethical and a net benefit to everyone.
For more about our work on data ethics, visit accenture.com/dataethics. To discuss opportunities to establish stronger governance around digital ethics, contact Steven Tiell.