As a follow-up to the Accenture Labs report, “Building Digital Trust: The role of data ethics in the digital age,” released in mid-2016, we have worked with a range of collaborators to publish six supplemental white papers that explore the unique facets of data ethics.
In the initial report and my introductory blog post, we outlined why organizations should begin taking steps now to reduce their exposure to digital risk by integrating a wide array of data ethics practices throughout their data supply chains. These actions are critical to maintaining digital trust with customers and business partners, especially in terms of mitigating external and internal security risks.
In fact, organizations might be getting exposed to untold security risks without even realizing it. Because let’s face it: sharing data is inevitable in the digital age. Organizations regularly share data internally between different groups. They share it across business partners as they endeavor to create new offerings and even new industries in the platform economy. And organizations increasingly share data with the public.
In the white paper “The Ethics of Data Sharing: A guide to best practices and data governance,” we teamed with data ethicist Jake Metcalf of Ethical Resolve to examine how data sharing, data aggregation, and data analytics go hand-in-hand in a big data environment.
At the same time, these capabilities make data sharing best practices all the more essential from a security and risk management perspective. The paper also takes a closer look at the:
Dichotomy of closed corporate data and open government data sharing policies
Ethical arguments and concerns related to each end of the spectrum
Unaddressed “middle tier” where there are good reasons to share data in a limited fashion but where clear guidelines for such exchanges were previously absent.
For digital businesses participating in the platform economy, this “middle tier” of data sharing with external partners is the most complex, yet most frequently used, form of data sharing. These platform collaborations necessitate widespread and constant data sharing, bringing new and difficult-to-predict risks and governance challenges. That’s why it is imperative to understand the extent of the data supply chain and how far data could reach when shared with alliance partners, regular partners, or under contract as an exchange of value.
To that end, Accenture Labs and the Data Ethics Research Initiative posted the following guidelines for sharing data among external partners in order to improve transparency and reduce risk:
Ongoing collaboration and mutual accountability are necessary between data-sharing partners. Since datasets can reflect bias and require interpretation, partners should be accountable to each other for sensitive, high-quality interpretive work that seeks to address possible bias and potential harms.
Build common contracting procedures, but treat every contract and dataset as unique. No universal criteria exist for ethical data sharing and governance. Rather, standardized contracting procedures can build community norms about what review processes are necessary complements to data sharing.
Develop ethical review procedures between partners. Partners should determine in advance how ethics concerns can be escalated and resolved both within their own organizations and between their organizations.
Be mutually accountable for interpretive resources. When methods and models are developed for machine-learning systems and other modes of data analytics, assumptions should be enumerated.
Minimalist approaches to data sharing are preferable. Increased levels of openness generally increase risk to data originators and each partner. Data holders and recipients should carefully audit the datasets for such risks before sharing all or some of the data under consideration.
Identify potential risks of sharing data within sharing agreements. Assume that the risks identified at the outset of a sharing relationship will not sufficiently describe the universe of risks, and that practitioners will learn to be more systematic in risk management through experience.
Repurposed data requires special attention. The hardest harm to predict and mitigate is that which can result from future repurposing of data, especially if combined with other datasets. Data-sharing partners should have explicit agreements on the parameters of repurposing.
When ethical principles or regulations are unclear, emphasize process and transparency. Where there is no existing industry standard, transparent and consistent decision-making processes are the best buffer against harms and a way to reinforce public trust.
Published research requires additional attention. If generalized scientific knowledge derived from datasets is to be published, all involved parties should agree to the publication in advance, make reasonable attempts to protect data subjects from harm, and obtain their informed consent.
Treat trust as a networked phenomenon. When drafting terms of service agreements, privacy policies and end-user license agreements, be sensitive to the tensions between legal compliance and trust with your users, other partners and the public.
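The “minimalist approaches” and “audit before sharing” guidelines above can be illustrated with a short sketch. This is a minimal illustration, not anything from the white paper: the column names, keyword rules, and function names are all hypothetical, and a real audit would involve far richer review than keyword matching.

```python
# Hypothetical sketch of a pre-sharing audit step: flag columns that look
# personally identifying, and share only the non-flagged fields a partner
# actually requested (a minimalist approach to data sharing).

# Keyword list is illustrative only; real policies would be far broader.
SENSITIVE_KEYWORDS = {"name", "email", "ssn", "phone", "address", "dob"}

def audit_columns(columns):
    """Return the subset of column names that look personally identifying."""
    flagged = []
    for col in columns:
        lowered = col.lower()
        if any(keyword in lowered for keyword in SENSITIVE_KEYWORDS):
            flagged.append(col)
    return flagged

def minimal_share(dataset, requested_columns):
    """Release only requested columns that pass the audit."""
    flagged = set(audit_columns(requested_columns))
    return {col: dataset[col] for col in requested_columns
            if col not in flagged}

dataset = {
    "customer_email": ["a@example.com"],
    "purchase_total": [42.50],
    "region": ["EMEA"],
}
shared = minimal_share(dataset, ["customer_email", "purchase_total", "region"])
# "customer_email" is withheld; only the lower-risk fields are released.
```

The design choice mirrors the guidelines: the data holder audits before release rather than after, and the default is to withhold anything flagged, shifting the burden toward justifying each shared field.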
I encourage you to read the white paper for more detail on these data sharing best practices. You can also use the 100-day and 365-day plans in the report to enhance best practices and governance for your organization.
To read more posts, please subscribe.