Data security lapses are costly. For retailers that have experienced a security breach, 12 percent of their loyal customers say they have stopped shopping at that retailer, and 36 percent say they will shop at those retailers less frequently.4 The dollar damages can be just as painful – more than $230 million in direct expenses in one high-profile example of mishandled customer data, a lapse that would ultimately knock more than $1 billion off the company’s net earnings for the year.
The issue of data security has also become an important part of the national conversation. According to a recent study by TRUSTe, 45 percent of US citizens think online privacy is more important than national security.5 The same study found that while 92 percent of US Internet users are concerned about online privacy, only a little more than half (55 percent) say they trust most companies with their personal information online, and 91 percent say they avoid companies that do not protect their privacy.
Adherence to ethical standards can amplify customer growth just as concerns about ethical practices can amplify attrition. Case in point: Threema, a secure messaging service for smartphones that offers end-to-end encryption, doubled its user base from 200,000 to 400,000 in the 24 hours after Facebook’s acquisition of WhatsApp in February 2014.6 As an independent company, WhatsApp had earned a high degree of trust among users that the service would protect the privacy of their communications. But once the company was acquired by Facebook, WhatsApp users immediately came to share the privacy concerns long associated with Facebook.
The lesson: If the entity in question is a platform, the effects of lax ethical data practices are compounded and can ripple throughout entire systems, all the more so when decision making is performed by automated algorithms. These risks were on display in the Flash Crash of May 6, 2010, when the Dow Jones plunged 9 percent in a matter of minutes, driven by a confluence of systemic volatility and algorithmic trading activity.7 To the data ethicist, this event raises the question, “If autonomous trading algorithms were required to adhere to a stringent code of ethics, would this Flash Crash have been less severe or perhaps been prevented altogether?”
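To make the question concrete, consider what a minimal "code of ethics" might look like in practice: a pre-trade guardrail that refuses orders deviating too far from a reference price or exceeding a size cap. This is a hypothetical sketch, not a description of any real trading system; all names and thresholds here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int
    limit_price: float

def within_guardrails(order: Order, reference_price: float,
                      max_deviation: float = 0.05,
                      max_quantity: int = 10_000) -> bool:
    """Crude circuit-breaker-style check: reject orders that stray
    too far from a reference price or exceed a size cap."""
    if order.quantity > max_quantity:
        return False
    deviation = abs(order.limit_price - reference_price) / reference_price
    return deviation <= max_deviation

# A panic sell priced 20 percent below the reference is refused.
panic_sell = Order("ACME", 5_000, 80.0)
print(within_guardrails(panic_sell, reference_price=100.0))  # False
```

Real exchanges now enforce comparable limits centrally through circuit breakers; the sketch simply shows that such constraints can also live inside the algorithm itself.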
Nor are these issues confined to financial markets. Decision making by autonomous agents is of particular concern when seen in the context of the Internet of Things. As connected networks of small, embedded sensors become more pervasive, autonomous decision making is being pushed to the edge of networks, where the digital and physical worlds intersect.
At this intersection, sensors and analytical systems make autonomous decisions that affect the real world, such as real-time traffic management and responsive street-lighting systems. To protect public safety, these sense-and-respond systems must be governed by ethical algorithms. This is not alarmist. In 2014, a German steel plant suffered significant physical damage to production machines as the result of a cyberattack.8
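A responsive street-lighting controller illustrates what "governed by ethical algorithms" can mean at the edge: when a sensor reading is implausible (a possible fault or tampering), the system should fail toward public safety rather than efficiency. The following is a minimal illustrative sketch; the policy, thresholds, and function names are assumptions for exposition, not drawn from any deployed system.

```python
def safe_light_level(sensor_lux: float,
                     min_plausible: float = 0.0,
                     max_plausible: float = 100_000.0,
                     fail_safe_level: float = 1.0) -> float:
    """Map an ambient-light reading to a lamp output in [0, 1].
    Implausible readings trigger full brightness as the fail-safe."""
    if not (min_plausible <= sensor_lux <= max_plausible):
        return fail_safe_level  # fail toward public safety, not energy savings
    # Dim the lamp as ambient light rises (illustrative linear policy).
    return max(0.0, 1.0 - sensor_lux / max_plausible)

print(safe_light_level(-5.0))    # 1.0 -- faulty reading, fail safe
print(safe_light_level(50_000))  # 0.5 -- half brightness at midday haze
```

The design choice worth noting is the ordering: the plausibility check runs before any optimization logic, so a compromised or broken sensor cannot drive the system into an unsafe state.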
Ethical handling of data can no longer be taken for granted. In the digital era, proper controls and policies for managing data throughout its supply chain are necessary for reducing business risks that can quickly impact the bottom line.