No artificial intelligence introduction would be complete without addressing AI ethics. AI is moving at a blistering pace and, as with any powerful technology, organizations need to build trust with the public and be accountable to their customers and employees.
At Accenture, we define “responsible AI” as the practice of designing, building and deploying AI in a manner that empowers employees and businesses and fairly impacts customers and society—allowing companies to engender trust and scale AI with confidence.
Every company using AI is subject to scrutiny. Ethics theater, in which companies publicize their responsible use of AI while quietly engaging in gray-area activities, is a recurring problem. Unconscious bias is another. Responsible AI is an emerging capability that aims to build trust between organizations and both their employees and their customers.
Data privacy and the unauthorized use of AI can be detrimental both reputationally and systemically. Companies must design confidentiality, transparency and security into their AI programs at the outset and make sure data is collected, used, managed and stored safely and responsibly.
Transparency and explainability
Whether building an ethics committee or revising their code of ethics, companies need to establish a governance framework to guide their investments and avoid ethical, legal and regulatory risks. As AI systems take on more decision-making responsibility, businesses need to be able to see how those systems arrive at a given outcome, taking the decisions out of the "black box." A clear governance framework, supported by an ethics committee, can help develop the practices and protocols that ensure the company's code of ethics is properly translated into its AI solutions.
Machines don't have minds of their own, but they do make mistakes. Organizations should have risk frameworks and contingency plans in place in the event of a problem. Be clear about who is accountable for the decisions AI systems make, and define a management approach for escalating problems when necessary.