RESEARCH REPORT

In brief

  • New techniques to modify images, audio and video will enable the creation of content that’s more realistic than today’s more obvious fakes.
  • Platform companies should seek to mitigate exposure with a balanced series of activities across five key dimensions.
  • In this latest thinking, we outline the new generation of content-tampering tools and highlight ways companies can better prepare for the future.


It used to be relatively straightforward for platform companies to deal with manipulated content. When Twitter identified a fake celebrity account, it could ban it while awarding a verification badge to the real one. YouTube could match uploads against a database of unlicensed music and remove them. Algorithms could detect and flag photoshopped images. That was then. The new generation of manipulated media will be far harder to identify and subtler in its effect.
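To illustrate the kind of matching that worked against that earlier generation of fakes, here is a minimal sketch of perceptual hashing in Python (a standard technique, not any platform's actual system; it uses the Pillow imaging library, and the file names are hypothetical). Lightly edited copies of a known image produce hashes within a small Hamming distance of the original, which is roughly why re-uploads and crude edits were easy to flag, and why newly generated media is not.

```python
from PIL import Image  # Pillow imaging library

def dhash(path, size=8):
    """Difference hash: a tiny perceptual fingerprint of an image.
    Lightly edited copies of the same image yield nearly identical bits."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())  # flattened grayscale pixels, row-major
    bits = []
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits.append(left > right)  # 1 bit per adjacent-pixel gradient
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical files: flag an upload that is a near-duplicate of known content.
original = dhash("known_original.jpg")
upload = dhash("suspect_upload.jpg")
if hamming(original, upload) <= 10:  # small distance => likely an edited copy
    print("Flag for review: near-duplicate of known content")
```

Fully synthetic images share no fingerprint with any known original, so this class of check offers no purchase against them.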

To get ahead of the problem, platform companies must act now. While technology will undoubtedly help, it is unlikely to provide a silver bullet.

New techniques to modify images, audio, and video will enable the creation of content that’s far more realistic and subtle than today’s more obvious fakes. What’s more, identity is becoming harder to validate as trolls and bots adopt new ways to mask their true source. Sometimes the results will be harmless, but often the intent is malicious. And platform companies are coming under increasing pressure and scrutiny to respond.

Left unchecked, the implications for today’s platform companies could be profound across several fronts:

  • Erosion of trust: users may migrate to more reliable sources of information.
  • Legal action: users and special-interest groups may sue when they suffer personal harm from maliciously targeted troll activity on the platform.
  • Loss of advertisers: legitimate brands may seek safer environments in which to place their messaging, or platforms may have to remove fake advertising accounts, and the revenue they bring, from their sites.
  • Regulatory intervention: if companies fail to get on top of the problem, regulators and government agencies may step in, forcing platforms to implement costly catch-up measures.


Platform companies should seek to mitigate exposure with a balanced series of activities across five key dimensions:

  1. Invest in technology
  2. Enhance content policies
  3. Focus on reporting and review mechanisms
  4. Focus on human agents
  5. Educate users

The problem of manipulated content isn't going away. In fact, it's only going to get worse. Companies therefore need to realize the degree and urgency of the problem and act now, before it's too late.

Kevin Collins

Managing Director – Software & Platforms


Martin Stoddart

Principal Director – Software & Platforms
