You will be responsible for analyzing and reviewing user profiles, audio, videos, and text-based content, and for investigating, escalating, and/or resolving issues that are reported by users or flagged by the system. Due to the nature of the role, you may be exposed to flashing lights or contrasting light and dark patterns.
Content moderation is meaningful work that helps keep the internet safe. It can also be challenging at times. In the context of this role, individuals may be directly or inadvertently exposed to potentially objectionable and sensitive content (e.g., graphic, violent, sexual, or otherwise egregious material). Content moderators therefore need strong resilience and coping skills. We care for the health and well-being of our people and provide the support and resources needed to carry out the responsibilities of the role. Active participation in Accenture’s well-being support program, designed specifically for the Trust & Safety community, builds valuable skills that promote individual and collective well-being.
Review, classify, and/or remove content according to client guidelines, using specific tools and channels
Understand and stay up to date on changing client policies and guidelines
Investigate, resolve, and relay complex content issues to the broader Trust and Safety team
Serve as an advocate for the user community
Participate in process improvement initiatives that enhance the quality and efficiency of work
Participate in continuous training programs and workgroup discussions for optimal development in the role
Engage in conversations around socially sensitive topics with the goal of keeping our communities safe