
The child pornography images were part of a database of content removed from TikTok, which moderators had to consult in the course of their work. One of the former employees interviewed by Forbes was tasked with teaching an AI to recognize images that violated the platform's policies.
Social networking platforms use algorithms and the work of human moderators to remove violent, sexually explicit or illegal content. In this case, however, TikTok and Teleperformance handled the sexually explicit images of minors in a careless and dangerous manner: not only by giving large numbers of people access to the material, but also by exposing the moderators themselves to potential trauma.
“While I was working, I thought: this is someone's son. This is someone's daughter. And the parents don't know that we have these images, these videos, this trauma, saved somewhere,” a former moderator told Forbes.
Most countries have very strict rules on how child pornography images must be handled and stored. In the United States, for example, it is mandatory to report them to the competent authorities and to keep them for no more than ninety days. US federal law also explicitly requires that access to the material be limited to as few people as possible. Even from a legal standpoint, therefore, the allegations are quite serious.
A spokesperson for TikTok said that “the training materials are subject to strict access controls and do not include visual examples of child abuse.” Teleperformance also dismissed the former employees' allegations, but the president of the company's Trust and Safety division declined to answer specific questions from Forbes about the DRR file and how it was stored and made accessible on corporate servers.