Facebook is testing a change that will let users know when their posts have been removed by automated systems. The experiment comes in response to the Oversight Board, which said the social network should be more transparent with users about how their posts are removed.
The company revealed the test in a new report that provides updates on how Facebook handles Oversight Board policy recommendations. The test stems from one of the first cases taken up by the board, which concerned an Instagram post raising awareness of breast cancer that the company removed under its nudity rules.
Facebook reinstated the post, saying its automated systems had made a mistake, and updated Instagram’s rules to allow “health-related nudity.” The Oversight Board also recommended that Facebook alert users when a post is removed automatically rather than by a human content reviewer. Facebook previously said it would test this change, and the test is now in effect.
“We launched a test on Facebook to assess the impact of telling people more about whether automation was involved in enforcement,” Facebook wrote in its report. “People in the test now see whether the technology or Facebook content reviewer made the enforcement decision on private content. With them, we will analyze the results to see if people have a clearer understanding of who has removed their content, while also monitoring a potential rise in recidivism and appeal rates.”
The company added that it will provide an update on the testing later this year. The report also offered additional insight into how the company is working with the Oversight Board.