Dec. 6 (UPI) — The oversight board appointed by Facebook to review its “cross-check” program found that the system is unfairly skewed towards avoiding public relations backlash for clients and VIPs, according to its report released on Tuesday.
The board accused the social media giant of being focused on dodging the perception of censorship, as opposed to fulfilling its stated goal of protecting the free speech and safety of users.
Under the cross-check program, Facebook’s parent company Meta keeps a list of users who are eligible for extra review when found to be in violation of company policies around hate speech, misinformation, violence and other subjects. Celebrities, clients, politicians, and media outlets have all been included in the program, which raises concerns that less visible users could be removed for erroneous violations without the same review process.
Meta asked the oversight board to review the cross-check program after details concerning its operation were reported by the Wall Street Journal in September 2021.
“Meta’s cross-check program prioritizes influential and powerful users of commercial value to Meta and as structured does not meet Meta’s human rights responsibilities and company values, with profound implications for users and global civil society,” said oversight board director Thomas Hughes in a press release.
“While Meta characterizes cross-check as a program to protect vulnerable and important voices, it appears to be more directly structured and calibrated to satisfy business concerns,” the oversight board found.
Additionally, the board suggested that business concerns overshadow free speech and safety concerns: “Correlating highest priority within cross-check to concerns about managing business relationships suggests that the consequences that Meta wishes to avoid are primarily business-related and not human rights-related.”
The oversight board urged Meta to embrace more transparency: “The company should set out clear, public criteria for inclusion in its cross-check lists, and users who meet these criteria should be able to apply to be added to them.”
While its findings are non-binding, the board has given Meta a list of 32 recommendations to fix potential missteps in the program. Meta has said it will review the recommendations.