Facebook is making changes to its policies based on recommendations from its Oversight Board. The board announced its first round of rulings on content moderation last month, and some of those decisions overturned Facebook's original moderation actions. Facebook has committed to acting on 11 of the board's recommendations, says it had already implemented some of them, and is assessing the feasibility of five more. The move could loosen the company's approach to misinformation about the Covid-19 pandemic.
The company detailed its responses to the actions the board recommended. These include bringing more transparency to its processes, updating Instagram's policies, and introducing a Transparency Centre. Facebook is also recalibrating its use of automation: improving automated detection and providing more transparency around how automated systems are used.
Facebook said: "Technology allows us to detect and remove harmful content before people report it, sometimes before people see it. We typically launch automated removals when they are at least as accurate as those by content reviewers. We'll continue to evaluate which kind of reviews or appeals should be done by people and which can be safely handled by automated systems, and how best to provide transparency about how decisions were made. We'll continue to assess and develop a range of tools to address health misinformation, considering the least intrusive to expression wherever possible."
The company also reiterated its commitment to working with scientists at the World Health Organisation and public health authorities to tackle misinformation.