Meta is rolling back its covid misinformation rules in countries like the US, where the pandemic national emergency has been lifted, following recommendations its independent oversight board made in April this year, The Washington Post reported Friday morning (via Engadget).
In an update to its July announcement, in which it said it had asked the Oversight Board to weigh in on its covid misinformation policies, Meta cited the end of the World Health Organization’s global emergency declaration as the reason for the change:
Our Covid-19 rules on disinformation will no longer be in effect globally, as the global public health emergency that triggered those rules has been lifted.
Now the company says it will adjust its rules by region. In its transparency center page addressing the board’s recommendations, Meta says that because the WHO has lowered the emergency status of the pandemic, it won’t directly address some of the board’s concerns.
Among those concerns are recommendations that Meta reassess what misinformation it is removing and take steps to increase transparency around government requests to remove covid content. Instead, Meta says its response to the board’s fourth recommendation — that the company should create a process to assess the risk of its misinformation moderation policy — addresses the spirit of the first recommendation. It says it will “consult internal and external experts” to gauge the status of covid around the world and share local enforcement details in “future quarterly updates.”
The WHO ended its global emergency declaration on May 5th, 2023, roughly six months after Twitter stopped enforcing its own covid misinformation rules in the wake of Elon Musk’s November 2022 purchase of the company. Both TikTok and YouTube continue to have policies around covid misinformation, although YouTube recently changed its rules around misleading election information.