YouTube will remove content that promotes “cancer treatments that have been shown to be harmful or ineffective” or that “dissuades viewers from seeking professional medical treatment,” the video platform announced today. The move comes as YouTube tries to streamline its medical moderation guidelines based on what it has learned from trying to tackle misinformation on topics like COVID-19, vaccines, and reproductive health.
Going forward, Google’s video platform says it will enforce its medical misinformation policies when there is a high risk to public health, when guidance is publicly available from health authorities, and when a topic is prone to misinformation. YouTube hopes this policy framework is flexible enough to cover a wide range of medical topics, while striking a balance between minimizing harm and allowing for debate.
In its blog post, YouTube says it will take action against treatments that are actively harmful, as well as those that are unproven and suggested in place of established alternatives. A video could not, for example, encourage viewers to take vitamin C supplements as an alternative to radiation therapy.