Instagram is tightening its moderation policy today, adding a new warning that notifies rule-breaking users when their account is at risk of being deleted.
The warning shows users a history of the posts, comments, and stories that Instagram has removed from their account, along with the reasons they were taken down. "If you post something that goes against our guidelines again, your account may be deleted," the page reads.
Instagram is also letting users challenge moderation decisions directly from the warning, rather than having to dig through the help pages. Only some types of removals can be appealed at first (such as posts taken down for nudity or hate speech), and Instagram plans to expand the kinds of content eligible for appeal over time.
The change should help users understand why they got in trouble and take away the shock of suddenly discovering that their account has vanished. While many banned accounts are likely deleted for clear-cut violations, Instagram, like its parent company Facebook, has regularly run into moderation problems around nudity and sexuality, with the service removing users' photos of breastfeeding or period blood. This update won't prevent those mistakes (photos like that are supposed to be allowed), but it should at least make the decisions easier to appeal.
In addition to the new warning, Instagram is giving its moderation team more leeway to ban bad actors. Instagram's policy had been to ban users who post "a certain percentage of violating content," but it will now also ban people who repeatedly violate its policies within a set window of time. The details remain as vague as ever, since Instagram doesn't want to hand bad actors a playbook for gaming the system, but it sounds like this could mean fewer problematic accounts skating by on a technicality.