
2.2 Million Videos In India Removed By YouTube Over Violation Of Guidelines


The videos were removed for violating community standards. (Representative image)

New Delhi:

YouTube removed more than 2.25 million videos in India for violating its Community Guidelines between October and December 2023, with the country topping the list of video removals ahead of countries like the US and Russia.

Singapore is second on that list with 1,243,871 videos removed, and the United States (788,354) is third, according to YouTube’s data on video removals by country/region of upload.

Indonesia ranked fourth (770,157), while for Russia, the number of videos removed was 516,629, according to YouTube’s Community Guidelines Enforcement Report, which provides global data on the flags YouTube receives and how Google’s platform enforces policies.

Globally, YouTube removed more than 9 million videos during the period (Q4 2023) for violating community standards. More than 96 percent of these videos were first flagged by machines rather than humans.

The videos were removed for violating community standards in areas such as harmful or dangerous content, child safety, violent or explicit content, nudity and sexual content, and misinformation.

The latest report shows that over 2.25 million videos (2,254,902) were removed in India for violating YouTube's Community Guidelines between October and December 2023. India topped the list of 30 countries in terms of video removals.

Globally, 20.5 million (20,592,341) channels were removed by YouTube for violations of the Community Guidelines in the quarter ending December 2023.

The report explains that when a channel is terminated, all videos are deleted. The number of such videos removed due to channel termination during this period was 95.5 million (95,534,236).

“A YouTube channel will be terminated if it receives three Community Guidelines strikes within 90 days, has one instance of serious abuse (such as predatory behavior), or is determined to be completely committed to violating our guidelines (as is often the case with spam accounts),” YouTube said.

Google’s video streaming platform says it works hard to maintain a safe and vibrant community.

“We have community guidelines that set the rules of the road for what we don’t allow on YouTube,” it says.

For example, it does not allow pornography, incitement to violence, intimidation or hate speech. YouTube says it relies on a combination of people and technology to flag inappropriate content and enforce these guidelines. Flags may come from automated flagging systems, from members of the Priority Flagger program, or from users in the broader YouTube community.

(Except for the headline, this story has not been edited by WhatsNew2Day staff and is published from a syndicated feed.)
