YouTube says it will start removing violent content aimed at children rather than relying solely on age restrictions
- YouTube will now directly remove violent and adult content aimed at children
- Previously, the company only age-restricted such content
- Videos' titles, tags and descriptions are taken into account
- After a 30-day grace period, accounts that repeatedly offend are banned
- The company's decision follows a recent settlement with the FTC
- YouTube has also announced that it will no longer display targeted ads for children
YouTube is removing all 'adult' or 'violent' content aimed at children amid increasing pressure to make its platform safer for minors.
According to The Verge, YouTube says it will remove unsafe content by checking video titles, descriptions and tags, and will begin banning repeat offenders after a grace period.
Targeted content includes all material that touches on sex, violence, death or other topics that are deemed unsuitable for a young audience.
Although it may seem strange that a platform of YouTube's size was not already moderating 'violent' and 'adult' content aimed at children, The Verge notes that until now YouTube had only age-restricted such material.
YouTube has made another important change to content targeted at children, choosing to remove videos with 'violent' or 'adult' themes. File photo
YouTube reportedly announced the change quietly two days ago through a post on a YouTube Help community forum.
The platform said it will remove such content if and when it is found, but will not hand out 'strikes' to creators until after a 30-day grace period intended to familiarize users with the policy.
Videos uploaded before the rule change will not receive strikes, although they can still be deleted.
As part of the push, YouTube will also age-restrict other types of content that it fears may be mistaken as being made for children, such as adult cartoons.
One example, the platform said, would be a cartoon that appears to be aimed at children but portrays inappropriate subject matter, such as a character 'injecting needles'.
Following a recent settlement with the Federal Trade Commission (FTC) over alleged violations of the Children's Online Privacy Protection Act (COPPA), YouTube also agreed to end targeted advertisements on children's content.
Critics say that the use of targeted advertising on children's content violates laws that prevent companies from collecting data on individuals under the age of 13 without the permission of their legal guardians.
Targeted ads use data collected from numerous sources to promote products based on user preferences, and when linked to YouTube's large audience of children, they are crucial to the platform's business model.
According to a Bloomberg report, research firm Loup Ventures estimates that YouTube earns between $500 million and $750 million annually from children's content alone.
Although YouTube has a separate app for children that does not use targeted ads, a large amount of children's content remains on the main site, where data-driven ad placement still applies.
The platform has begun to take children's safety more seriously under pressure from regulators and concerned parents.
The move to end targeted advertising on children's content is another important step for YouTube, which has begun to change its policies amid increasing pressure from regulators
Earlier this month, YouTube confirmed to Bloomberg that it had adjusted its algorithm in July to favor 'trusted creators'.
According to YouTube creators interviewed by Bloomberg, the tweak has eroded some channels' viewership, with some reporting 98 percent fewer views, while others have noticed a significant increase.
In February, the company terminated more than 400 channels amid concerns about child abuse and exploitation.
Although ending targeted advertisements on children's content may temporarily appease regulators and concerned parents, implementing the strategy may be easier said than done.
To successfully remove the ads, YouTube must first determine which content is aimed at children, and then devise a way to reliably identify and remove the ads from that content.
HOW IS YOUTUBE MAKING ITS KIDS APP SAFER?
YouTube is finally rolling out changes to the privacy settings on its Kids app.
After several problems with the service were reported, it has now started releasing updates.
With the new features, parents can filter content in the app so that only channels vetted by humans, rather than algorithms, are displayed.
There will be three updates later this year.
Collections by trusted partners and YouTube Kids staff
YouTube Kids staff will offer collections of trusted channels on various topics.
These can be selected from the Profile Settings, and parents can choose from collections such as Sesame Workshop and PBS KIDS.
YouTube will continue to add more partners over time.
Content approved by parents
Later this year, YouTube will introduce a feature that allows parents to individually select every video and channel available to their child in the Kids app.
Improved search control
Starting this week, turning off search will limit the YouTube Kids experience to channels verified by the YouTube Kids team.