
Microsoft releases a tool to scan for ‘grooming’ of children in online chats

Microsoft has released a new tool for identifying child predators who groom children for abuse in online chats. Project Artemis, based on a technique Microsoft has used on the Xbox, will now be made available to other online companies with chat functions. It comes at a time when multiple platforms are grappling with child predators who target children for sexual abuse by grooming them in chat windows.

Artemis recognizes specific words and speech patterns and flags suspicious messages for review by a human moderator. The moderator then determines whether the situation should be escalated to the police or other law enforcement. If a moderator finds a request for child sexual exploitation or images of child abuse, the National Center for Missing and Exploited Children is notified for further action.
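Microsoft has not published Artemis’s internals, so any concrete depiction is guesswork. Purely as an illustration of the flag-and-review flow described above, a toy version in Python might look like the following; the patterns, scoring, and function names are all hypothetical, and a real system would use far richer linguistic signals than keyword matching.

import re

# Hypothetical phrase patterns a rule-based flagger might look for.
SUSPICIOUS_PATTERNS = [
    re.compile(r"\bdon'?t tell (your )?(mom|dad|parents)\b", re.IGNORECASE),
    re.compile(r"\bour (little )?secret\b", re.IGNORECASE),
    re.compile(r"\bsend (me )?a (photo|pic|video)\b", re.IGNORECASE),
]

def flag_for_review(messages):
    """Score each message by pattern hits and return those that need
    a human moderator's attention, highest risk rating first."""
    flagged = []
    for msg in messages:
        score = sum(1 for pat in SUSPICIOUS_PATTERNS if pat.search(msg))
        if score > 0:
            flagged.append((score, msg))
    return sorted(flagged, reverse=True)

# The moderator, not the tool, decides whether to escalate to law
# enforcement or notify the National Center for Missing and
# Exploited Children.
queue = flag_for_review([
    "good game, rematch tomorrow?",
    "this is our little secret, ok?",
    "send me a photo and don't tell your parents",
])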

“Sometimes we shout at the platforms – and there is abuse on every platform that has online chatting – but we have to applaud them for establishing mechanisms,” says Julie Cordua, CEO of Thorn, a non-profit technology organization that works to prevent the online sexual abuse of children. “If someone says, ‘Oh, we have no abuse,’ I say to them, ‘Well, are you looking?’”

In December, The New York Times reported that online chat platforms were fertile “hunting grounds” for predators who groom their victims by first befriending them and then insinuating themselves into a child’s life, both online and offline. Most major platforms face some degree of abuse by child predators, including Microsoft’s Xbox Live. In 2017, as the Times noted, a man was sentenced to 15 years in prison for threatening children with rape and murder over Xbox Live chat.

Detection of online child sexual abuse, and policies for handling it, can vary widely from company to company, with many of the companies involved wary of potential privacy violations, the Times reported. In 2018, Facebook announced a system to catch predators that checks whether someone rapidly contacts many children and how often they are blocked. But Facebook also has access to far more data about its users than other platforms do.
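The Times described Facebook’s heuristic only in those broad strokes. As a hypothetical sketch of what such a rate-based signal could look like (this is not Facebook’s actual system; the thresholds and names are invented for illustration):

from datetime import datetime, timedelta

# Invented thresholds; Facebook has not published its actual values.
WINDOW = timedelta(days=7)
CONTACT_THRESHOLD = 20   # distinct child accounts messaged in the window
BLOCK_THRESHOLD = 3      # times blocked by other users in the window

def should_surface_for_review(contact_log, block_log, now=None):
    """contact_log: (timestamp, child_account_id) pairs for messages sent;
    block_log: timestamps at which other users blocked this account.
    Flags accounts that rapidly contact many distinct children or are
    blocked unusually often."""
    now = now or datetime.utcnow()
    cutoff = now - WINDOW
    recent_children = {cid for ts, cid in contact_log if ts >= cutoff}
    recent_blocks = sum(1 for ts in block_log if ts >= cutoff)
    return (len(recent_children) >= CONTACT_THRESHOLD
            or recent_blocks >= BLOCK_THRESHOLD)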

The Microsoft tool matters, according to Thorn, because it is available to any company with a chat function and helps set an industry standard for what predator detection and monitoring should look like, which in turn aids the development of future prevention tools. Chats are difficult to monitor for possible grooming because a conversation can carry so much nuance, Cordua says.

Child predators can lurk in online chat rooms to find victims just as they would offline, but with far more direct access, says Elizabeth Jeglic, a psychology professor at John Jay College of Criminal Justice in New York who has written extensively about protecting children from online sexual abuse, in particular the often subtle practice of grooming. “They may be talking sexually with a child within 30 minutes,” she says. “In person, it’s harder to access a child, but a predator can go online, test the waters, and if it doesn’t work, move on to the next victim.”

Predators don’t stop at one platform, Cordua says. “They try to isolate the child and follow them on multiple platforms so that they can have multiple exploitation points,” she says. A predator can ask a child for a photo and then escalate to demands for videos with increasing levels of sexual content. “The child is plagued by guilt and fear, and this is where being on multiple platforms helps the predator: he can say, ‘Oh, I know all your friends on Facebook. If you don’t send me a video, I’ll send that first photo to everyone in your junior high.’”

Artemis has been in development for more than 14 months, Microsoft says, beginning at a November 2018 Microsoft “360 Cross-Industry Hackathon” co-sponsored by two child protection groups, the WePROTECT Global Alliance and the Child Dignity Alliance. A team from Roblox, Kik, Thorn, and The Meet Group worked with Microsoft on the project, led by Hany Farid, who developed the PhotoDNA tool for detecting and reporting images of child sexual exploitation online.

However, some details about how the Artemis tool will work in practice remain unclear and are likely to vary depending on which platform adopts it. Microsoft has not said whether Artemis will work with chat services that use end-to-end encryption, or what steps are being taken to protect moderators from potential PTSD.

Thorn will manage the program and handle licenses and support to get participating companies on board, Microsoft says.

Cordua says that while Artemis has some initial limitations – it currently works only in English – the tool is a huge step in the right direction. Since every company that uses the tool can adapt it for its own audience (chats on gaming platforms will naturally differ from those on social apps), there is ample room to customize and refine it. And, she says, it’s about time platforms moved away from the failed practice of self-regulation and toward proactive prevention of child grooming and abuse.

In its blog post, Microsoft adds that the Artemis tool is “absolutely no panacea” but a first step toward detecting the online grooming of children by sexual predators, a problem it calls “serious.”

“The first step is that we need to become better at identifying where this is happening,” says Cordua. “But all companies that host a chat or video should do this or they are complicit in allowing the abuse of children on their platforms.”