
Privacy concerns over Yandex, Russia’s ‘most popular search engine’, because it appears to use face recognition

A Russian search engine is accused of offering an unregulated face recognition system to members of the public – violating personal privacy.

Experts have slammed the function as “bad” and “creepy” while calling it a “clear privacy problem.”

Yandex, like Google, Bing and other search engines, allows users to enter an image and see similar results.

But only Yandex, which claims to handle more than 50 percent of Russian searches on Android, produces images of the exact same person.

MailOnline tested the image search facilities of Yandex, Bing, Google and specialized site TinEye by submitting a photo that was not available online.

The photo, which was not available online before its publication in this article

As first noted by blogger Nelson Minar, only Yandex then produced other images of the same person in the results.

Other platforms returned similar-looking photos of different people, protecting the identity of the person in the original photo.

Yandex does not claim to use facial recognition to power its image search engine, but it does say it uses machine learning and deep learning neural networks.

Felix Rosbach, product manager at the German software development company comforte AG, told MailOnline: ‘This is not just a problem with Yandex – this is (unfortunately) the future we live in.

“The use of machine learning for face recognition makes it possible for virtually every service to identify users.

“When you use this technology on your iPhone to search for all your friends’ photos, it can be useful.

“When third parties use this technology to correlate information freely available on the Internet to create user records, it gets scary.”

The selfie taken at my desk (above) was submitted to Yandex’s image search (pictured). The first two results were other photos of me that are available online – one from my private Facebook and one from another MailOnline article

To test the feature, this morning I submitted to Yandex a selfie taken at my desk that had not been posted anywhere online (until the publication of this article).

The first two results were other images of my face that are available online – one from my personal Facebook and one from an earlier MailOnline article.

By following the links provided, a person could relatively easily find out more information about me.

The only similarity between the images is my face – the three photos have nothing in common but my facial features.

Lighting, clothing, distance to the camera, facial expression and background are all clearly different, indicating that the site uses some form of face recognition.
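Neither Yandex’s model nor its image index is public, but the general technique the results point to – comparing compact numerical ‘encodings’ of faces rather than whole pictures – can be sketched with the open-source face_recognition library. The snippet below is purely an illustration of that approach, not Yandex’s actual system, and the file names are hypothetical.

```python
# Illustrative sketch only: Yandex has not disclosed how its matching works.
# The open-source face_recognition library (built on dlib) shows how matching
# on facial features ignores lighting, clothing, pose and background.
import face_recognition

# Hypothetical file names: the uploaded selfie and two candidate photos found online
query = face_recognition.load_image_file("desk_selfie.jpg")
candidates = ["facebook_profile.jpg", "old_article_photo.jpg"]

# Encode the face in the selfie as a 128-number vector describing its features
query_encoding = face_recognition.face_encodings(query)[0]

for path in candidates:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        continue  # no face detected in this photo
    # Smaller distance means a more similar face; ~0.6 is the library's usual cut-off
    distance = face_recognition.face_distance([query_encoding], encodings[0])[0]
    verdict = "same person" if distance < 0.6 else "different person"
    print(path, verdict, round(float(distance), 2))
```

Because the vector describes the geometry of the face itself, two photos of the same person taken in completely different conditions still end up close together – which is consistent with the behaviour seen in the Yandex results.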

A user only has to upload an image of an unknown person to the site and it will immediately surface any online presence that person has.

This leaves people unprotected against strangers, stalkers and potential criminals who may want to find out their name and other information.

Javvad Malik, a security awareness advocate at KnowBe4, told MailOnline that the feature is reminiscent of the FindFace app, which was launched in Russia a few years ago.

“With all these apps there is a clear privacy problem and it is not difficult to think of scenarios in which it could be misused,” he said.

“The best precaution that users can now take is to be wary of which personal photos they upload online.”

When the process was repeated in Google, Bing and TinEye – a reverse image search engine that specializes in detecting copyright infringement – no images of my face were produced.

Instead, these sites returned photos of people with superficially similar features.

For example, Google tagged the photo as a “gentleman” and mainly showed stock photos of young white men wearing a shirt and tie.

When the same anonymous and unpublished image was submitted to Google’s image search function (pictured), the photo was simply tagged as a “gentleman” and the results were mainly stock photos of young white men in a white shirt and tie – but not my face from elsewhere online

When the photo was submitted to Microsoft’s Bing platform, the results did not reveal my identity. It produced headshots of white men – some with a similar neutral expression, some of people in shirts – and two results offered Harry Potter’s Goyle, a young white man in a green and white tie, as a “similar image”

The specialized site TinEye, a reverse image search engine that specializes in detecting copyright infringement, did not find any images of my face. This is correct, because the image does not exist on the web, and it did not offer comparable photos

Bing behaved much like Google and did not reveal my identity.

Instead, it produced generic images of white men.

Some had a similarly neutral expression, others were professional headshots of people in shirts and two results offered Harry Potter’s Goyle – a young white man in a green and white tie – as a “similar image.”

The more specialized site TinEye was the most accurate: it found no results for that specific image online. This is correct, because the image did not exist on the internet.

It did not suggest comparable results.
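TinEye describes itself as finding where a picture appears online rather than finding pictures that merely look similar, a task typically tackled with perceptual hashing of the whole image rather than face recognition. As a rough illustration of why a brand-new selfie returns nothing from that kind of system (TinEye’s exact method is not public, and the file names below are hypothetical), a near-duplicate check with the open-source Pillow and imagehash packages looks like this:

```python
# Illustrative sketch of near-duplicate matching, the general class of technique
# behind reverse image search engines such as TinEye (whose exact method is not
# public). It compares whole pictures, not faces.
from PIL import Image
import imagehash

# Hypothetical file names: the uploaded selfie and a photo already indexed online
query_hash = imagehash.phash(Image.open("desk_selfie.jpg"))
indexed_hash = imagehash.phash(Image.open("indexed_photo.jpg"))

# Hamming distance between the 64-bit perceptual hashes:
# 0 means identical, small values mean near-duplicates (resized or recompressed)
difference = query_hash - indexed_hash
print("near-duplicate" if difference <= 8 else "no match", difference)
```

A hash like this changes completely when the photograph itself changes, so an image that has never been posted online matches nothing – which is why TinEye correctly returned no results – whereas a face encoding, as in the earlier sketch, can match the same person across entirely different photos.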

On 17 December, Yandex announced a major update to its technology and algorithms on its website.

It issued an update called VEGA, which it claims “offers 1,500 improvements to Yandex Search that help our 50 million daily search users in Russia find the best solutions to their questions.”

It added: “By contributing their knowledge, experts improve our algorithms and we help our Search users, whose numbers continue to grow; in the past year the search share of Yandex on Android in Russia increased by 4.8% to 54.7% at the beginning of December.”

The company, which appeared at CES in Las Vegas this month and has been on the NASDAQ since 2011, called linking machine learning with “human knowledge” the “most important improvement.”

Andrey Styskin, head of Yandex Search, said in a blog post: “Our new search update combines our latest technologies with human knowledge.

“At Yandex, our goal is to help consumers and businesses navigate the online and offline world better.”

The blog post does not specify which of its latest technologies have been integrated into its search.

MailOnline has approached Yandex for comment.

HOW CAN USERS PROTECT THEMSELVES FROM ABUSE OF FACIAL RECOGNITION ONLINE?

Felix Rosbach, product manager at the German software development company comforte AG, says that there is only so much that an individual can do to protect themselves against this technology.

He told MailOnline: “The only thing you can do as a private person to protect your data is to ensure that your social media profiles are not publicly available and to share data only with trusted parties.”

But Mr. Rosbach adds that there is unfortunately very little users can do when colleagues post photos of them publicly.

He calls on the search engines themselves to ensure that members of the public are protected.

He said: “Search engines must ensure that these functionalities cannot be misused.

‘But as machine learning software becomes widely available, there will always be a site or app that can offer this service.

‘Instead, companies must and can ensure that sensitive consumer data is always protected.

“And it’s not just about securing access to data – it’s about powerful data protection to ensure that data is useless in the event of data loss, incorrect configuration, or a data leak.”
