A YouTuber feeling blue on shutdown decided to get a serotonin shot by looking at some adorable dogs.
Ryder, host of Ryder Calm Down, connected a camera module to his computer and loaded a popular object detection algorithm to recognize when a dog came into view.
When his camera identifies a dog, the automated text-to-speech voice shouts, “There’s a dog outside!” through a loudspeaker.
The program can also be reversed so that passersby walking their dogs hear Ryder’s automatic voice shout, “I like your dog!”
YouTuber Ryder from Ryder Calm Down trained his machine learning module to alert him when a dog passed by, hoping to increase his serotonin and dopamine levels
“Quarantine isn’t great,” says Ryder Calm Down’s Canadian social media personality Ryder, “but looking at dogs makes me feel better.”
He starts his latest video by explaining that “when you look at dogs and cats, the amount of serotonin and dopamine in your brain increases.”
It’s true: Researchers have long believed that images of cute animals release the powerful pleasure chemicals.
This is called the ‘baby schema’ effect; the same thing happens when we look at human babies.
Ryder combined a camera module and Raspberry Pi computer loaded with YOLOv3, a popular object detection algorithm trained to recognize about 80 different objects, including people, cars, chairs, umbrellas and dogs
In short, round faces, soft bodies, big eyes and other adorable features arouse warm feelings and motivate caring behavior, increasing the survival of our offspring.
Ryder decided he could use a dopamine boost by peeking at some dogs in real life.
Not wanting to loiter at the window all day, he built an AI-powered machine that would automatically recognize when a dog came along with its owner.
‘Quarantine isn’t great,’ says Ryder Calm Down’s Canadian social media personality Ryder, ‘but when I watch dogs I feel better’
“We’re building a machine that looks out the window and notifies me when dogs walk past so I can watch them and feel better,” he explains in a video.
Ryder says the real inspiration for the device was his friend Heather, “who once climbed a six-foot fence to pet a dog.”
He constructed his dog detector with Raspberry Pi, a simple one-board computer popular with hobbyists, combined with an AI camera module using the YOLO object detection model, trained to recognize dozens of objects, including people, cars, umbrellas, chairs— and yes, dogs.
Ryder wrote some custom code, attached a speaker, pointed the device out into the street, and waited.
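Ryder hasn’t published his code, but the trigger logic he describes can be sketched in a few lines: take the label/confidence pairs a YOLO-style detector emits for each frame and decide when to make the announcement. The detection format, confidence threshold, cooldown window and `espeak` text-to-speech command below are all assumptions for illustration, not his actual implementation.

```python
# Minimal sketch of the "There's a dog outside!" trigger logic.
# Assumes detections have already been produced by a YOLO-style model
# and reduced to (label, confidence) pairs after non-max suppression.
import time

DOG_LABEL = "dog"
CONFIDENCE_THRESHOLD = 0.5   # assumed cutoff; YOLO confidences range 0..1
COOLDOWN_SECONDS = 10        # assumed: avoid shouting repeatedly at one dog

_last_announcement = 0.0

def dog_detected(detections):
    """Return True if any detection is a sufficiently confident dog."""
    return any(label == DOG_LABEL and conf >= CONFIDENCE_THRESHOLD
               for label, conf in detections)

def maybe_announce(detections, now=None, speak=None):
    """Announce at most once per cooldown window; return the phrase or None.

    `speak` is a hypothetical callback to a TTS engine, e.g.
    lambda text: subprocess.run(["espeak", text]).
    """
    global _last_announcement
    now = time.monotonic() if now is None else now
    if dog_detected(detections) and now - _last_announcement >= COOLDOWN_SECONDS:
        _last_announcement = now
        phrase = "There's a dog outside!"
        if speak is not None:
            speak(phrase)
        return phrase
    return None
```

In a real loop you would call `maybe_announce` once per camera frame; the cooldown keeps the speaker from firing on every frame a dog remains in view.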
Thinking the dogs and their owners needed a little boost too, Ryder trained the system to make the speaker shout, “I like your dog!” to passers-by in his voice. “I turned on the megaphone and opened the window,” he said
An owner seemed a bit confused by the AI system’s disembodied compliment. Ryder confirmed he was in a better mood afterwards, ‘but I’m not sure if it was the dogs or building something that actually worked for once’
Sure enough, the automated text-to-speech voice announced loudly, “There’s a dog outside!”
Thinking the dogs and their owners needed a little boost too, Ryder trained the system to shout, “I like your dog!” at passers-by, even when he was not at home.
“We can use the same technology to tell people they have a cool looking dog, even if we don’t really know,” Ryder said. “I turned on the megaphone and opened the window.”
In the video above, the woman on the receiving end seems baffled by the disembodied compliment.
The experiment made him feel better, Ryder affirmed, “but I’m not sure if it was the dogs or building something that actually worked for once.”
He’s sort of a DIY tech wizard: in other videos, Ryder has constructed a robot that pours you wine every time you get a Slack notification, an app that lets strangers control his Christmas lights remotely, a zoom button for emergencies and an FM radio that only plays his favorite songs.
Canine compliments aren’t even the wildest use for machine learning: University of Helsinki researchers recently trained AI to detect which faces a user finds attractive.
Finnish researchers wanted to find out if a computer could identify facial features that we find attractive without any verbal or written input.
Thirty volunteers were hooked up to an electroencephalography (EEG) monitor and then shown a variety of fake faces.
By measuring their EEG responses while each face was presented, the team was able to determine their unconscious preferences.
They then fed that data into an AI that learned their preferences and created new images tailored to the individual volunteer.
“By interpreting the volunteers’ views, the AI model that reads brain responses and the generative neural network that models facial images can together produce an entirely new facial image combining what a particular person finds attractive,” said project leader Tuukka Ruotsalo.