
Humans forget. AI assistants will remember everything

by Elijah

Making these tools work together will be key to getting this concept off the ground, says Leo Gebbie, an analyst covering connected devices at CCS Insight. “Instead of having that kind of disjointed experience where certain apps use AI in certain ways, what you want is for AI to be that overarching tool, so that when you want to extract something from any app, any experience, any content, you have the immediate ability to do so and to look across all of those things.”

When the pieces fit together, the idea sounds like a dream. Imagine being able to ask your digital assistant, “Hey, who was that guy I talked to last week who had the really good ramen recipe?” and then have it come back with a name, a summary of the conversation, and a place to find all the ingredients.

“For people like me, who don’t remember anything and have to write everything down, this will be great,” Moorhead says.

And there is also the delicate issue of keeping all that personal information private.

“If you think about it for half a second, the most important problem is not recording or transcribing, but solving the privacy problem,” Gruber says. “If we start having memory apps or recovery apps or whatever, then we’re going to need this idea of consent to be understood more broadly.”

Despite his own enthusiasm for the idea of personal assistants, Gruber says there is a risk that people become too willing to let their AI assistant help with (and monitor) everything. He advocates for encrypted, private services that are not tied to a cloud service, or, if they are, ones that can only be accessed with an encryption key held on the user’s device. The risk, Gruber says, is a kind of Facebook of AI assistants, where users are drawn in by the ease of use but remain largely unaware of the privacy consequences until later.
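As a rough illustration of the design Gruber describes, the sketch below shows client-side encryption of an assistant’s “memories” with a key that never leaves the device, so anything synced to a cloud service is unreadable without it. It is a minimal sketch in Python using the cryptography package’s Fernet API; the file name and key handling are hypothetical, not any vendor’s actual implementation.

```python
# Minimal sketch: encrypt assistant "memories" on the device before they are
# stored or synced anywhere. Assumes the `cryptography` package is installed;
# the key file path and data structure here are purely illustrative.
from pathlib import Path

from cryptography.fernet import Fernet

KEY_PATH = Path("device_key.bin")  # hypothetical: kept only on the user's device


def load_or_create_key() -> bytes:
    """Return the device-held key, generating one on first run."""
    if KEY_PATH.exists():
        return KEY_PATH.read_bytes()
    key = Fernet.generate_key()
    KEY_PATH.write_bytes(key)
    return key


def encrypt_memory(plaintext: str, key: bytes) -> bytes:
    """Encrypt a memory entry before it is written or uploaded."""
    return Fernet(key).encrypt(plaintext.encode("utf-8"))


def decrypt_memory(token: bytes, key: bytes) -> str:
    """Decrypt an entry; only possible on a device that holds the key."""
    return Fernet(key).decrypt(token).decode("utf-8")


if __name__ == "__main__":
    key = load_or_create_key()
    blob = encrypt_memory("Talked to Sam about a ramen recipe last week", key)
    # `blob` is all a cloud service would ever see: opaque ciphertext.
    print(decrypt_memory(blob, key))
```

The point of this arrangement is that the service only ever handles ciphertext; the trade-off is that losing the device key means losing the data, which is the kind of responsibility Gruber argues users should be willing to take on.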

“Consumers should be told to get angry,” Gruber says. “They should be told to be very, very suspicious of things that already look like that and to feel the creep factor.”

Your phone is already siphoning off all the data it can get from you, from your location to your shopping habits to which Instagram accounts you double-tap the most. Not to mention, people have historically tended to prioritize convenience over security when adopting new technologies.

“The obstacles and barriers here are probably much lower than people think,” Gebbie says. “We have seen the speed at which people will adopt and embrace technology that will make their lives easier.”

That’s partly because there is real upside here as well. Getting to interact with, and actually benefit from, all that collected information could even ease some of the sting of years of surveillance by app and device makers.

“If your phone is already taking this data, and it’s all currently being collected and used to ultimately show you ads, is it beneficial that you actually get an element of utility from it?” Gebbie says. “You also get the ability to tap into that data and pull out genuinely useful insights. Maybe there’s something really valuable in that.”

That’s a bit like being handed an umbrella after someone has stolen all your clothes, but if companies can stick the landing and make these AI assistants work, the conversation around data collection may shift toward how to do it responsibly and in a way that provides real utility.

It’s not a perfectly rosy future, because we still have to trust the companies that ultimately decide which digitally collected parts of our lives appear relevant. Memory may be a fundamental part of cognition, but the next step beyond that is intentionality. It’s one thing for AI to remember everything we do, but another for it to decide what information is important to us later.

“We can get so much power and so many benefits from a personal AI,” says Gruber. But, he warns, “the advantages are so great that it should be a moral imperative that we get the right one, one that protects privacy, is secure, and is done well. Please, this is our chance. If it is done the free, not-private way, we will miss the once-in-a-lifetime opportunity to do it the right way.”
