This morning, Microsoft set the launch date for its AI-powered Copilot feature and showed off some of its capabilities for the first time. At a panel on “Responsible AI” following the announcement, company executives discussed the danger of over-reliance on its generative software, which was shown creating blog posts, images and emails based on user prompts.
Six months after the company fired the team dedicated to upholding responsible AI principles in the products it marketed, executives attempted to make a clear statement on stage: Everything is fine. Responsible AI still exists at Microsoft. And Copilot is not going to take your job.
“Naming the product Copilot is really intentional,” said Sarah Bird, who leads responsible AI for core AI technologies at Microsoft. “It’s really great at working with you. It’s definitely not going to replace you.”
Bird referenced a demo from the launch event that showed Copilot composing an email on behalf of a user. “We want to make sure that people actually verify that the content of those emails is what they mean,” Bird said. Panelists noted that Bing Chat includes citations, which human users can then go back and verify.
“This type of user experience helps reduce over-reliance on the system,” Bird said. “They use it as a tool, but they don’t depend on it to do everything for them.”
“We want to give people the ability to verify content, as if they were doing research,” Divya Kumar, general manager of search marketing and artificial intelligence at Microsoft, told the audience. “The human factor is going to be very important.”
The panelists acknowledged that Copilot (at least, at this stage) will be vulnerable to misinformation and disinformation, including content that other generative AI tools could create. Microsoft has prioritized adding tools like citations and Content Credentials (which add a digital watermark to AI-generated images on Bing) to ensure people see Copilot’s output as a starting point rather than a replacement for their own work.
Panelists urged the audience not to fear the impact generative tools could have. “My team and I are taking this very seriously,” said Chitra Gopalakrishnan, associate director of compliance at Microsoft. “From development to deployment, all of these features go through rigorous ethical analysis, impact analysis and risk mitigation.”
However, the panelists later acknowledged that generative tools could dramatically change the landscape of viable careers.
“When you have a powerful tool to partner with, what you need to do is different,” Bird said. “We know that some of the jobs are going to change.”