The BBC, the UK’s largest news organization, set out the principles it plans to follow when evaluating the use of generative AI, including for research and production of journalism, archives, and “personalised experiences”.
In a blog post, BBC nations director Rhodri Talfan Davies said the broadcaster believes the technology provides opportunities to deliver “more value to our audiences and society”.
The three guiding principles are that the BBC will always act in the best interests of the public, prioritize talent and creativity while respecting artists’ rights, and be open and transparent about content made with AI.
The BBC said it will work with technology companies, other media organizations and regulators to safely develop generative AI and focus on maintaining trust in the news industry.
“Over the coming months, we will launch a series of projects exploring the use of Gen AI in both what we do and how we work, taking a targeted approach to better understand both the opportunities and risks,” Davies said in the post. “These projects will assess how Gen AI could potentially support, complement or even transform BBC activity across a range of fields, including journalistic research and production, content discovery and archiving, and personalized experiences.”
The company did not specify these projects in an email to The Verge.
But as the BBC determines how best to use generative AI, it has also blocked web crawlers from OpenAI and Common Crawl from accessing BBC websites. It joins CNN, The New York Times, Reuters, and other news organizations in preventing web crawlers from accessing their copyrighted material. Davies said the move is meant to “safeguard the interests of license fee payers” and that training AI models on BBC data without its permission is not in the public interest.