A British AI startup with government ties is developing technology for military drones

A company that has worked closely with the UK government on AI security, the NHS and education is also developing AI for military drones.

Consulting firm Faculty AI has “experience in developing and deploying AI models in UAVs,” or unmanned aerial vehicles, according to a defense industry partner company.

Faculty has become one of the most active companies selling AI services in the United Kingdom. Unlike companies such as OpenAI, DeepMind or Anthropic, it does not develop models itself, instead focusing on reselling models, particularly OpenAI’s, and advising on their use in government and industry.

Faculty gained particular prominence in the United Kingdom after working on data analysis for the Vote Leave campaign ahead of the Brexit vote. Boris Johnson’s former adviser Dominic Cummings subsequently handed government work to Faculty during the pandemic and included its chief executive, Marc Warner, in meetings of the government’s scientific advisory panel.

Since then, the company, officially called Faculty Science, has carried out testing of AI models for the UK government’s AI Safety Institute (AISI), created in 2023 under former Prime Minister Rishi Sunak.

Governments around the world are racing to understand the security implications of artificial intelligence, after rapid improvements in generative AI sparked a wave of hype around its possibilities.

Weapons companies are interested in putting AI in drones, from “loyal wingman” aircraft that could fly alongside fighter jets, to loitering munitions that are already capable of waiting for targets to appear before firing at them.

Recent technological advances have raised the possibility of drones that can track and kill without a human “in the loop” making the final decision.

In a press release announcing its partnership with London-based Faculty, British startup Hadean wrote that the two companies are working together on “subject identification, tracking object movement, and exploring the development, deployment, and operations of autonomous swarms.”

It is understood Faculty’s work with Hadean did not relate to weapons targeting. However, Faculty did not respond to questions about whether it is working on drones capable of applying lethal force, nor did it provide further details about its defense work, citing confidentiality agreements.

A Faculty spokesperson said: “We help develop new AI models that will help our defense partners create safer and more robust solutions,” adding that the company has “rigorous internal ethical policies and processes” and follows the Ministry of Defense’s ethical guidelines on AI.

The spokesperson said Faculty has a decade of experience in AI safety, including work on combating child sexual abuse and terrorism.

Scott Trust, the ultimate owner of The Guardian, is an investor in Mercuri VC, formerly GMG Ventures, which is a minority shareholder in Faculty.

Faculty, led by chief executive Marc Warner, continues to work closely with the AISI. Photograph: arutoronto/Faculty AI

“We have worked on AI safety for a decade and are world-leading experts in this field,” the spokesperson said. “That’s why governments and model developers trust us to ensure frontier AI is safe, and defense customers trust us to apply AI ethically to help keep citizens safe.”

Many experts and politicians have urged caution before introducing more autonomous technologies into the military. In 2023, a House of Lords committee called on the UK government to seek to establish a non-binding treaty or agreement to clarify the application of international humanitarian law as it relates to lethal drones. In September, the Green Party called for laws to completely ban lethal autonomous weapons systems.

Faculty continues to work closely with the AISI, putting it in a position where its judgments could influence UK government policy.

In November, the AISI awarded Faculty a contract to investigate how large language models “are used to assist criminal or undesirable behavior.” The AISI said the contract winner – Faculty – “will be an important strategic contributor to the AISI safeguards team, directly contributing key information to AISI system security models.”


The company works directly with OpenAI, the startup that set off the latest wave of AI enthusiasm, to use its ChatGPT model. Experts have previously expressed concern about a potential conflict of interest in the work Faculty has done with the AISI, according to Politico, a news website. Faculty did not detail which companies’ models it has tested, although it tested OpenAI’s o1 model before its launch.

The government has previously said of Faculty’s work for the AISI: “The most important thing is that they are not conflicted by developing their own models.”

Natalie Bennett, a Green Party peer, said: “The Green Party has long expressed serious concerns about the ‘revolving door’ between industry and government, raising issues ranging from staff seconded from gas companies to work on energy policy to former defense ministers going to work for arms companies.

“That a single company has taken on a large number of government contracts to work on AI while also working with the AI Safety Institute to test large language models is a serious concern; not so much ‘poacher turned gamekeeper’ as playing both roles at the same time.”

Bennett also highlighted that the UK government “has yet to make a full commitment” to ensuring there is a human in the loop for autonomous weapons systems, as recommended by the Lords committee.

Faculty, whose largest shareholder is a Guernsey-registered holding company, has also sought to cultivate close links across the UK government, winning contracts worth at least £26.6m, according to government disclosures. These include contracts with the NHS, the Department of Health and Social Care, the Department for Education and the Department for Culture, Media and Sport.

Those contracts represent a major source of income for a company that made sales worth £32m in the year to March 31. It made a loss of £4.4m during that period.

Albert Sánchez-Graells, professor of economic law at the University of Bristol, warned that the UK is relying on technology companies’ “self-restraint and responsibility in AI development”.

“Companies supporting the AISI’s work need to avoid organizational conflicts of interest arising both from their work for other parts of government and from their broader market-based AI businesses,” Sánchez-Graells said.

“Companies with such broad portfolios of AI activity as Faculty’s have questions to answer about how they ensure their advice to the AISI is independent and impartial, and how they avoid leveraging that knowledge in their other activities.”

The Department for Science, Innovation and Technology declined to comment, saying it would not go into detail about individual commercial contracts.
