How AI faces its ‘Oppenheimer moment’ and why humans must act

Regulators have warned that AI is facing its “Oppenheimer moment” and are urging humans to act before it is too late.

The remarks were made at a conference in Vienna on Monday, referencing J. Robert Oppenheimer, who helped invent the atomic bomb in 1945 before advocating for controls on the spread of nuclear weapons.

The event brought together civilian, military, and technology officials from more than 100 countries to discuss controls on the militarization of AI.

The Pentagon has given millions of dollars to artificial intelligence startups, the European Union is working on a database to evaluate battlefield targets, and the Israeli military has an algorithm that generates a “kill list.”

“This is the Oppenheimer Moment of our generation,” said Austrian Foreign Minister Alexander Schallenberg. “Now is the time to agree international rules and norms.”

“This is the Oppenheimer Moment of our generation,” warned Austrian Foreign Minister Alexander Schallenberg (above), whose government hosted a two-day conference on restricting AI in war zones. “Now is the time to agree international rules and norms,” he said.

Above, the Thermonator, the $9,420 flamethrowing robot dog from an Ohio-based company.

At this week’s conference, a former AI investor at Google’s parent company warned: “Silicon Valley’s incentives might not be aligned with the rest of humanity.”

AI was designed to improve human lives, freeing people from mundane tasks to focus on the greater good, but it has since taken a turn that could destroy humanity if left unregulated.

During his opening remarks, Schallenberg described AI as the most significant advance in warfare since the invention of gunpowder more than a millennium ago.

The only difference is that AI is even more dangerous, he continued.

“Let’s at least make sure that the deepest and most far-reaching decision – who lives and who dies – is left in the hands of humans and not machines,” Schallenberg said.

The Austrian minister said the world needs to “ensure human control,” pointing to the worrying trend of military AI software replacing humans in the decision-making process.

The statements come just weeks after it was revealed that the Israeli army had been using an artificial intelligence system to compile its “kill list” of suspected Hamas terrorists, resulting in the deaths of women and children.

A report by +972 Magazine cited six Israeli intelligence officers who admitted to using an AI system called ‘Lavender’ to classify up to 37,000 Palestinians as suspected militants, marking these people and their homes as acceptable targets for airstrikes.

Civilian, military and technology leaders from more than 100 countries met Monday in Vienna (above) in an effort to prevent, as physicist Anthony Aguirre put it, “the future of killing robots.”

Costa Rican Foreign Minister Arnoldo André Tinoco expressed concern at the conference that terrorists and other non-state actors will soon employ AI-powered weapons of war, requiring a new legal framework. Above, an American Reaper drone.

Lavender was trained with data from Israeli intelligence’s decades-long surveillance of Palestinian populations, using the fingerprints of known militants as a model for what signal to look for in the noise, according to the report.

The technology has also been added to drones used in the Ukraine war, helping the nation locate targets and release munitions without human guidance.

Austria’s top disarmament official, Alexander Kmentt, who led the organization of Monday’s conference, warned that traditional “arms control” treaties would not work for software like AI.

“We are not talking about a single weapons system but rather a combination of dual-use technologies,” Kmentt said. “A classical approach to arms control doesn’t work.”

Kmentt argued that existing legal tools, such as export controls and humanitarian law, would be a better and faster answer to a crisis already underway than waiting to craft a new “masterpiece” treaty.

Costa Rican Foreign Minister Arnoldo André Tinoco also expressed concern that terrorists and other non-state actors will soon employ AI-powered weapons of war, which will require a new legal framework.

“The ready availability of autonomous weapons removes the limitations that ensured that only a few could enter the arms race,” he said.

“Now students with a 3D printer and basic programming knowledge can make drones capable of inflicting mass casualties,” he added. “Autonomous weapons systems have forever changed the concept of international stability.”
