
Petition to Pause ALL Major AI Developments Circulates the Internet… Signed by Elon Musk and Other Notable Tech Figures



Posted

Petition to Pause ALL Major AI Developments Circulates the Internet… Signed by Elon Musk and Other Notable Tech Figures

By Survive the News

 MAR 29, 2023

 

A petition started by the Future of Life Institute is circulating online, with some heavy hitters signing on. The petition states:

“AI systems with human-competitive intelligence can pose profound risks to society and humanity, as shown by extensive research[1] and acknowledged by top AI labs… Contemporary AI systems are now becoming human-competitive at general tasks,[3] and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders. Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable.”

The petition goes on to call for a “public and verifiable” pause of at least six months on the training of all AI systems more powerful than GPT-4. It also calls for independent oversight and rigorous auditing to ensure such systems are “safe beyond a reasonable doubt”.

The petition has garnered signatures from some notable figures in the tech world: Elon Musk (CEO of SpaceX, Tesla, and Twitter), Steve Wozniak (co-founder of Apple), Jaan Tallinn (co-founder of Skype and the Future of Life Institute), Evan Sharp (co-founder of Pinterest), and Emad Mostaque (CEO of Stability AI), along with MIT and Harvard professors, executives, CEOs of AI start-ups, and other notable figures around the world.

(See Lorenzo Green's Twitter account, @mrgreen: https://twitter.com/mrgreen )

 

The late physicist Stephen Hawking was famously critical of artificial intelligence. In 2017, The Independent quoted Hawking as saying:

“I fear that AI may replace humans altogether,” he said in an interview with Wired magazine, seen by Cambridge News.

“If people design computer viruses, someone will design AI that improves and replicates itself. This will be a new form of life that outperforms humans.”

That same article reported that Musk said:

…he [Musk] should be “on the list of people who should absolutely *not* be allowed to develop digital superintelligence”.

You can sign the Future of Life Institute’s petition here: https://futureoflife.org/open-letter/pause-giant-ai-experiments/

Article here: https://www.survivethenews.com/petition-to-pause-all-major-ai-developments-circulates-the-internet-elon-musk-and-other-notable-tech-figures-sign-on/

 


Posted (edited)

Lorenzo Green (@mrgreen) posted a dedicated thread about it:

 

 

Elon Musk Signs Open Letter To Pause "Giant AI Experiments"

if the companies working on these LLMs won't do it, the letter says the government should step in

by Brandon Gorrell

The Future of Life Institute published an open letter titled “Pause Giant AI Experiments,” which calls on all organizations training AI more powerful than GPT-4 to pause for the next six months.

Signed by over 1,100 people, the letter has the support of notable figures such as Elon Musk, Steve Wozniak, and Yuval Noah Harari.

Musk has been outspokenly critical of OpenAI, and tweeted in February that “OpenAI was created as an open source (which is why I named it “Open” AI), non-profit company to serve as a counterweight to Google, but now it has become a closed source, maximum-profit company effectively controlled by Microsoft. Not what I intended at all.”

From the letter:

[W]e call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4. This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium.

The letter argues for the pause because “AI labs [are] locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.” The letter continues:

AI labs and independent experts should use this pause to jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts. These protocols should ensure that systems adhering to them are safe beyond a reasonable doubt. This does not mean a pause on AI development in general, merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities.

AI research and development should be refocused on making today's powerful, state-of-the-art systems more accurate, safe, interpretable, transparent, robust, aligned, trustworthy, and loyal.

Last month, a change.org petition called “Unplug The Evil AI Right Now” circulated in the wake of Microsoft’s release of the Bing chatbot.

The Future of Life Institute is a non-profit whose mission is to reduce the risk of nuclear war and improve AI governance. Prominent physicist Max Tegmark is its president, and Skype co-founder Jaan Tallinn and Victoria Krakovna, a research scientist at DeepMind, are on its board. The Institute’s website lists Musk as an external advisor.

-Brandon Gorrell

Here: https://www.piratewires.com/p/elon-musk-letter-stop-ai-training

 

"Ja Rule" "Signed" The AI Open Letter Today

but last night, the non-profit that published the letter told us they had "tightened our vetting process" for signatories

by Brandon Gorrell


Sam Altman, Bill Gates, and Ja Rule were among the top signatories that erroneously appeared on the open letter called “Pause Giant AI Experiments,” published by the non-profit the Future of Life Institute.

Before the letter’s press embargo was lifted, both Altman and Gates were listed at the top of the list of signatories. Altman’s inclusion was particularly surprising, because it appeared he was calling on his own company to stop training the rumored next version of its large language model, GPT-5.

When I emailed Future of Life Institute last night for comment on Altman and Gates’ inclusion on a previous version of the letter, Anthony Aguirre, the non-profit’s VP and Secretary of the Board responded, “The signatures of Sam Altman and Bill Gates were fake, and we removed them and tightened our vetting process as soon as we were made aware of them.”...

MORE: https://www.piratewires.com/p/ai-open-letters-fake-signers

 

 

Edited by msfntor
twitter linked rather...
  • 2 weeks later...
Posted

Tech leaders openly admit AI's getting out of control

by Steve Jordahl

Mar 30, 2023

People are concerned because those at the forefront of technology are calling for a six-month moratorium on AI research while ethical concerns are addressed.

Apple co-founder Steve Wozniak, SpaceX founder and Twitter owner Elon Musk, and a host of other tech leaders have signed on to an open letter asking that all artificial intelligence labs immediately pause the training of AI systems more powerful than GPT-4 for at least six months.

"Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable," said the letter issued by the Future of Life Institute.

The concern is that the systems are coming up with human-competitive intelligence and pose a profound risk to humanity.

"AI stresses me out," Musk said earlier this month.

Fox commentator Douglas Murray calls the letter "extraordinary."

"We could be in trouble," he says. "This is unprecedented."

The letter asks for a set of shared safety protocols for advanced AI that would ensure that it remains under human control.

"This has been coming on us for 25 years now, so it shouldn't be that much of a surprise that the treadmill of technology is running faster than we humans can run," Murray recently told "Fox & Friends."

Talk host and reporter Richard Randall says it is time to pay attention:

"I've got to believe that Elon Musk knows a lot more about this and the dangers of it," says Randall. "The fact that he's saying that there ought to be at least a six-month moratorium on this concerns me greatly."

He warns that AI could soon be replacing people in highly skilled jobs.

"You could honestly go forward with newspapers and television programs and law review articles and medical journals without any human input at all, with all of it being generated by AI," Randall poses. "No one would know the difference."

Tristan Harris, co-founder of the Center for Humane Technology, warns that the world is witnessing the birth of "the nuclear age."

Here on afn.net: https://afn.net/science-tech/2023/03/30/tech-leaders-openly-admit-ai-s-getting-out-of-control/

 

 

AI Is Getting Out of Control in Blender | ControlNet (2:39)

 

VERY SCARY: AI bot lays out plans to destroy humanity

320,388 views • Apr 12, 2023 • #FoxNews

Jake Denton, a Heritage Foundation Tech Policy Center research associate, weighs in on A.I.'s threat to humanity as the Biden administration asks for public input on regulation ideas.

"All I have to say is that AI does not “just become self aware”; if it said those things, someone programmed it to do so…"

"Just to put some people at ease. Chat GPT is a language tool. it can create realistic human-like sentences, but it doesn't have any thoughts or feelings. It produces fictional thoughts and feelings based on information taken from the internet. It's not a threat right now. but it is certainly the start of something that could be a threat."
