The AI 'Pause' Proposal Is Deceptive and Alarmingly Hazardous

It's wrong for a handful of billionaires to decide what's good and safe for the world – even well-intentioned AI leaders, says technologist and crypto investor Peter Vessenes.

Apr 12, 2023 at 12:45 p.m. UTC
Updated Apr 12, 2023 at 4:50 p.m. UTC

Last month, several tech giants signed a letter calling for a six-month pause on training artificial intelligence (AI) models more powerful than GPT-4.

This letter is dangerous and should alarm thoughtful citizens. In it, the signatories claim that a pause will give humanity more time to understand and respond to the potential risks of AI. In practice, the letter serves to rally public support for OpenAI and its allies as they consolidate their dominance, build an extended innovation lead and secure their advantage over a technology of fundamental importance to the future. If this occurs, it will irreparably harm Americans – our economy and our people.

Peter Vessenes is a technologist and investor with deep roots in digital currency, tokenization and blockchain. An early adopter of Bitcoin, he co-founded the Bitcoin Foundation in 2012 and was the first crypto industry person to advise the U.S. Senate GAO, the U.S. Treasury Department and other agencies about Bitcoin.

GPT-4 and similar foundation models promise to increase human capacity by 1,000 times, driving social change in many arenas of life. The industry as currently structured is likely to entrench a cabal that decides who benefits from this technology.

Imagine if, in 1997, Microsoft and Dell had issued a similar “Pause” letter, urging a halt to browser innovation and a ban on new e-commerce sites for six months, citing their own research that the internet would destroy brick-and-mortar stores and aid terrorist finance. Today we'd recognize this as self-serving alarmism and a regulatory capture attempt.

The “Pause” letter is no different. A few outspoken, charming leaders are making a power grab in the guise of protecting us from AI dangers. They have positioned themselves as the sole arbiters of what tech the world gets to see and use, and deciders of what makes an AI “accurate, safe, interpretable, transparent, robust, aligned, trustworthy and loyal.”

It's wrong for a handful of billionaires to decide what's good and safe for the world. Even well-intentioned AI leaders should not hold such power. Absolute power corrupts absolutely.

The world is in a race toward next-generation foundation models. Nobody racing will halt or even slow research and development. Independent AI labs and foreign rivals, eager to integrate advanced AI into their systems, won't pause; they will continue relentlessly.

How can we ensure all humans benefit from the 1,000x improvement AI offers us? The only way is through free and open development, including sharing capabilities, methodologies and network checkpoints. For example, for years EleutherAI has been leading the way, resisting alarmist claims that GPT-2 and GPT-3 were “too dangerous” for the public and fearlessly releasing research and models. We need 100 to 1,000 times the scale of EleutherAI, globally. (Note: This author is not affiliated with EleutherAI.)

We should not pause. Right now we all must prioritize, invest in, contribute to and broadly publish genuinely open AI models. We must remain on guard against those who seek to control humanity's destiny.

Eleven years ago, early pioneers in the blockchain space pulled together and formed the Bitcoin Foundation, a model for crypto-industry organization that balanced the needs and goals of private individuals with those of the companies that would make crypto thrive. The Foundation encompassed many kinds of people and many goals, but we were united in wanting Bitcoin to thrive outside the control of any single group or cabal. I believe something similar is needed for AI technology: taking the best lessons from more than a decade of decentralized technology and matching them with a fierce commitment to independence, humanity and openness.

The proposed pause would consolidate control of AI development among the wealthy and powerful. Instead, let us race forward together and in the open. I invite the like-minded, the skeptical and the curious to discuss how we can achieve a better future. Connect with others who are committed to keeping AI free and open for all at freeaimovement.com.

Edited by Christie Harkin.




