Not since Facebook/Meta’s failed digital currency Libra/Diem has a crypto project elicited as much angst and hyperbole as Worldcoin, the biometrics proof-of-human project co-founded by OpenAI CEO Sam Altman.
Clearly, many people are enthusiastic about this project, which Worldcoin posits as a way to empower humanity in the face of artificial intelligence’s rapid expansion. Its proof-of-unique-personhood solution is intended to distinguish real people from deep-fake bots and to underpin an equitable distribution of all that AI-generated wealth.
The token leapt more than 40% within minutes of Worldcoin’s mainnet going live on Monday as some 250,000 people who’d scanned their irises via one of Worldcoin’s silvery orbs received an airdropped token. Some 2 million people worldwide have now signed up to be scanned, with Altman tweeting this week that this was happening at a rate of one person every eight seconds.
Jeff Wilser visited Worldcoin’s offices in Berlin and delivered an in-depth feature about Worldcoin’s incubation and launch, including the nugget that the CEO had never held a job before helming this complex logistical and regulatory effort.
Many are upset by this project. The hue and cry was especially loud from within the crypto community, with lots of allusions to a Big Brother-like leviathan gaining access to highly sensitive personal data.
Read more: Jeff Wilser - The Untold Story of Worldcoin’s Launch: Inside the Orb
Our own David Z. Morris has done a good job staking out the critics’ side of it. In a column a month ago, Morris acknowledged the potential benefits of Worldcoin’s universal basic income (UBI) ambitions but added that Altman and his co-founders “have hit on a way to make this appealing premise seem utterly dystopian.” He warned of the dangers of a centralized entity gathering iris scans and noted that the $5,000 cost per orb – along with the logistical challenges of distributing it worldwide – makes a mockery of any plans for a “universal” rollout. (In a side note, Morris added that the name “orb” is “creepy as hell,” suggesting it implies “the Eye of Sauron, Foucault’s Panopticon, the Saudi Intelligence Orb, Saruman’s palantir, and the for-profit spy firm named after it.”)
On the other side are backers like Jake Brukhman, a partner at CoinFund, which invested in Worldcoin in 2021. On CoinDesk TV, Brukhman predicted the project would onboard billions of people into cryptocurrency, with the financial inclusion benefits that come with it. Brukhman, Altman and other supporters inside and outside Worldcoin dismiss the privacy concerns by highlighting that neither the company’s servers nor its devices store any raw human data, converting the scans instead into unique, non-discoverable hash codes.
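The core of that privacy argument is the one-way property of cryptographic hashing: a digest can confirm uniqueness but cannot be reversed into the scan that produced it. Here is a minimal sketch of the idea in Python, using a bare SHA-256 call. To be clear, this is an assumption-laden illustration of the general principle, not Worldcoin's actual pipeline, which involves a learned iris-encoding model and more sophisticated cryptography; the byte string `template` stands in for a hypothetical biometric template.

```python
import hashlib

def template_to_digest(template: bytes) -> str:
    """Return a one-way digest of a biometric template.

    Only the digest needs to be stored or compared; the raw template
    cannot be recovered from it. (Simplified illustration, not the
    actual Worldcoin scheme.)
    """
    return hashlib.sha256(template).hexdigest()

# Two scans that yield the same template map to the same digest,
# so duplicate sign-ups can be detected without keeping raw data.
template = b"example-iris-template"
assert template_to_digest(template) == template_to_digest(template)

# A different person's template yields an unrelated digest.
assert template_to_digest(b"another-template") != template_to_digest(template)
```

The design choice this demonstrates is that a registry of digests can enforce "one person, one account" while holding nothing that reconstructs the underlying biometric, which is the claim Worldcoin's defenders are making.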
A more balanced, but still cautionary, take came from Ethereum co-founder Vitalik Buterin, who in a blog post applauded Worldcoin’s commitment to privacy and the sophisticated technology it is using to protect people’s data. But he highlighted “four major risks” with the project, noting that there is no way within a centralized model to guarantee that people’s data is absolutely safe. He said it’s impossible to know whether a “backdoor” was built into the hardware of the orb that would allow the company, or perhaps a government, to access the data at some point.
I kinda sit in the middle.
Notwithstanding Vitalik’s point on the impossibility of perfect privacy, I think fears of a major breach of people’s biometric data and the threats that could pose to them are likely overblown – or at least they’re no greater than the privacy threats we face elsewhere. (We store far more data on our iPhones using similar device-localized cryptographic protections, for example, and let’s not forget that the biggest crypto exchanges must collect “know-your-customer” (KYC) identifying information on all their clients.)
My concern is with the corporate centrality of it all and with the misaligned incentives it will foster. Why is UBI even the responsibility of a private company? Doesn’t that create uneasy dependency among its poorer recipients? And what exactly is the token for? Worldcoin seems to be hoping it will form the basis of an ecosystem of decentralized AI applications as it promotes its software development toolkit for developers.
Read more: Eliza Gkritsi - Worldcoin Could Enable Wider Distribution of Crypto Than Even Bitcoin, Says CoinFund
But, for now, it feels like the project is designed to bootstrap participation via speculative fervor, which is in turn fueled by the buzz around a high-profile project and founder, creating a juicy exit for early token holders while setting post-launch investors up for losses. (Sure enough, the WLD token suffered sharp declines later in the week.) With many taking issue with the tokenomics of the Worldcoin rollout, which severely limits the circulating supply, the whole thing looks to many like a hyped-up money grab. Rampant profiteering around something as important as people’s identity cannot end well.
This just brings home a point I’ve made before about the lessons to draw from Web2 as we barrel into this new AI era. The risk is not the technology per se; doomsday warnings about AI are hardly new. It’s that if we concentrate control over these technologies among a handful of overly powerful companies that are incentivized to use them as proprietary “black box” systems in pursuit of profit, they will rapidly move into dangerous, humanity-harming territory, just as the Web2 platforms did.
Still, there’s at least one positive that could arise out of the Worldcoin project. It is drawing attention to the need for some kind of proof of humanity, which could give oomph to the many interesting projects seeking to give people greater control over their identities in the Web3/AI age. The answer to proving and elevating authentic humanity could lie in capturing the “social graph” of our online connections, relationships, interactions and authorized credentials via decentralized identity (DID) models or initiatives like the decentralized social networking protocol (DSNP) that’s part of Project Liberty. Or it might yet lie in a biometrics solution like the one Worldcoin is working on, though hopefully with a more decentralized, less corporatized structure. What’s clear is that we must do something.
Consider a somewhat salacious example: AI bots have already created extremely lifelike female digital characters to appear in pornographic videos sold to OnlyFans patrons who believe they are real performers. You might not hold pornography in high regard, and you might think that if gullible, desperate men fall for this, they deserve no sympathy. But consider what this means for the human performers.
For all the criticism it receives, OnlyFans, or more specifically the direct-to-patron model it’s built on, has been applauded by sex work advocates for finally giving performers a safe environment in which to earn income on their own terms. If they can’t adequately prove that they’re human and are outcompeted for clients’ dollars by an army of fake bots, what are their options? Might they have to return to street prostitution, where, yes, they can easily prove their humanity but face the risk of violence at the hands of clients and pimps?
Every person deserves dignity in the digital age. Achieving that is going to require balancing reliable solutions that differentiate humans from machines with commitments to protect our privacy and our most essential personal data.