What happens when robots and humans can’t be differentiated by traditional means? How can you prevent your identity from being stolen by AI? It turns out one of the world’s pioneers of artificial intelligence is working on a solution to that problem. Sam Altman of OpenAI is developing Worldcoin, a system that verifies the “humanness” of people accessing their accounts using iris scans. The Wall Street Journal’s Angus Berwick reports:
Sam Altman wants to save us from the AI-dominated world he is building. The trouble is, governments aren’t buying his plan, which involves an attempt to scan the eyeballs of every person on Earth and pay them with his own cryptocurrency.
Altman’s OpenAI is creating models that may end up outsmarting humans. His Worldcoin initiative says it is addressing a key risk that could follow: We won’t be able to tell people and robots apart.
But Worldcoin has come under assault by authorities over its mission. It has been raided in Hong Kong, blocked in Spain, fined in Argentina and criminally investigated in Kenya. A ruling looms on whether it can keep operating in the European Union.
More than a dozen jurisdictions have either suspended Worldcoin’s operations or looked into its data processing. Among their concerns: How does the Cayman Islands-registered Worldcoin Foundation handle user data, train its algorithms and avoid scanning children?
Altman, the billionaire figurehead of the artificial-intelligence revolution, has tried to push back and open doors for Worldcoin. The project is a lesser-known part of the OpenAI chief executive’s sprawling business empire, but it plays a vital role in his vision for society’s future by attempting to assign every human a unique signature.
Worldcoin verifies “humanness” by scanning irises using a basketball-sized chrome device called the Orb. Worldcoin says irises, which are complex and relatively unchanging in adults, can better distinguish humans than fingerprints or faces.
Users receive immutable codes held in an online “World ID” passport, to use on other platforms to prove they are human, plus payouts in Worldcoin’s WLD cryptocurrency.
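To make the mechanism concrete, here is a minimal sketch of what such an enrollment flow could look like. This is not Worldcoin’s actual code: the class and function names, the SHA-256 transform, and the in-memory registry are all illustrative assumptions, and real iris codes are compared with fuzzy matching (readings of the same eye vary slightly) rather than exact hashes. The sketch only illustrates the idea of a one-way, uniqueness-checkable code.

```python
import hashlib
import secrets

# Hypothetical sketch of the enrollment flow described above; not
# Worldcoin's implementation. Names, the hash scheme, and the registry
# are assumptions made purely for illustration.

class OrbEnrollment:
    def __init__(self) -> None:
        self.registered_codes: set[str] = set()  # anonymized codes only

    def derive_iris_code(self, iris_template: bytes) -> str:
        # One-way transform: the stored code cannot be inverted back into
        # the raw scan, matching the "no personal information" claim.
        return hashlib.sha256(iris_template).hexdigest()

    def enroll(self, iris_template: bytes) -> str | None:
        code = self.derive_iris_code(iris_template)
        if code in self.registered_codes:
            return None  # duplicate code: this person already holds a World ID
        self.registered_codes.add(code)
        # The raw template is discarded here; only the code and an opaque
        # credential, unlinked to any real-world identity, are retained.
        return secrets.token_hex(16)
```

The key design point is that the proof of humanness rests on the uniqueness check, not on knowing who the person is: a platform seeing the resulting credential learns only that one verified human stands behind it.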
Worldcoin launched last year and says it has verified more than six million people across almost 40 countries. Based on recent trading prices, the total pool of WLD is theoretically worth some $15 billion.
The project says its technology is completely private: Orbs delete all images after verification, and iris codes contain no personal information—unless users permit Worldcoin to train its algorithms with their scans. Encrypted servers hold the anonymized codes and images.
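The retention policy the project describes, delete by default and keep only with explicit consent, might be sketched like this. The names and types below are assumptions for illustration, not Worldcoin’s API, and (as noted further down) the project has since paused the image-sharing option:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only; structure and names are assumptions. Per the
# article, the Orb deletes images after verification unless the user
# explicitly permits Worldcoin to train its algorithms on the scan.

@dataclass
class ScanResult:
    iris_code: str              # anonymized; contains no personal information
    raw_image: Optional[bytes]  # retained only with explicit opt-in

def finalize_scan(iris_code: str, raw_image: bytes, opted_in: bool) -> ScanResult:
    retained = raw_image if opted_in else None  # default path: image discarded
    return ScanResult(iris_code=iris_code, raw_image=retained)
```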
However, several authorities have accused Worldcoin of telling Orb operators, typically independent contractors, to encourage users to hand over iris images. Privacy advocates say these could be used to build a global biometric database with little oversight.
Damien Kieran, the project’s chief privacy officer, said any groundbreaking venture like Worldcoin inevitably draws scrutiny, and the initiative was working with regulators to address concerns.
The project has paused the image-sharing option for users while it develops a new process, he said, and is continually improving its ability to keep people secure. Current training materials don’t ask operators in any way to induce users to share biometric data, he said.
“We’ve built a technology that by default is privacy-enhancing,” Kieran said in an interview. “We don’t collect data to harvest it. We don’t sell data. In fact, we couldn’t sell it, because we don’t know who the data belongs to.”
Read more here.