JUSTIN OKARA, CIPP/E
Partner
In a world where AI-generated content has become nearly indistinguishable from human-created work, determining who is real online has never been more challenging. Each day, we scroll past AI-generated profiles on dating apps, encounter sophisticated bots in comment sections, and struggle to verify whether we’re interacting with actual people or machines. This digital identity crisis threatens the foundation of trust that our online interactions depend upon.
Enter Tools for Humanity’s World Project (formerly Worldcoin) – a bold and controversial attempt to solve this crisis by creating a global system that verifies your humanity through your eyes. As a privacy lawyer watching this unfold, I find myself both intrigued by its promise and deeply concerned about its implications.
The World Project proposes a deceptively simple solution: use biometric data – specifically, iris scans – to verify human identity online. At the centre of this system is the Orb, a sleek silver device that scans your eyes and converts your unique iris pattern into an encrypted “IrisCode” stored on a blockchain. This code becomes your World ID – essentially a digital certificate of your humanity.
Unlike passwords or traditional IDs that can be stolen or forged, World claims that iris biometrics offer unparalleled security. They’re difficult to replicate and, when properly implemented, could resist sophisticated spoofing attempts. The company emphasises that they don’t store actual iris images but rather mathematical representations that supposedly cannot be reverse-engineered.
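The "mathematical representation" claim rests on one-way functions. A deliberately simplified sketch of the idea follows: derive a fixed-length code from a biometric template such that the stored code cannot be inverted to recover the original. This is not World's actual IrisCode algorithm (real iris systems use specialised feature extraction and error-tolerant matching, since two scans of the same eye never match bit-for-bit); it illustrates only the one-way principle behind the company's claim.

```python
import hashlib
import secrets

def derive_code(template: bytes, salt: bytes) -> str:
    """Map a biometric template to a fixed-length one-way code.

    SHA-256 is preimage-resistant: given only the code, recovering the
    original template is computationally infeasible.
    """
    return hashlib.sha256(salt + template).hexdigest()

# Stand-in "template" -- a real system would extract features from an iris image.
template = secrets.token_bytes(256)
salt = secrets.token_bytes(16)

code = derive_code(template, salt)
print(len(code))                            # 64 hex characters, whatever the input size
print(code == derive_code(template, salt))  # deterministic: same inputs, same code
print(code == derive_code(secrets.token_bytes(256), salt))  # different template, different code
```

Note, however, that even an uninvertible code remains a stable, unique identifier of a person, which is why such codes still qualify as biometric data under most privacy laws.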
The vision is compelling: imagine logging into websites, social platforms, and financial services with simple proof that you're human – no more endless CAPTCHAs or vulnerable passwords. World ID advocates highlight that it allows users to anonymously and securely verify they are real and unique humans (not bots) for various online activities like signing into social apps, participating in online voting, or purchasing concert tickets. This authentication could be particularly valuable in real-world scenarios where bots are used to buy multiple tickets or limited hardware releases, creating unfair advantages and market distortions.
An estimated one billion people around the globe do not possess government IDs, and World offers a means for digital inclusion and financial access. Furthermore, World ID Credentials provide access to the World Network for about 1.2 billion passport holders worldwide who may be far from an Orb verification device, greatly broadening the system's potential reach beyond conventional biometric verification.
The World Project didn’t begin purely as an identity solution. Its early rollout in 2021 featured a cryptocurrency component – Worldcoin (WLD) – offered as an incentive for people willing to scan their irises. This approach raised immediate red flags: Was this a legitimate identity solution or merely a scheme to boost cryptocurrency adoption?
As regulatory scrutiny intensified, the project evolved. Today, World positions itself primarily as an identity platform, with the cryptocurrency aspect becoming secondary. The company has announced partnerships with Match Group (parent company of Tinder and other dating platforms) to verify user identities and combat catfishing. They’ve also launched the World Card, a crypto-linked debit card developed with Visa, allowing users to spend WLD tokens at any Visa-accepting merchant.
A significant recent development is World ID Credentials, which allows individuals to connect valid forms of ID (starting with NFC-enabled passports) to their World ID without sharing any information with Tools for Humanity, the World Foundation, or any other third party. According to World, all personal information is securely stored only on the user’s device. This approach aims to provide the security benefits of verified identity while maintaining user privacy through technical safeguards. The system is currently being piloted in select countries including Chile, Colombia, Malaysia and South Korea.
This pivot represents a strategic shift – from cryptocurrency-first to identity-first – likely designed to navigate the regulatory challenges that have emerged across multiple jurisdictions.
World’s ambitious global rollout has met with varying degrees of acceptance and resistance. After beginning data collection in several international markets in 2023, World announced its official launch in the United States this month, deploying 7,500 Orbs across six major cities: Atlanta, Austin, Los Angeles, Miami, Nashville and San Francisco.
This U.S. launch is particularly significant, coming after regulatory setbacks across multiple jurisdictions with arguably stronger privacy protections than the U.S. The timing suggests a calculated strategy: World appears to be taking advantage of the fragmented U.S. privacy landscape, which lacks comprehensive federal data protection legislation of the kind found in the EU GDPR and other national privacy laws.
Far from avoiding markets with established data protection frameworks, World attempted early operations in several countries, only to face swift regulatory pushback. Spain ordered the company to cease operations and delete collected biometric data. Germany, Portugal, and France launched legal inquiries into World’s data collection practices. The UK’s Information Commissioner’s Office opened investigations over concerns about consent and data processing legitimacy. Beyond Europe, Argentina’s data protection authority also initiated probes into the company’s activities.
This pattern of regulatory resistance in countries with robust data protection frameworks suggests that World’s U.S. expansion may be exploiting a regulatory gap rather than embracing stringent oversight. The current U.S. administration’s generally favourable stance toward cryptocurrency innovation, combined with the absence of comprehensive federal privacy legislation, likely creates a more permissive environment for World’s biometric data collection practices.
In May 2025, echoing the backlash from European nations and others, World encountered a major legal hurdle when Kenya’s High Court determined that its collection and processing of biometric data breached the nation’s Data Protection Act (2019). The court’s conclusions were damning:
World failed to conduct a proper Data Protection Impact Assessment (DPIA) before collecting sensitive biometric data
The consent obtained from Kenyans was neither fully informed nor truly voluntary
The practice of offering cryptocurrency in exchange for biometric data raised serious ethical concerns
The cross-border transfer of this data lacked adequate safeguards
The Court ordered World to cease all data collection activities in Kenya and permanently delete all biometric data collected from Kenyans, with the process to be supervised by the Office of the Data Protection Commissioner (ODPC).
For affected Kenyan citizens, the implications are profound and troubling. First is the uncertainty surrounding their biometric data – once iris scans are uploaded and processed, can deletion truly be verified? If data was already transferred outside Kenya or uploaded to decentralised systems before the court order, it might persist beyond the company’s control. This uncertainty is particularly alarming because biometric data, unlike passwords, is immutable – it cannot be changed if compromised.
Second, the deletion likely invalidates their World IDs entirely, as these depend on the iris codes generated from the Orb scans. Any services or platforms requiring World ID verification may now be inaccessible to these users, effectively nullifying their digital identities established through this system.
Curiously, while the court focused on data protection violations, it did not address the cryptocurrency (WLD tokens) distributed as participation incentives. Those who received tokens appear to retain them, even though the method of acquisition was deemed unlawful. Most troubling, perhaps, is that the ruling provides no compensation or remedies for affected individuals beyond data deletion – no guidance on alternative identity verification methods, no recourse for lost service access, and no compensation for privacy violations.
The Kenya case represents a landmark moment for data protection in the Global South. It demonstrates that emerging economies can effectively enforce privacy rights against powerful tech companies, even without the resources of European data protection authorities. The ruling can become a reference point for other African nations developing their own approaches to biometric data governance.
Perhaps most importantly, the Kenya ruling exposes a fundamental tension: while World promotes itself as a solution for the unbanked and those lacking traditional IDs, these same vulnerable populations may have limited understanding of the complex data implications or recourse in case things go wrong. The court recognised this power imbalance, highlighting how financial incentives can undermine true consent, especially in economically disadvantaged communities.
World’s privacy claims warrant careful scrutiny. Though the company insists it stores only encrypted IrisCodes and not raw images, conflicting reports have emerged. Hong Kong authorities, for instance, banned World after discovering the company was retaining iris images for up to ten years – directly contradicting their public statements.
Similarly troubling were allegations from Spanish authorities that the company had scanned children’s eyes, raising questions about age verification and consent procedures. These incidents highlight a critical concern: the gap between World’s privacy promises and its actual practices.
Even if we take World at its word that only encrypted codes are permanently stored, these codes themselves constitute sensitive personal data under most privacy laws. Unlike passwords, biometric data cannot be changed if compromised. Once your iris pattern is leaked or misused, that vulnerability remains perpetual.
World’s introduction of World ID Credentials, which allows connecting official ID documents without sharing personal information with third parties, represents a potential improvement in their approach to privacy. According to World, all credential information is securely stored only on the user’s device, addressing some of the centralised data collection concerns. The system allows users to selectively prove aspects of their identity (such as age or nationality) while remaining anonymous, using the World ID protocol’s anonymising infrastructure.
However, questions remain about the technical implementation and whether this system adequately addresses the fundamental concerns about biometric data collection and storage for those who choose the iris-scanning verification method. While World ID Credentials may provide a more privacy-preserving alternative to iris-scanning, the company’s history of regulatory challenges raises questions about how consistently these privacy protections will be implemented and maintained.
The blockchain storage approach presents additional complications. While blockchain can provide transparency and resistance to tampering, it also creates challenges for data protection principles like the “right to be forgotten.” How can data truly be deleted from an immutable blockchain if required by court order or user request?
As World continues its global expansion, we must confront the fundamental question: does the benefit of proving our humanity online justify the permanent surrender of our biometric data?
The case for World ID is strongest in contexts where verification really matters. World's proponents point to several compelling use cases:
Combating sophisticated financial fraud and preventing identity theft
Ensuring fair access to limited resources like event tickets and product releases
Preventing bot manipulation of online voting and polls
Providing secure, anonymous verification for social platform access without revealing personal details
In a future where deepfakes become indistinguishable from reality, having a reliable way to verify human identity could become essential infrastructure. The ability to anonymously confirm uniqueness without sharing personal information has genuine value in our increasingly digital world.
Yet the risks cannot be dismissed. Beyond privacy concerns, World’s system could eventually create a new form of digital exclusion – where proving humanity requires submission to biometric scanning. Those unwilling or unable to provide their biometric data might find themselves increasingly locked out of digital spaces.
The regulatory response across multiple jurisdictions suggests a growing consensus: current biometric-based digital identity verification implementations may be fundamentally incompatible with privacy rights. The seemingly coordinated actions of data protection authorities reflect deep concerns about the proportionality of collecting permanent biometric identifiers for digital verification purposes.
The immutable nature of biometric data amplifies these concerns. The Kenya case illustrates a troubling reality: verification remains difficult even when courts order data deletion. Once biometric data enters digital systems, particularly those leveraging blockchain technology, true deletion may be technically impossible. This creates a permanent risk that cannot be mitigated after the fact.
The jurisdictional challenges are particularly vexing. With users and data spread across countries with vastly different legal standards, which laws apply? World’s strategic entry into the U.S. market appears to exploit a regulatory gap rather than embrace robust oversight. The absence of comprehensive federal privacy legislation in the U.S., combined with a generally favourable stance toward cryptocurrency innovation, likely creates a more permissive environment for biometric data collection that would be quickly challenged in jurisdictions with stronger privacy protections.
World’s attempts to navigate European markets have largely failed the GDPR compliance test. The comprehensive investigations launched across multiple EU member states demonstrate that Europe’s strict requirements around consent, data minimisation, purpose limitation, and cross-border transfers may be fundamentally incompatible with World’s current business model. These regulatory actions suggest that rather than adapting to meet stringent privacy standards, World appears to be seeking jurisdictions with less developed or less rigorously enforced data protection regimes.
Rather than an all-or-nothing approach to biometric identity, we should perhaps seek a middle ground. Technologies that verify humanity without permanent storage of biometric data exist – from liveness detection to zero-knowledge proofs that can verify attributes without revealing underlying data.
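The "verify attributes without revealing underlying data" idea can be sketched with a toy selective-disclosure scheme: the holder commits to each credential attribute with its own random salt, an issuer would sign only the combined root, and the holder later opens a single attribute for a verifier to check. This is a simplified commit-and-reveal illustration, not a true zero-knowledge proof (real systems, including the World ID protocol, rely on far heavier cryptography), but it captures the principle that a verifier can confirm one fact, such as nationality, without seeing the rest of the credential. All names and attribute values below are hypothetical.

```python
import hashlib
import secrets

def commit(value: str, salt: bytes) -> bytes:
    """Salted (hiding) commitment to a single attribute."""
    return hashlib.sha256(salt + value.encode()).digest()

def credential_root(commitments: list[bytes]) -> bytes:
    """Bind all attribute commitments into one value an issuer could sign."""
    return hashlib.sha256(b"".join(commitments)).digest()

# Holder's attributes, each committed with its own salt.
attributes = ["name=Jane Doe", "nationality=KE", "over_18=true"]
salts = [secrets.token_bytes(16) for _ in attributes]
commitments = [commit(a, s) for a, s in zip(attributes, salts)]
root = credential_root(commitments)  # the only value shared up front

# Selective disclosure: reveal ONLY the nationality attribute and its salt.
idx = 1
disclosed_value, disclosed_salt = attributes[idx], salts[idx]

# The verifier recomputes the disclosed commitment and checks it against the
# root; the undisclosed attributes stay hidden behind their salted hashes.
check = commitments.copy()
check[idx] = commit(disclosed_value, disclosed_salt)
assert credential_root(check) == root
print("verified:", disclosed_value)  # verified: nationality=KE
```

The design point is that none of this requires a central database of attributes, let alone biometric templates: the verifier holds only hashes, and the raw values stay on the holder's device.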
World ID Credentials, which connects official ID documents without sharing information with third parties or central databases, represents a step in this direction. The on-device storage of credential information and the ability to selectively verify attributes (like age or nationality) while remaining anonymous aligns with privacy-preserving identity verification principles. However, the implementation details and privacy implications require thorough independent evaluation, particularly given the company’s history of regulatory conflicts.
Regulatory frameworks are already demonstrating their power to shape this technology’s future. The global nature of the investigations and the landmark Kenya ruling show that existing data protection laws can be effectively applied to novel biometric systems, though more specialised regulations may be needed as these technologies proliferate.
The United States’ approach – or lack thereof – stands in stark contrast. Without comprehensive federal privacy legislation, Americans are left with a patchwork of state laws and sector-specific regulations that may prove insufficient against the unprecedented challenges posed by biometric identity systems. In this vacuum, companies like World can experiment with sensitive data collection practices that would be quickly challenged in other jurisdictions.
As for World itself, greater transparency would go a long way. Independent technical audits of their systems, clearer explanations of data flows, and more robust consent mechanisms could help build the trust necessary for widespread adoption. Detailed information about how World ID Credentials protects user data while connecting to official ID documents would help address some of the privacy concerns that have been raised.
The era of digital identity is here, whether we embrace it or not. The question is not if we’ll need ways to prove our humanity online, but how we’ll balance that need with our fundamental right to privacy. World ID represents one possible future – a future we should approach with both open minds and healthy scepticism.
At this point, I am not convinced that the trade-off proposed by World – trading our invaluable biometric data for digital convenience – is a fair bargain. Although the capability to anonymously confirm human individuality offers significant advantages in fighting bots and guaranteeing equitable resource access, whether digital or otherwise, gaining entry into the online realm should not require the permanent forfeiture of our biological distinctiveness.
World ID offers a compelling vision for verifying humanity online through iris scans or passport credentials, potentially solving digital identity challenges and bot manipulation while allowing anonymous verification for activities like voting and ticket purchasing.
While World ID Credentials now stores information exclusively on users' devices and allows selective attribute verification without third-party sharing, regulatory actions across multiple countries highlight ongoing privacy concerns, especially regarding biometric data collection through the Orb.
From Kenya to the EU, data protection authorities have challenged World's practices, raising questions about consent, data sovereignty, and the fundamental compatibility of biometric identity systems with privacy rights.
As digital identity verification becomes increasingly necessary, we must carefully weigh the benefits of secure, anonymous humanity verification against the permanent surrender of immutable biometric data, seeking solutions that provide verification without compromising privacy.
© 2025 Okara & Onuko Company Advocates. All rights reserved. The information on this website is for general information purposes only and should not be construed as legal advice. No action based on this content should be taken or omitted without seeking professional legal counsel.