The Spanish Data Protection Agency (AEPD) made an unprecedented decision on Wednesday: for the next three months, the Worldcoin Orb will not be allowed to operate in the country. Since July 2023, these devices have scanned the irises of around 400,000 people in Spain to authenticate their accounts and reward them with a batch of cryptocurrency tokens (with a cash value of around $80).
The data already collected by Worldcoin, a company co-founded by Sam Altman, the CEO of OpenAI, the firm behind ChatGPT, is also blocked: it cannot be processed or shared until an international investigation determines whether it is legal for private companies to collect this type of data.
This is the first time the AEPD has taken such a precautionary measure in Spain. Underscoring the exceptional nature of the move, the agency’s director, Mar España, said: “We have acted urgently because the situation demanded it. Our decision is justified by the need to avoid potentially irreparable harm. Not acting would have deprived people of the protection they are entitled to.”
Why the sudden freeze on a collection of high-resolution photographs of users’ irises? “Because there is a sense of alarm in society. Lines were forming at shopping centers [to use the orbs], and the fact that cryptocurrency was involved forced the AEPD to act quickly,” says Borja Adsuara, a consultant and expert in digital law who has voiced his concerns about the Worldcoin Orb. “The debate is not whether people are paid for an image of their iris, but whether that data is treated correctly.”
The value of biometric data
There are many different types of personal data. Names, addresses, and telephone numbers are the ones most commonly used in everyday procedures. All of them can identify a specific person, but they share another characteristic: the person they belong to can change them.
Other personal data, however, stays with us for life. This is so-called biometric data, which refers to the characteristics unique to each person, whether physiological, physical, or behavioral. This type of information can be encoded, and it barely changes over time. We all have the same DNA from the moment we are born until we die. The same goes for fingerprints (unless they are burned off). And although a face evolves over the years (we gain weight, age, lose hair), algorithms can establish patterns that remain stable, for example by measuring the distance between the eyes, or from the nose and mouth to the eyes. This makes it possible to recognize people with a high success rate over time, as the sketch below illustrates.
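As a rough illustration of the idea (a minimal sketch with invented landmark coordinates, not any production face-recognition algorithm), ratios between such distances form a signature that does not depend on how far the camera was from the face:

```python
from math import dist

# Hypothetical 2D landmark positions (in pixels) extracted from a photo.
landmarks = {
    "left_eye":  (120.0, 95.0),
    "right_eye": (180.0, 96.0),
    "nose_tip":  (150.0, 140.0),
    "mouth":     (151.0, 175.0),
}

def face_signature(pts: dict[str, tuple[float, float]]) -> tuple[float, ...]:
    """Ratios of inter-landmark distances. Dividing by the distance
    between the eyes makes the signature scale-invariant, so photos
    taken at different distances yield comparable values."""
    eye_span = dist(pts["left_eye"], pts["right_eye"])
    return (
        dist(pts["nose_tip"], pts["left_eye"]) / eye_span,
        dist(pts["nose_tip"], pts["right_eye"]) / eye_span,
        dist(pts["mouth"], pts["nose_tip"]) / eye_span,
    )

print(face_signature(landmarks))  # e.g. (0.90, 0.89, 0.58)
```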
According to David Arroyo, principal researcher in the Cybersecurity and Privacy Protection group at Spain’s National Research Council (CSIC), among the different types of biometric data, the iris is the most accurate for identifying an individual. He warns: “If an image of your iris is stolen, and with it the alphanumeric template in which its biometric characteristics are stored, your identity could be impersonated in all kinds of situations. Iris scanning is much more accurate than facial recognition. It is used less because the required sensors are more expensive and deploying these systems is more complex.”
In addition to its value as a personal identifier, iris analysis yields a wealth of other physiological and behavioral information. “Through a person’s gaze and the way their pupils dilate, you can tell what they like, what they fear, what interests them. You can even detect certain cognitive characteristics, such as whether they have Parkinson’s disease,” explains Carissa Véliz of the University of Oxford, author of Privacy Is Power (2020).
Iris scanning is typically limited to high-security environments, as an additional means of identification for gaining access to certain facilities. “It allows for very robust authentication, but it also carries a lot of privacy problems, because the iris is directly and unambiguously associated with a specific individual,” Arroyo points out.
Special treatment
The uniqueness of biometric data makes its legal treatment more stringent than that of other forms of data. “European law considers it a special category of data. Biometric data can only be processed if a Spanish law explicitly allows it in specific cases, or with the person’s consent,” says Ricard Martínez, director of Privacy and Digital Transformation at the University of Valencia. “Spanish regulations provide that health and biometric data can be processed on the basis of consent. But that does not mean that anything goes: consent cannot be used to engage in unlawful activities or to violate fundamental rights. It’s more complicated than it seems.”
Using this data appropriately is critical. In 2021, the AEPD fined the Spanish supermarket chain Mercadona €3.5 million ($3.83 million) for using cameras equipped with facial recognition systems in 48 of its stores. The company claimed it had introduced the technology to detect people who were banned from entering its facilities. The agency determined that the goal of identifying people with criminal convictions did not justify collecting the facial patterns of every customer who walked into the chain’s supermarkets.
Returning to the Worldcoin case: the Orb scans the iris and converts the image into an alphanumeric code. That template is what identifies the user. “The issue is not so much that Worldcoin has collected this data from 400,000 people, but that it runs other algorithms on all those databases and images without revealing exactly what for,” says Jorge García Herrero, a lawyer who specializes in data protection regulation.
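To make the idea concrete, here is a minimal sketch of how an image-derived byte string can be reduced to an alphanumeric code (an illustrative assumption, not Worldcoin’s actual, unpublished pipeline; the feature-extraction step and the `iris_template` helper are hypothetical):

```python
import hashlib

def iris_template(iris_features: bytes) -> str:
    """Reduce extracted iris features to an alphanumeric code.

    Hashing is one-way: a system can recognize a returning user by
    comparing codes without storing the raw photograph. The code
    itself still singles out one person, which is why regulators
    treat such templates as biometric data.
    """
    return hashlib.sha256(iris_features).hexdigest()

# Hypothetical stand-in for the bytes a real iris-encoding step
# would extract from a high-resolution photograph of the eye.
features = b"example-iris-feature-vector"
print(iris_template(features))  # a 64-character hexadecimal template
```

García Herrero’s point is that the concern lies upstream of this step: what other algorithms are run on the stored databases and images before they are reduced to a code.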
The great danger of biometric data is that it can be used for illicit purposes. In China, for example, facial recognition systems are used to monitor and persecute the Uyghurs. There are also reports that when the Taliban regained control of Afghanistan in 2021, they turned to biometric technologies such as iris scanning to locate and crack down on those who had collaborated with the previous regime. Biometrics are an unparalleled tool for anyone seeking to repress a group, and of course biometric data can also be used to impersonate people.
What if I don’t care about privacy?
“I’m an ordinary citizen. Google already has all my data; I don’t think my eyes add much,” said one young man with a shrug. Two weeks ago, when he was interviewed by EL PAÍS, he was preparing to have his iris scanned at a shopping mall in Madrid.
This is a recurring debate. But Carissa Véliz, of the University of Oxford, believes that reasoning is a mistake. “We tend to think that because data is personal, it affects only us. In reality, as we saw in the Cambridge Analytica case, when you share personal data you put others at risk,” she explains, referring to the consulting firm’s scandal during the 2016 U.S. presidential election. The company accessed the personal information of 50 million Facebook users to build profiles of American voters and target them with personalized election ads.
“You may not care about your privacy, but privacy is not just a right; I think of it as a duty, because neglecting it can put your whole environment at risk,” says David Arroyo of the CSIC. “This type of data is used to build profiles of other people, and from there more sophisticated attacks are launched, such as phishing and disinformation,” he emphasizes. Even if the right to erasure is later exercised and the collected biometric data is eventually deleted, it will already have been used to train the system, that is, to make it more efficient.
What worries experts about the Worldcoin controversy is that it is helping to normalize a double-edged technology: iris scanning. “Once it establishes itself as a legitimate form of authentication, everyone will end up using it,” Véliz laments. “It bothers me a great deal that using facial recognition to unlock our phones has become the norm, and that people have come to perceive that technology as natural. I hope the same doesn’t happen with iris scanning.”