To get health care in Spain’s primary care clinics and hospitals, patients must give their names and surnames, show a health card if they have one, or another identity document if they don’t, and say what ails them. For residents of Ceuta and Melilla, however, there will soon be a new requirement: sitting for a photo that the health service will convert into a unique cipher of their facial features. In 2021, the Instituto de Gestión Sanitaria (INGESA), the part of the Ministry of Health that manages healthcare in the two autonomous cities of Ceuta and Melilla, awarded a more than 700,000 euro contract to a temporary joint venture (UTE) of the firms Dedalus and Facephi. The objective? To create a unified clinical history system and, incidentally, to install an artificial intelligence (AI) system for facial recognition of patients.

According to an official document to which Civio has exclusive access, the system has been in operation in primary care clinics since November 2024, although neither INGESA nor any of the companies in the joint venture confirmed this. The cameras are installed in at least two primary care clinics in the city of Melilla, while in hospitals in both Ceuta and Melilla the system is still in the testing phase. Dr Enrique Roviralta, president of the Ceuta Medical Association and the Ceuta Doctors’ Union, says that no one has informed the association or union about the system and that members are “completely unaware of its operation and the timetable for its inauguration.”

The reason INGESA is implementing this system is unknown and shrouded in generalities. The justification for the public contract states that the aim is “improving the quality and safety of care by improving data quality and the interoperability of information systems.” Regarding the facial recognition component, the justification explains that it is “to improve and facilitate identification without sacrificing the usability of the system.” This is despite the fact that health cards in Ceuta and Melilla are among the few in Spain that already incorporate photos, like Spain’s National Identity Card (DNI in Spanish).

Some 2022 public statements by INGESA and its then-director, Belén Hernando, hint at why they consider this system necessary: “It will avoid duplications, impersonations, treatment errors and other problems derived from the basic identification of the patient,” Hernando said. However, INGESA declared on another occasion a different purpose that had little to do with patient safety and more to do with internal issues. The system “will serve to bill people who are not entitled to publicly funded healthcare,” reported the newspaper El Faro de Ceuta in 2021. Yet advocacy officer Pablo Iglesias of Médicos del Mundo in Madrid says: “We are not aware of any situation of identity theft or anything like that.” Roviralta concurs: “We are not aware of any incidents in identifying patients. They are currently identified with their documentation.” Neither INGESA nor either of the companies in the joint venture responded to Civio about the specific purpose of the AI system.

In 2021, Spain’s Court of Auditors warned that INGESA was inappropriately providing certain services free of charge. These were mainly provided to people without a formal right to free healthcare in Ceuta or Melilla in one of three situations: emergencies, pregnancies and childbirth, and care for minors. Under the law at the time, the public system should have billed those users afterward, although, according to the Court of Auditors, this rarely happened in 2016.

In 2018, a new law extended basic health coverage to all people regardless of their administrative situation. However, a report by the platform ‘Yo sí sanidad universal’ (Yes to universal healthcare) points out that the Melilla health service is still billing people without health cards for emergency services. Iglesias, of Médicos del Mundo, warns that “billing for emergency care is now quite widespread in Spain.”

In fact, the Institute’s documents, to which Civio obtained access, list people without the right to a national health service card as among those potentially affected by the new artificial intelligence system. The total, around 170,000 people, includes people registered with Spain’s National Health Service, mutual aid members with access to the health service and “non-affiliated foreigners, mainly Moroccans.”

From a photo to a biometric pattern of your face

A biometric pattern of each patient’s face will be added to the personal and health data in their medical record, such as their address, ID number and illnesses. Administrative or auxiliary staff will take a photo, which the AI system will convert in seconds into a pattern of the unique features of each person and store with the rest of the personal data. The system does not perform video surveillance, nor will it be inside consultation rooms, but it will be at administrative and health card desks and at the first points of contact with the health service: primary care clinics and hospitals. The contract calls for the installation of 48 of these devices in Ceuta, in the areas of Admission, Emergencies, Day Hospital, Radiology and Pharmacology in the hospital and primary care clinics, and another 64 in Melilla.

Thus, with a simple photo, the patient’s biometric and health data will be linked and registered in the health system’s database. The system will, for example, identify a specific patient from a simple photographic capture or detect duplicate patient records. “Since facial features tend to be immutable, they become a distinctive identifying feature,” law professor Vera Lúcia Raposo of Nova University in Lisbon, Portugal, writes in a book on AI in the health sector. The INGESA contract requires the facial recognition system to be able to store several templates of the same face, taken at different times (such as with or without a beard, or with or without a mask). In addition, the system must allow for the detection of facial attributes, such as age, gender or expressions.
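The matching flow described above can be sketched in miniature. The following is an illustrative example only, not Facephi’s actual implementation: the three-number “templates” stand in for the high-dimensional embedding vectors a real face model would produce from a photo, and the patient IDs and similarity threshold are invented.

```python
import math

def cosine_similarity(a, b):
    """Compare two biometric templates (embedding vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_patient(probe, enrolled, threshold=0.8):
    """Return the patient ID whose stored template best matches the probe
    photo's template, or None if no similarity exceeds the threshold."""
    best_id, best_score = None, threshold
    for patient_id, template in enrolled.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = patient_id, score
    return best_id

# Toy templates standing in for embeddings computed from enrolment photos.
enrolled = {
    "patient-001": [0.9, 0.1, 0.3],
    "patient-002": [0.1, 0.8, 0.5],
}

probe = [0.88, 0.12, 0.31]  # template from a new photo of patient-001
print(match_patient(probe, enrolled))  # → patient-001
```

The same comparison run across all stored templates is what makes duplicate-record detection possible: two records whose templates match above the threshold likely belong to one person.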

Is there no alternative to using my facial features to identify me?

Although INGESA argues that there are no alternatives to improve the quality and safety of care, facial recognition systems have been widely criticised for possible breaches of privacy, security problems and the discriminatory biases they may entail. “Perhaps it could provide greater agility in the identification of patients, or mean that the patient does not have to carry any documentation,” Roviralta says, “but I am much more concerned about the risks of data leaks and scrupulous compliance with the Organic Law on Data Protection.”

Health data and biometric data are each separately considered specially protected data. “When using special category protection data we need more justification for data protection purposes. We need this processing to be necessary, justified and proportionate,” says data protection expert Guillermo Lazcoz.

The report ‘Privacy and biometrics for smart healthcare systems: attacks, and techniques’ explains that “in healthcare, biometric detection systems can be used to improve patient care and streamline medical processes, but they also raise significant privacy concerns.” It adds that if this data “is stolen or misused, it can be difficult or impossible to change, making it a valuable target for hackers.” INGESA’s data protection impact assessment (DPA) for the system, which Civio obtained, states that “the seriousness of the risk to the rights and freedoms of the processing and its intrusion into the privacy of patients is appropriate to the objective pursued and proportionate to the urgency and seriousness of the processing.” The problem is that nowhere does the DPA define that urgency or seriousness. The assessment, which analyses in detail the current legislation on which the system is based, the potential risks and the measures to mitigate them, legitimises the use of the AI system and concludes that it does not need to undergo independent review by the Spanish Data Protection Agency (AEPD).

Yet the assessment does not meet established standards for a proper DPA: “In general terms, the analysis is insufficient and incoherent from the point of view of data protection, which is worrying considering the special sensitivity of the data and information involved and the healthcare context in which the initiative is intended to be developed,” says independent data protection law expert Mikel Recuero.

The document raises a variety of problems and leaves several questions unanswered. For example, INGESA argues that there is no other procedure or alternative system that is less harmful to the protection of personal data and fulfils the same function. Yet the analysis ignores the existing situation and the way patients are already identified. For those without a health card, any other identification document works. “You are identified with your documentation, such as your health card, national ID card or passport,” Roviralta explains. “I think this is sufficient, and this is the way it is done in the rest of the autonomous administrations.” The INGESA analysis should answer questions about alternative technologies, Lazcoz says: “What alternatives are there? Why do I choose this alternative instead of others? And then it should go into whether it is justified, necessary and proportional.”

Nor is there any trace in the document of how the patient’s consent will be obtained, an issue that, according to Lazcoz, is intrinsically linked to another question the document leaves unanswered: “One of the things that strikes me is that it does not mention whether or not this type of processing contributes to an automated decision. If I have an automated system that is saying this person is this person, or this person is not this person, and this has the effect of providing them with health care or not providing health care, then it seems to me to be an automated system, and the regulations for that are a little more restrictive.” Data protection regulations, for example, require a person’s explicit consent to use their personal data in an automated decision.

Biases, gaps and people left out

The risks associated with AI facial recognition systems in general have been widely documented. “The increased collection of bodily data poses significant risks to individuals and society as a whole, including cybersecurity breaches, data misuse, consent violations, discrimination against vulnerable populations, biometric targeting and pervasive surveillance,” explains Júlia Keserű in the report ‘From Skin to Screen: Bodily Integrity in the Digital Age.’

One of these risks, which in the context of Ceuta and Melilla may be of great relevance, is the biases that these systems may entail. As the AEPD explains in one of its guides: “Some people cannot use certain types of biometrics because their physical characteristics are not recognised by the system. In cases of injuries, accidents, health problems (such as paralysis) and others, the incompatibility may be temporary. Permanent biometric incompatibility, however, could cause social exclusion.”

Cases of racial bias are also documented, as Keserű explains in the report: “Research has also consistently shown that contemporary biometric systems exhibit significant biases and can lead to large-scale discrimination.” One scientific investigation of three facial recognition products, for example, revealed that they misidentified darker-skinned people by as much as 34%, compared to less than 1% for lighter-skinned people. Facephi, the company providing the INGESA cameras, notes in its blog that “the data that feeds our algorithms, related to biometric recognition, is designed to ensure a fair distribution with sufficient variety, quantity and quality across groups. Race, gender, age, religion, as well as other technical characteristics such as the type of capture device, or the position of the object, will not affect the result.” However, there is no public quantitative data to support this claim.
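Disparity figures like those cited here come from disaggregating a system’s error rates by demographic group during an audit. A minimal sketch of that calculation, using invented evaluation results rather than any real benchmark data:

```python
from collections import defaultdict

def error_rates_by_group(results):
    """results: list of (group, correct) pairs from an evaluation run.
    Returns the misidentification rate per demographic group."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical audit results: 100 trials per group, not real benchmark data.
results = (
    [("lighter-skinned", True)] * 99 + [("lighter-skinned", False)] * 1 +
    [("darker-skinned", True)] * 66 + [("darker-skinned", False)] * 34
)

rates = error_rates_by_group(results)
print(rates)  # → {'lighter-skinned': 0.01, 'darker-skinned': 0.34}
```

An aggregate accuracy figure for the same data would be about 82.5%, which is why audits that do not break results down by group can hide exactly this kind of disparity.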

What if this data ends up in the hands of persons other than the data controllers? The AEPD warns about this in one of its guidelines: “Unauthorised access to our biometric data in one system would allow or facilitate (in the case of using multiple authentication factors) access in all other systems using such biometric data. It could have the same effect as using the same password in many different systems… And, unlike password-based systems, once biometric information has been compromised, it cannot be cancelled.” According to the European Union Agency for Cybersecurity (ENISA), between 2021 and the first four months of 2023, there were 99 incidents involving data breaches in the healthcare sector in the European Union. Some 52 of these involved the leak of patient data or medical records.

“The government and INGESA will have to give explanations as to what is intended or what is being pursued with the installation of this type of camera, but the potential use that can be made of them is very perverse,” Iglesias says. Keserű’s position is along the same lines: “I am really afraid of what will happen and how much of our freedom will be curtailed if we accept that these systems are normal and can become mainstream.” She adds: “This is an important human rights issue.” The underlying and most worrying problem may be the one that Iglesias points out: “These cameras can have a deterrent effect on people coming forward to seek the health care they need.”

Methodology

Civio became aware of this AI system through an INGESA response to an information request. We then searched for the INGESA contract for this system and found it published on the Public Sector Contracts Platform. In order to obtain more information, Civio made a second information request asking for the data protection impact assessment (DPA) document, a request that INGESA first denied. Civio filed a complaint with the Council for Transparency and Good Governance (CTBG), which required INGESA to provide the document.

Facephi, in charge of the biometric component of the system, declined to answer our questions. Dedalus, the other company in the UTE, initially agreed to answer Civio’s questions sent on 12 June but had not sent answers by the time the original Spanish-language article went to press on 17 July. INGESA received Civio’s questions about this system on 3 June and, although it replied saying it would send answers, more than a month and a half later it had not done so. We designed this visualisation with Illustrator and transformed it into Svelte using ai2svelte. We adapted some of the icons from ivisual, by Noun Project, under a CC BY 3.0 licence.