
Watching spaces, surveilling faces…in Jamaica? (Part 1)

Updated: Nov 30, 2022


*Image credit: EDRi. Copyright © 2022 EDRi.


In July of 2020, the Jamaica Gleaner reported, to little fanfare, that high-tech CCTV surveillance cameras “equipped with facial recognition software” were being utilized by business operators and police officers in Montego Bay. According to the Gleaner, those surveillance cameras had even facilitated “a number of arrests”.


Of late, the increasingly widespread deployment of facial recognition software and other facial recognition technologies (FRTs) has become the subject of impassioned debate worldwide. The term “FRTs” is an umbrella designation that refers to “a set of digital tools used to perform tasks on images or videos of human faces…” (Buolamwini, Ordóñez, Morgenstern and Learned-Miller 2020, 2). FRTs utilize machine learning algorithms trained on “data sets” of facial images to identify individuals. The process by which they make identifications starts with an analysis of the individual’s “facial geometry”, which comprises unique biometric identifiers. A “faceprint” is then generated and systematically matched against other faceprints contained in a database of facial images until an identification is produced (Buolamwini, Ordóñez, Morgenstern and Learned-Miller 2020, 8-12).
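
To make the identification pipeline just described a little more concrete, the short Python sketch below illustrates the matching step in grossly simplified form: every enrolled face is reduced to a numeric “faceprint” (an embedding vector), and a probe faceprint is compared against each entry in a database until a sufficiently similar match, if any, is found. The compute_faceprint placeholder, the cosine-similarity measure, and the 0.6 threshold are illustrative assumptions only and do not describe any particular FRT product.

# Illustrative sketch of the faceprint-matching step described above.
# compute_faceprint is a hypothetical placeholder: a real FRT would run a
# trained face-embedding model (plus face detection and alignment) here.
import numpy as np

def compute_faceprint(image: np.ndarray) -> np.ndarray:
    """Placeholder for a trained model that turns a face image into a faceprint."""
    raise NotImplementedError("substitute a trained face-embedding model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Measures how closely two faceprints align (1.0 = identical direction).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, database: dict[str, np.ndarray],
             threshold: float = 0.6) -> str | None:
    """Return the best-matching enrolled identity, or None if no faceprint
    in the database clears the similarity threshold."""
    best_id, best_score = None, threshold
    for identity, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

if __name__ == "__main__":
    # Demo with random vectors standing in for faceprints.
    rng = np.random.default_rng(0)
    database = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
    probe = database["person_a"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
    print(identify(probe, database))  # expected: person_a

In a deployed system, the reliability of this matching step depends heavily on the quality and diversity of the data the underlying model was trained on, which is precisely where the “algorithmic biases” discussed below originate.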


FRTs are commonly used as verification tools to facilitate access to electronic devices like smartphones. FRTs are also used to identify criminals or suspected criminals; assist with tracking missing children and victims of human trafficking; support disaster response and recovery efforts; streamline the provision of healthcare services, including patient screening; and control entry to airports.


General concerns around the unregulated deployment of FRTs


On the flipside, FRTs can potentially be misused as mass surveillance tools. Additionally, due to the sensitive nature of the personal data they collect, store, and process (i.e., biometric facial data), FRTs can be highly invasive. Finally, digital and privacy activists have also expressed concern about their largely unregulated deployment by public and private actors, especially law enforcement, as well as their propensity for misidentifying black/brown people and women at comparatively higher rates than white men, owing to inherent “algorithmic biases”.


This latter concern around inherent “algorithmic biases” was first spotlighted by the pioneering research of Dr. Joy Buolamwini, a Ghanaian-American computer scientist and digital activist. In the main, Dr. Buolamwini’s research exposed the inability of several commercial FRTs to accurately identify darker-skinned female faces when presented with data sets of racially and gender-diverse facial images. Her groundbreaking research was inspired by her experience as an MIT student working on an AI system that could not detect her face until she covered it with a white mask, thereby challenging conventional assumptions about machine neutrality.


Due to the manifold concerns around the unregulated deployment of FRTs, in 2020, tech giants Amazon, IBM, and Microsoft decided to impose moratoriums on the sale of their FRTs to law enforcement, pending legal regulation by Congress. In that same year, the Court of Appeal of England and Wales tackled the issue in the seminal case of Edward Bridges v The Chief Constable of South Wales Police and others [2020] EWCA Civ 1058. In Bridges, the court determined that the use of automated facial recognition (AFR) by law enforcement breached “data protection laws, privacy laws, and equality laws,” including the European Convention on Human Rights (ECHR). Among other things, the court noted that “fundamental deficiencies” in the legal framework supporting the police’s use of facial recognition technology caused breaches of certain fundamental rights. It also noted that ‘[t]oo much discretion is currently left to individual police officers…[and]…it is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed’. In consequence of the Bridges decision, law enforcement in England and Wales is restrained from using automated facial recognition until the decision is overturned by a superior court.


In the United States of America, the cities of San Francisco and Boston have banned the use of FRTs by city departments, while the state of California has banned their use in police body cameras. Additionally, the city of Portland, Oregon has prohibited the utilization of FRTs by government agencies, law enforcement, and even private businesses in “places of public accommodation” except: (1) to the extent necessary to comply with federal, state or local laws; (2) where they are required to facilitate access to the user’s personal or employer-issued communication and electronic devices; or (3) where FRTs are utilized in automatic face detection services for social media applications. More recently, the European Parliament adopted a non-binding resolution calling for a ban on the utilization of FRTs by law enforcement in public spaces in the European Union (EU). The resolution also calls for a prohibition on the creation of private facial recognition databases, an apparent attempt to curtail the practice of “image scraping” by private companies following the Clearview AI debacle in 2020.


FRTs and the Data Protection Act of Jamaica

In keeping with global trends prioritizing the protection of personal data, our parliament, on May 19, 2020, passed the long-awaited Data Protection Act (“DPA”), a progeny of the EU’s General Data Protection Regulation (“GDPR”). Under the DPA, individuals (“data subjects”) enjoy various rights in relation to the handling (i.e., collection, processing, storage, and disposal) of their personal data. A fundamental right of data subjects is the right to withdraw consent to the processing of their personal data, barring certain exceptions, such as public security and health. All actors handling personal data, whether directly themselves (i.e., “data controllers”) or indirectly on behalf of another (i.e., “data processors”), are obligated by the DPA to do so in accordance with the data protection standards it establishes.


By providing for the protection of personal data, the DPA gives statutory expression to the right to informational privacy, which the Court in Julian J Robinson v The Attorney General of Jamaica affirmed as an essential dimension of the constitutionally guaranteed right to privacy ([2019] JMFC Full 04, page 131, para. 174). Thus, as leading data privacy expert and Attorney-at-law Mr. Chukwuemeka Cameron has carefully explained, the duty of data controllers to protect the personal data they are processing does not “singularly arise from the [DPA]” since ‘[i]t is the Constitution that places the onus on [data controllers] to ensure that they are processing personal data in a safe, secure, transparent, and accountable way…’


Interestingly, much like its progenitor, the GDPR, the DPA does not frontally address, or even appear to seriously contemplate, the question of FRT deployment, regulation, or consequent legal ramifications. Notwithstanding that omission, since personal data, and more specifically biometric facial data, would necessarily be collected and processed by FRTs, their deployment would indirectly engage the DPA. Under the DPA, biometric facial data would qualify as “sensitive personal data”, and data controllers utilizing FRTs to process it would therefore be subject to the more stringent obligations that attach to the handling of sensitive personal data.


Another consequence of utilizing FRTs to process biometric facial data under the DPA is that explicit consent must be sought and obtained from the data subject whose facial image will be processed, unless there is some legitimate interest (e.g., public security or health) militating in favour of processing without that consent. Under the DPA and similar GDPR-inspired regimes, biometric data is treated in that way because its processing “makes it possible to uniquely identify people”. Data controllers deploying FRTs that process biometric data would also be obligated to implement the technical and organizational measures necessary to effectively safeguard the particularly sensitive data being processed. The duty of data controllers to ensure the existence and effective functioning of that technological infrastructure would arguably be more stringent where biometric data is concerned.


While the deployment of FRTs undoubtedly engages the DPA, the Act, in its current form, makes no provision for their effective regulation. Across several jurisdictions, various stakeholders have embraced the implementation of a dedicated legislative regime that specifically regulates FRTs as a viable path to effective regulation. Part 2 of this article will explore the subject of regulating FRTs, with a particular focus on cross-jurisdictional approaches to regulation.
