
This article provides a brief look at the ways identity can be constrained by biometric technology. It discusses technological limitations, where biometric identification systems may fail to represent a person’s full identity, including bias in recognition and an inability to capture complex and changing human characteristics. It also touches on political dimensions, where legal systems and governments may place limits on how identity is recognized and documented, particularly in the case of gender recognition.
The Pervasiveness of Biometrics
Biometric technology has become embedded in the contemporary fabric of society, encroaching on significant aspects of public and private life. What once was reserved for science fiction, utopian and dystopian alike, is now a societal and cultural norm. Using a fingerprint sensor to unlock a smartphone was one of the first ways the public gained access to this technology in a seemingly innocuous form, and it arguably contributed to a social environment less resistant to the identification practices that have since become ubiquitous in both the public and private sectors. Biometric identification systems are now routinely used in the financial sector for banking, in law enforcement for suspect identification, in businesses for employee time and attendance, and at borders for citizenship and immigration purposes. Although the use cases for biometrics are expanding at a rapid pace, the limitations the technology places on identity give us reason to pause and are cause for concern.
The Chronic Need to Prove Identity and Biometrics’ Promise of Better Verification
In contemporary western culture there is a fixation on proving one’s identity. A birth certificate, for example, is often the first material document that proves one’s existence, classifying the physical body with signifiers such as gender and race. Individuals must use this documentation, or forms of it, in order to participate in the social world. Health, educational, political, and financial matters are just some of the areas that require proof of one’s identity. Biometric technology, thought to be a superior advancement in identification practices, allows an individual to prove their identity without intermediary documentation beyond initial enrollment. Instead, the measurement of the body’s physical attributes serves as proof of identity. Facial recognition, for example, uses measurement and statistical analysis to map facial features, such as the distance between the eyes, the width of the nose, or the shape of the cheekbones, among other nodal points of reference, for individual identification and authentication purposes.

Image of a 3D-rendered human face with a web of data points that capture and map human facial features with precision, showcasing the intersection of technology and identity.
Stock photo ID: 2167538233. (CC0) Photo credit: iStock.com/Imaginima.
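To make the facial-measurement process described above more concrete, consider a minimal sketch of geometric feature comparison. Everything here (the landmark names, coordinates, similarity measure, and threshold) is an invented illustration of the general technique, not any vendor’s actual pipeline:

```python
import math

# Hypothetical facial landmarks as (x, y) image coordinates; real systems
# extract dozens of such "nodal points" automatically from a photograph.
enrolled = {"left_eye": (102, 88), "right_eye": (164, 87),
            "nose_tip": (133, 130), "chin": (134, 190)}
probe = {"left_eye": (101, 90), "right_eye": (163, 88),
         "nose_tip": (132, 131), "chin": (133, 192)}

def feature_vector(landmarks):
    """Distances between landmark pairs, e.g. the distance between the eyes."""
    pairs = [("left_eye", "right_eye"), ("nose_tip", "chin"),
             ("left_eye", "nose_tip"), ("right_eye", "nose_tip")]
    return [math.dist(landmarks[a], landmarks[b]) for a, b in pairs]

def similarity(v1, v2):
    """Inverse of the mean absolute difference; higher means more alike."""
    diff = sum(abs(a - b) for a, b in zip(v1, v2)) / len(v1)
    return 1 / (1 + diff)

score = similarity(feature_vector(enrolled), feature_vector(probe))
THRESHOLD = 0.4  # illustrative; real systems tune this empirically
print(f"score={score:.3f} -> {'match' if score >= THRESHOLD else 'no match'}")
```

The point of the sketch is the shape of the process: the body is reduced to a vector of measurements, and identity becomes a similarity score compared against a cutoff.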
The overarching rationale behind the widespread implementation of biometric systems is that they are espoused as secure, efficient, and objective. They are secure in the sense that physical characteristics cannot be lost, stolen, forged, or otherwise duplicated in the way other forms of identity documentation can be. Claims of efficiency and objectivity rest on the offloading of processes and decision-making to the technology, free of human limitations in speed and judgment. Yet the very claims that are championed in the design, manufacture, and sale of these systems are increasingly being called into question.
A False Sense of Objectivity
Biometric technology relies on the uniqueness, permanence, and immutability of physical characteristics, such as the iris or the fingerprint. This hallmark is easily challenged when one considers that the characteristics of the physical body are prone to change, whether suddenly or over the course of time. For example, manual labor may wear down the ridges of a fingerprint, or facial features may change as a result of the natural aging process. When the body cannot be authenticated against its prior identity because of physical changes, this suggests that biometric technology possesses certain temporal limitations, unable to see the extraneous circumstances that may alter even the most immutable of bodily characteristics. If identity is based on physical characteristics that are supposed to be immutable but which can change, then identity itself must be understood as something not fixed but dynamic.
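A toy continuation of the earlier sketch illustrates this temporal limitation. The feature vectors and threshold below are again invented; the point is that the same person, re-measured after years of bodily change, can drift below the cutoff and be rejected as themselves:

```python
def verify(enrolled, probe, threshold=0.98):
    """Cosine similarity between feature vectors; a match only above the
    threshold. The threshold value here is illustrative, not a real setting."""
    dot = sum(a * b for a, b in zip(enrolled, probe))
    norm = (sum(a * a for a in enrolled) ** 0.5) * \
           (sum(b * b for b in probe) ** 0.5)
    return dot / norm >= threshold

enrolled_2015 = [0.82, 0.40, 0.31, 0.77]     # features at enrollment
same_person_2025 = [0.60, 0.55, 0.45, 0.60]  # the same body, a decade on

print(verify(enrolled_2015, enrolled_2015))     # True: matches its own template
print(verify(enrolled_2015, same_person_2025))  # False: drift exceeds tolerance
```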
Claims of objectivity have also been routinely challenged. For example, Magnet (2012) analyzes biometric technology failures through a feminist science studies approach, demonstrating through case studies how biometric technologies fail to identify marginalized groups along lines of gender, race, ability, and class. Rather than eradicating these inequalities, biometric systems arguably reinforce and reproduce them. Scholars whose work focuses on the intersection of technology, race, and social justice have demonstrated how racialized black bodies have been historically and perpetually inscribed with notions of heightened danger and risk, and made more or less visible when it suits the oppressor’s power-advancing purposes, whether unconsciously or intentionally (Benjamin, 2019; Browne, 2015; Noble, 2018).
Longstanding narratives of the black racialized body as risky and dangerous are what allow for its over-represented inclusion in facial recognition systems that operate within the criminal justice context, while commercial systems built for other purposes not only fail to see color but have also been shown to inadequately register and recognize female gender or variations outside the male/female binary (Buolamwini & Gebru, 2018). Research has shown that facial recognition technology is ineffective at recognizing transgender and non-binary individuals: automated gender recognition recognizes gender only “…if one denies the role that self-knowledge plays in gender, and consequently, denies the existence of trans people” (Keyes, 2018). In this way, biometric systems are challenged when an individual’s self-knowledge contradicts the gender assigned to them at birth, whatever their physical characteristics. Moreover, the notion of permanent, immutable characteristics is further problematized by advancements in science that make it possible to alter gender-based physical characteristics.
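A deliberately simplified sketch of the design constraint Keyes (2018) critiques: when a model’s output space contains only two labels, every face is forced into one of them, and the system has no way to say “neither” (the labels and logit values below are hypothetical):

```python
import math

LABELS = ["male", "female"]  # the model's entire universe of gender

def classify(logits):
    """Softmax over a fixed two-label space: the probabilities always sum
    to 1, so the model cannot express 'neither' or 'something else'."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(LABELS)), key=lambda i: probs[i])
    return LABELS[best], probs[best]

# Even a maximally ambiguous input yields a binary answer.
print(classify([0.05, 0.00]))  # ('male', ~0.51): a forced choice, not knowledge
```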
Given the predominance of biometric technology within law enforcement and public safety, already marginalized communities who must be assessed through such systems are made even more vulnerable by having their identity questioned and their physical body put on display, accessible to an authoritative entity that does not make itself similarly accessible to them. This unreciprocated vision is a key example of an asymmetrical power relation, something Haraway (1988) would call “the God trick”. The term refers to the illusion of an objective, all-knowing, and impartial perspective, where the gaze inscribes the marked body while the unmarked claims the power to see and not be seen, to represent while escaping representation. Haraway (1988) uses the term in the context of science and knowledge production, arguing that all knowledge is “situated”, influenced by the cultural, social, and political positions of those who produce it. In the same way, technology as an extension of knowledge, and biometric technology in particular, is situated and reflects the social and political environment in which it was designed and deployed.
Reinforcing Narrow Identity Classifications
Authentication demands that something either fits the original template or it does not. Identity is either verified or it is not. Access is granted or denied. The algorithms used as an extension of human decision-making have not achieved a level of machine learning that allows for discretion. Discretion is what allows one to look beyond immediate circumstances, drawing on intent, lived experience, and emotion to access a broader picture for context, enabling identification, access, and mobility that rigid algorithms may otherwise perilously miss. Humans have always had the innate ability to recognize facial patterns and characteristics in order to distinguish one another. The problem lies not in the ability to identify someone based on their appearance, but in what is inferred from that process. It is not just that the constructs of race and gender are narrowly fit into such systems, but how the information is used to create conditions for life that are vastly different for those who are identified in a particular way, or who cannot be identified at all.
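Sketched as code, the rigidity is stark. In the hypothetical gate below (the scores and threshold are invented for illustration) there are exactly two outcomes, and a score a hair under the cutoff is treated no differently from no match at all:

```python
def access_decision(match_score, threshold=0.80):
    """Two outcomes, no mechanism for context, intent, or discretion."""
    return "GRANTED" if match_score >= threshold else "DENIED"

print(access_decision(0.99))  # GRANTED
print(access_decision(0.79))  # DENIED: a hundredth below the cutoff, with
                              # no appeal, no judgment, no broader context
```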
The social constructs of gender and race that are reproduced in biometric technology are indicative of broader social issues of power, inclusion, equality, and representation. The way gender is defined and interpreted in these systems reinforces societal norms and hierarchies, making it a deeply political matter. The choice to constrain gender to a limited definition in these systems mirrors the current political narrowing of the definition of the sexes. Most recently, the President of the United States, Donald Trump, signed an executive order that reinforced such a narrow definition, limiting gender identity to either male or female and ostensibly refuting the notion that someone can transition from the sex assigned to them at birth. This immediately prompted the State Department to cease issuing travel documents with the “X” gender marker, to disallow changes to the gender listed on passports, and to stop issuing new passports that reflect a gender other than the one assigned at birth (The Associated Press, 2025). Biometric technology reinforces these rigid societal norms: reliance on binary gender classification contributes to the marginalization of those who do not conform to these categories and ignores or oversimplifies the complexity of human identity.
If these systems are unable to grapple with physical and material changes, they are even more inept at dealing with what evades material observation, both in its immediacy and over the flux of time. If the goal of identification is to articulate a body, then more, not fewer, descriptions are needed. This requires a practice of attuning to what has been tuned out in formal identification channels, the things we don’t see or choose not to see. As it stands, the social, cultural, and political narratives that preceded the biometric sociotechnical apparatus remain within this technology’s field of vision. Identification through technologically assisted vision is therefore neither revolutionary nor transformative; it perpetually sees others as they have always been seen. This line of sight ignores the immaterial, intangible, and unconscious but ever-present elements that constitute one’s being and shape one’s becoming.
Concluding Thoughts
Relying on antiquated classifications and narrow definitions in identification processes only serves to promote divisions among humans, divisions that were socially constructed and must be socially dismantled. Contending with the foundations of analog inequality is necessary if we want our digital systems to see, and to do, something different.
This post was curated by Contributing Editor Dayna Jeffrey.
References:
Benjamin, R. (2019). Race after Technology: Abolitionist Tools for the New Jim Code. Oxford: Polity.
Browne, S. (2015). Dark Matters: On the Surveillance of Blackness. Durham: Duke University Press.
Buolamwini, J. & Gebru, T. (2018). ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, Proceedings of Machine Learning Research, 81: 77–91. Available from https://proceedings.mlr.press/v81/buolamwini18a.html.
Haraway, D. (1988). ‘Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective,’ Feminist Studies, 14(3): 575‐599.
Keyes, O. (2018). ‘The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition’. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW): 1–22. https://doi.org/10.1145/3274357.
Magnet, S. (2012). When Biometrics Fail: Gender, Race, and the Technology of Identity. Durham: Duke University Press.
Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.
The Associated Press. (2025). ‘The U.S. stopped allowing passport gender marker changes. Here are some of the people affected’, NBC News, 10 February. https://www.nbcnews.com/nbc-out/out-politics-and-policy/us-stopped-allowing-passport-gender-marker-changes-are-people-affected-rcna191458.