Amazon isn’t reading the lines on your palm. It’s reading the veins inside: the constellation of blood vessels that, like a fingerprint, is distinctive and unchanging, visible under infrared light. Or it will be reading them, if you sign up for Amazon One, the company’s latest venture into biometrics.
The Amazon One device scans your palm at checkout to make in-store purchases, indelibly linking your vascular system to your credit card. There are presently six Amazon Go stores in Seattle that use the technology and, this week, the company announced it was rolling the scanners out to several more. “Simply by being you,” is the system’s slogan, a phrase that conjures up a world free of transactions, one in which you might indeed be offered groceries by nature of your humanity. This is obviously not the vision Amazon One has for retail.
Its vision, expectedly, extends far beyond Seattle yuppies, and beyond shopping entirely; the software’s original patent imagines its use in offices, libraries, and hospitals, wherein hands morph into library cards and concert tickets. To be sure, it’s not a particularly novel idea (biometric technology is commonplace in private security, and at least one company has rolled out its own facial-recognition-based payment system), but it is a sign of the tech’s function creep, its slow oozing into the everyday. In exchange for our palms, Amazon will offer up a modicum of convenience, and in turn, tighten its grip on everything else.
It’s notable that Amazon One chose palm scanning in lieu of more familiar, well-worn biometric tech, like fingerprinting or facial recognition (the company already sells the latter to clients including both the NFL and National Geographic). Palm vein matching isn’t new; it entered into commercial use in the nineties, and has been used in some U.S. hospitals since at least 2011. But now, its relative lack of notoriety has given it an advantage in the identification industry. As one study noted, palm scanning feels far less invasive than, say, a retinal scan, though there is no material difference between the two. Amazon is gambling that palm-scanning is a more approachable form of biometric surveillance.
After all, the company is marketing the devices for a world that is increasingly wary of biometrics. In promotional videos for Amazon One, cartoon characters buoy through pastel backdrops, gliding their palms across scanners and toting featherlight bags of groceries, while the uninitiated fumble with their wallets. It’s all thoughtless, and distinctly not-futuristic. Amazon’s website is adamant: “You’re in control.”
The immediate, most high-profile concern with this tech will be privacy, of course. Amazon One has anticipated this, and addressed it in soothing tones, outlining the layers of encryption that will secure your blood vessels should you choose to give them up. Such measures are never foolproof. But what a single-minded focus on individual privacy obscures is that security is the very point of the whole endeavor—not for consumers, but for Amazon.
Consider, for example, the biometric-adorned house. The Wall Street Journal described the phenomenon in flushed tones two years ago: Greenwich manors mounted with face surveillance systems, Hudson Yards lofts equipped with fingerprint locks. The technology, here, serves as a convenience for property owners, who no longer need to concern themselves with misplacing a key. But for many, such luxury real estate is a workplace. And as the WSJ piece happily notes, these systems can track domestic workers’ movements, or lock them out at any moment.
In this way, biometrics operates much like cashless technologies, which control how communities can and cannot interact with financial systems, enforcing racist notions of crime and borders on a minute scale. Privacy, here, has already been eroded; such identification by nature lacks it, no matter how well-encrypted the data is.
And this is the appeal of Amazon One, whose real clientele is not Amazon Go shoppers, but the concert halls and apartment buildings that will buy the devices. Their concern is their own security, which is greatly enhanced by biometric systems. Such technology further blurs the lines between the security state and the private sphere—a reminder that the push for bans on government use of biometric surveillance, particularly facial recognition, is too narrow in scope. As Sanjana Varghese writes, “If one form of surveillance becomes publicly reviled—or at least, subject to some level of scrutiny—then two others will pop up in its place.”
Amazon will stubbornly cloak this new project in harmless mystique—its scanners will become like fortune-tellers, operating outside of earthly rules and prejudices. This is the company’s model of invisibility, consistent across its multitude of services: Packages are dropped at doorsteps, silently, while workers face worsening conditions. Doorbell cameras and smart speakers record at all times, unimposing.
Even before the palm readers, Amazon had embarked on a desperate quest for immediacy. This is most evident in its “cashierless” model, in use in some Amazon Go stores, which allows you to exit with your groceries and have them charged automatically to your Amazon account. It’s the pinnacle of mindless consumption, wherein the transaction itself is eliminated entirely. How does Amazon pull off this technological feat, you ask? Each store is lined with hundreds of cameras, papering the walls, blinking at every turn.
Around 2014, a Tumblr meme brought new life to the sketches of a seventeenth-century painter named Charles Le Brun, using them for a slightly offensive joke about actress Sarah Jessica Parker’s resemblance to a horse. Some might say this is the punishment for famous women who dare to deviate, however marginally (let’s be real, SJP is still skinny, white, and blonde), from Hollywood’s oppressive beauty standards. Others might respond that seeing animal faces in human ones and vice versa is only natural. Who among us hasn’t spotted a dog that looks like its owner or a certain secretary of transportation who eats like a squirrel?
And yet, the practice of human-to-animal comparison hasn’t always been so lighthearted. Le Brun originally prepared his sketches as a supplement to a lecture on zoological physiognomy, the science of deciphering people’s personalities through their facial features and animal resemblances. Since its inception in the ancient world, physiognomy has reappeared in different texts and popular theories throughout the centuries.
For the ancients, the logic was simple: Life in human society is a constant performance that forces us to hide our true essence. Animals, on the other hand, behave genuinely; they never pretend. If people share physical traits with certain animals, they must also share certain inner qualities. Ancient Greek physicians and philosophers wrote detailed treatises on these taxonomies, tying lion features to generosity, snakes to cruelty, dogs to gluttony, eagles to intelligence. Some of these associations still populate our literary descriptions—a leonine profile, an aquiline nose, a feline gracefulness.
By the eighteenth and nineteenth centuries, physiognomists were less concerned with contemplating animal souls than with using face measuring as another means of enforcing racial and class hierarchy. Along with phrenology, anthropology, and eugenics, physiognomy turned prejudice into science.
Current biometric technology is plagued by this ugly history, no matter how hard tech companies try to conceal it. Journalists and academics have written extensively on the biases built into facial and body recognition software, drawing a throughline from centuries past.
But what if, as proponents of “ethical” technology claim, biometrics could be solely about recognition instead of judgment? And what an attractive method of recognition it is—your fingerprints and irises will never get lost like your passport or your passwords. They are, supposedly, the most unmediated “you” there is.
Though biometric technology is used increasingly for private surveillance and capital gain, in our cultural imagination it is still most closely associated with national security. Security of borders through surveillance of the bodies crossing them, but also the security that those bodies never change, and that behind every threat is an embodied person.
The expectation that our resemblance to a dog or a goat can shed light on our morals, habits, and desires feels absurd. The idea that nose or cheekbone or eye shape correlates with intelligence is equally ridiculous, albeit much more insidious and violent. But biometric technology, no matter how futuristic its getup, rests heavily on the same old mistaken assumptions. The problem with biometrics isn’t just that its measures are biased, it’s that they equate the body with personhood in the first place.
At the core of both physiognomy and biometric surveillance is a hopefulness that the truth about who we are is hidden in a set of stagnant physical qualities. That reality is a product of disclosure, literally. That somewhere underneath all the social constraints and performative interactions is the immutable length of our chins and the unique pattern of our retinas. But bodies are as difficult to pin down as whatever spirit or soul we have inside—they grow, transform, break, wither away. More likely than not, their capacity for revelation is as limited as the meaning of Sarah Jessica Parker’s resemblance to a horse.