Consumers Have No Constitutional Protection From Facial Recognition
Texas Insider Report: WASHINGTON, D.C. – Over the past decade, marketers have increasingly relied on Facial Recognition Technology (FRT) to create personalized advertisements. FRT depends on complex algorithms that identify a person by measuring the size of, angle of, and distance between facial features.
FRT uses these measurements to search a database of facial features and match the captured image to a stored reference photo. Within seconds of capturing an image, FRT can detect and identify a single person in a crowded public area.
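To make the matching step concrete, here is a minimal, purely illustrative sketch in Python. It assumes each face has already been reduced to a small numeric feature vector (the size, angle, and distance measurements the article describes) and shows only the database lookup; the identities, numbers, and threshold are hypothetical, and commercial FRT systems use far more sophisticated, proprietary models.

```python
# Illustrative sketch only: assumes faces are already converted to feature vectors.
import numpy as np
from typing import Optional

# Hypothetical database of stored reference photos, keyed by identity.
reference_db = {
    "consumer_0412": np.array([0.71, 0.33, 0.58, 0.12]),
    "consumer_0977": np.array([0.15, 0.88, 0.41, 0.67]),
}

def match_face(probe: np.ndarray, threshold: float = 0.25) -> Optional[str]:
    """Return the identity whose stored features are closest to the probe image,
    or None if nothing in the database is close enough to count as a match."""
    best_id, best_dist = None, float("inf")
    for identity, ref in reference_db.items():
        dist = float(np.linalg.norm(probe - ref))  # distance between feature vectors
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= threshold else None

# A newly captured image, reduced to the same feature representation.
probe_features = np.array([0.70, 0.35, 0.60, 0.10])
print(match_face(probe_features))  # -> "consumer_0412"
```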
FRT offers obvious advantages to advertisers.
The technology is relatively inexpensive, has perfect recall, and can rely on vast amounts of stored information. High-quality digital cameras and lenses produce clear photographs, which makes identification simpler. Recognition software has grown more sophisticated over the past decade, enabling companies to generate three-dimensional face images and account for aging.
The technology allows marketers to produce targeted advertisements for individual consumers in public spaces. In its most common form, such advertising uses a camera mounted behind a digital screen; after facial recognition software processes a consumer’s image, the screen displays an advertisement based on the stored information.
Ads usually reflect demographic information and are sometimes tailored to emotional signals read from the consumer’s face.
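The ad-selection step behind the screen can be pictured as a simple lookup from an inferred profile to a stored advertisement. The sketch below is a toy illustration of that idea only; the demographic bands, emotion labels, and ad names are invented for the example and do not describe any particular vendor’s system.

```python
# Toy illustration of mapping an inferred profile to an advertisement.
# All labels and ad names here are hypothetical.
AD_CATALOG = {
    ("18-34", "happy"): "new_sneaker_promo",
    ("18-34", "neutral"): "streaming_service_trial",
    ("35-54", "happy"): "family_suv_spot",
    ("35-54", "neutral"): "home_insurance_quote",
}

def select_ad(age_band: str, emotion: str) -> str:
    """Pick an ad for the profile inferred from the camera feed,
    falling back to a generic spot when no targeted ad is defined."""
    return AD_CATALOG.get((age_band, emotion), "generic_brand_spot")

# The digital screen would display the result of a call like this:
print(select_ad("18-34", "happy"))  # -> "new_sneaker_promo"
```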
Privacy advocates have raised questions about the intrusiveness of FRT advertisements. Some argue state and federal privacy statutes offer only weak protection for consumers who encounter FRT systems.
No law expressly addresses the commercial use of facial recognition software; the only notable statutory treatment of biometric information simply bars its use to perpetrate fraud. Further, anti-voyeurism and anti-surveillance laws generally apply to private spaces, not public ones. The federal Video Voyeurism Prevention Act of 2004, for example, outlaws capturing an image of a person’s “private area” without consent in circumstances where that person has a reasonable expectation of privacy.
Few constitutional protections affect consumer privacy with respect to FRT. The Supreme Court has held that a person has no “reasonable expectation” of privacy in information voluntarily revealed in public: “No person can have a reasonable expectation that others will not know the sound of his voice, any more than he can reasonably expect that his face will be a mystery to the world.” (United States v. Dionisio, 1973)
So how can consumers keep their public images from being used in FRT?
Some scholars propose applying “publicity tort” concepts to prevent one party from improperly benefiting from the use of another’s likeness without consent; such privacy claims have historically arisen in advertising contexts. Others recommend a statutory solution that would allow consumers to opt out of commercial use of their personal images, similar to the Do Not Call Registry.
Recently, the FTC released a staff report on FRT best practices. The report recommended that designers of FRT systems approach this new technology with “privacy in mind.”
The FTC also suggested that a “privacy by design” approach could “offer innovative new benefits to consumers” while simultaneously “respect[ing] their privacy interests.”
Steven C. Bennett is a partner in the New York City offices of international law firm Jones Day. The views expressed are his, and should not be attributed to Bennett’s firm or clients.