Aggressively Defending My Clients Since 1990


On Behalf of | Sep 1, 2017 | Firm News

Charles E. Trimble, an Oglala Lakota from the Pine Ridge Reservation, explained in Indian Country Today that some Native Americans refused to be photographed. One of the most famous Native Americans in history, Tȟašúŋke Witkó (literally “His-Horse-Is-Crazy”; c. 1840 – September 5, 1877), or Crazy Horse, was never photographed while alive. He ranks among the most notable and iconic of Native American warriors and was honored by the U.S. Postal Service in 1982 with a 13¢ Great Americans series postage stamp. However, he never allowed his photograph to be taken, even while on his deathbed. Indian agent Valentine McGillycuddy recalled that when he urged the chief to pose for a picture: “His invariable reply to my request was ‘My friend, why should you shorten my life by taking from me my shadow?'”

While Mr. Trimble offers some other explanations, others have suggested that Native Americans held a religious belief that a facial photograph could steal a soul, imprisoning it within the image. In 1908, Walter Camp wrote to the agent for the Pine Ridge Reservation inquiring about a portrait. “I have never seen a photo of Crazy Horse,” Agent Brennan replied, “nor am I able to find any one among our Sioux here who remembers having seen a picture of him. Crazy Horse had left the hostiles but a short time before he was killed and it’s more than likely he never had a picture taken of himself.”  Brennan to Camp, undated (December 1908), Camp Collection, Little Bighorn Battlefield National Monument.

Based on developing 21st century technology, Crazy Horse’s idea that facial photographs could steal your soul may turn out to be true. At least if one thinks of “soul” as meaning a person’s mind, thoughts, and feelings. Today, that thief is the government.
Suppose you do any of the following:

  • Apply for a driver’s license
  • Apply for a state identification card
  • Apply for a passport
  • Were arrested for a seat belt violation, taken to a police station, and photographed (commonly referred to as a mugshot)

In all of these circumstances the government has taken your picture and can now use it during a criminal investigation. In March 2017, the FBI detailed how its facial recognition programs work. Generally, here is what could happen. There is a knock on your door. It’s the police. There was a robbery in your neighborhood. They have a suspect in custody and an eyewitness. But they need your help: Will you come down to the station to stand in the line-up?

Most people would probably answer “no.” This summer, the Government Accountability Office revealed that close to 64 million Americans do not have a say in the matter: 16 states let the FBI use face recognition technology to compare the faces of suspected criminals to their driver’s license and ID photos, creating a virtual line-up of their state residents. In this line-up, it’s not a human that points to the suspect—it’s an algorithm.
One in two American adults is in a law enforcement face recognition network.


  • Police departments in nearly half of U.S. states can use facial-recognition software to compare surveillance images with databases of ID photos or mugshots. Some departments only use facial-recognition to confirm the identity of a suspect who’s been detained; others continuously analyze footage from surveillance cameras to determine exactly who is walking by at any particular moment. Altogether, more than 117 million American adults are subject to face-scanning systems.

These findings were published in a report from Georgetown Law’s Center for Privacy and Technology. It details the results of a year-long investigation that drew upon more than 15,000 pages of records obtained through more than 100 freedom-of-information requests.

The study’s authors—Clare Garvie, Alvaro Bedoya, and Jonathan Frankle—attempted to fill in large gaps in public knowledge about how facial-recognition technology is used, and the existence of policies that constrain how police departments can use it. Some details about the FBI’s use of facial scanning were previously known, but the scale of local and state law-enforcement involvement is only now starting to come to light. “Never before has federal law enforcement created a biometric database—or network of databases—that is primarily made up of law-abiding Americans,” the report says.

That means that some departments have gotten away with patently absurd uses of the technology: In Maricopa County, Arizona, the sheriff’s office—led by a famously combative and anti-immigrant sheriff—downloaded into its facial-recognition database the driver’s license photos and mugshots of every resident of Honduras, provided by the Honduran government.

Departments that use the technology in a more straightforward way can still be stymied by the inaccuracies and biases that often plague facial-recognition algorithms. They’ve been found to perform more poorly on African-American faces than on other races, which can make it more likely that a system will misidentify an innocent black person as a suspect. And because African-Americans are disproportionately likely to be arrested—and thus show up in mug-shot databases—systems that use booking photos will be more likely to flag an African-American face than a Caucasian one.

A major problem with facial recognition technology is the probability of a given feature’s distinctiveness. As the FBI’s Forensic Audio, Video, and Image Analysis Unit (FAVIAU) explained, “Lack of statistics means: conclusions are ultimately opinion-based.” To remedy this flaw, a 2008 FBI report recommended that the agency undertake research to quantify the frequency of facial features. But such efforts, which have been underway since at least the late 19th century, have so far proved inconclusive.

The Georgetown report describes the most common ways law enforcement uses facial recognition technology:

  • Stop and Identify.

On patrol, a police officer encounters someone who either refuses or is unable to identify herself. The officer takes her photo with a smartphone or a tablet, processes that photo through software installed on that device or on a squad car computer, and receives a near-instantaneous response from a face recognition system. That system may compare that “probe” photo to a database of mug shots, driver’s license photos, or face images from unsolved crimes, also known as an “unsolved photo file.” (As part of this process, the probe photo may also be enrolled in a database.) This process is known as field identification.

  • Arrest and Identify.

A person is arrested, fingerprinted and photographed for a mug shot. Police enroll that mug shot in their own face recognition database. Upon enrollment, the mug shot may be searched against the existing entries, which may include mug shots, license photos, and an unsolved photo file. Police may also submit the arrest record, including mug shot and fingerprints, to the FBI for inclusion in its face recognition database, where a similar search is run upon enrollment.

  • Investigate and Identify.

While investigating a crime, the police obtain a photo or video still of a suspect from a security camera, smartphone, or social media post—or they surreptitiously photograph the suspect. They use face recognition to search that image against a database of mug shots, driver’s licenses, or an unsolved photo file and obtain a list of candidates for further investigation, or, in the case of the unsolved photo file, learn if the individual is wanted for another crime.  Alternately, when police believe that a suspect is using a pseudonym, they search a mug shot of that suspect against these same databases.

  • Real-time Video Surveillance.

The police are looking for an individual or a small number of individuals. They upload images of those individuals to a “hot list.” A face recognition program extracts faces from live video feeds of one or more security cameras and continuously compares them, in real-time, to the faces of the people on the hot list. Every person that walks by those security cameras is subjected to this process. When it finds a match, the system may send an alert to a nearby police officer. Today, real-time face recognition is computationally expensive and is not instantaneous.  Searches can also be run on archival video.
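The hot-list matching loop described above can be sketched in miniature. This is an illustrative toy only, not any agency's or vendor's implementation: the names, embeddings, and threshold below are all hypothetical, and real systems derive face embeddings from photos with neural networks and tune the match threshold against accuracy trade-offs.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical "hot list": names mapped to face embeddings.
hot_list = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.2, 0.8, 0.5],
}

MATCH_THRESHOLD = 0.95  # illustrative; real systems tune this trade-off

def check_frame(face_embedding):
    """Compare one face extracted from a video frame against every hot-list entry."""
    alerts = []
    for name, reference in hot_list.items():
        if cosine_similarity(face_embedding, reference) >= MATCH_THRESHOLD:
            alerts.append(name)
    return alerts
```

The key point the sketch makes concrete: every face in every frame is compared against every hot-list entry, so each passerby is scanned whether or not they match anyone.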

Other concerns about facial recognition technology (FRT) include:

• A reasonable expectation of privacy includes a reasonable expectation of anonymity from government use of computer algorithms and databases to capture law-abiding citizens’ faces and identify them without their knowledge or consent.

• FRT allows for a different kind of tracking that can occur from far away, in secret, and on large numbers of people. Fingerprints are left only on things you touch, and you know when police are taking them. You can’t leave your face at home and, with limited exceptions, it isn’t acceptable to cover it. Depending on how it’s used, FRT could rob citizens of a reasonable expectation of anonymity.

• Giving police the right to do this without judicial oversight creates a slippery slope that could lead to real-time, mass surveillance like that of Big Brother. Police have an incentive to collect as many photos as possible because the larger the database, the more likely they are to get a match and solve a crime or identify a suspect or person of interest.

• Real-time, mass surveillance could also chill First Amendment speech unpopular with the government. Advocates point to the FBI’s disgraced COINTELPRO program of surveillance against civil rights activists and Vietnam War protesters during the ‘60s and ‘70s.

Specific legal concerns are:
1. Does a face recognition search constitute a “search” that triggers Fourth Amendment protection?
2. What is the legal standard police must meet before using FRT?
3. Does your state have a law regulating the collection of biometric data?