

Battleground Biometrics:

Adoption, Legislation and Implications for E-Discovery

by Adam Cohen[1]

Introduction

While biometrics as a field is more than a hundred years old, recent years have witnessed a rapidly accelerating proliferation of new biometric technologies and applied uses. The law is giving chase to this technology at its familiar, lumbering, inevitably losing pace. Meanwhile, biometrics is being cast in all kinds of roles in all types of venues across the globe; like other celebrities of modern information technology, it refuses to stand still for a clear portrait. Nevertheless, this paper attempts to fill in some of the scene, offering context and directional guidance for legal professionals unfamiliar with the intersection of biometrics and litigation.

For information security professionals, biometrics has long held the promise of improved controls in the domain of Identity & Access Management (“IAM”). Many people have involuntarily become familiar with concepts like “authentication factors” through the expanding use of Multi-Factor Authentication (“MFA”) as an additional layer of defense in this virtualized world where password management has become a fruitless nightmare. Classic cyber-security doctrine teaches that authentication factors can be “something you know, something you have and/or something you are.” The first two are illustrated by passwords and hardware security tokens, respectively; the third is biometrics.
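To make the three factor types concrete, the following minimal sketch (Python, purely illustrative; the names and threshold are our own, not any particular product’s API) shows the logic an MFA check follows: access is granted only when distinct factor types succeed, so two passwords still count as a single factor.

    from dataclasses import dataclass

    @dataclass
    class FactorResult:
        kind: str      # "knowledge" (something you know), "possession" (something you have),
                       # or "inherence" (something you are, i.e., a biometric)
        passed: bool

    def mfa_satisfied(results, required_kinds=2):
        # Count distinct factor *types* that passed, not the raw number of checks.
        passed_kinds = {r.kind for r in results if r.passed}
        return len(passed_kinds) >= required_kinds

    # Example: a password plus a fingerprint match clears the bar; two passwords would not.
    attempt = [FactorResult("knowledge", True),   # password accepted
               FactorResult("inherence", True)]   # fingerprint matched above threshold
    print(mfa_satisfied(attempt))  # True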

Data generated by biometrics is inherently and uniquely personal. It can also be highly sensitive from a privacy perspective. For example, physical characteristics identified by certain biometric techniques, such as retinal scans, can reveal sensitive personal health information. The accretion of vast oceans of data in unknown repositories managed by unknown “service providers” is unsettling to anyone with privacy concerns.

Another breed of concern, the integrity of biometrics data, arises from legitimate questioning of the accuracy of output by new biometric technologies or analytics based on biometrics data. Better hope that your “separated at birth” doppelganger is as well-behaved as you are, or at least doesn’t antagonize the government (or a galactic technology company with more effective power to affect your life than the government)…Otherwise, the remarkable resemblance may someday soon trigger an automated response, perhaps a SWAT team of robots dispatched to subdue and arrest you for a crime you didn’t commit. For litigators, concerns over the reliability of biometrics data relate directly to evidentiary admissibility.

The broad scope of discoverable electronically stored information (“ESI”) in the sense used by the Federal Rules of Civil Procedure, with all of its attendant legal obligations, unquestionably encompasses data generated by biometric systems. Given the imminent-if-not-present ubiquity of ESI-generating biometrics systems, any litigator or e-Discovery professional is certain to encounter this species of ESI. To orient such professionals, we set the table with some background and a snapshot of the current state of adoption of biometrics deployment generally. Next, we identify and summarize the status of legislation and regulatory guidance aimed at biometrics, as well as litigation engendered by this legislation. Finally, we provide certain considerations for approaching e-Discovery issues where biometric ESI is in play.

Background

The term “biometrics” is primarily used to refer to measurement of physical characteristics, but more recently its scope has been expanded by conjunction with behavioral analytics, i.e., “behavioral biometrics.” https://www.merriam-webster.com/dictionary/biometrics. The potential application of the term “biometrics” to systems that use algorithms to identify anomalies in user activity for security purposes, or that drill through “Big Data” to refine targeted advertising that seems to be reading your mind, is a significant expansion that is not addressed here directly. For purposes of this discussion, we focus on biometrics measuring and analyzing physical qualities or characteristics, not behavior or activity.

Perhaps the most widely familiar example of applied biometrics is law enforcement usage of fingerprints to identify criminals, which dates back to the late 19th century. In those days matching fingerprints required manual labor with reference to classification systems. In the 1960s, law enforcement agencies recognized the opportunity to apply emerging developments in Information Technology to make more effective use of fingerprints to solve crimes, sparking a number of research initiatives. Further information is provided in the accompanying paper by J. Kenneth Magee, “Authentication and Identification Biometrics: Technical Aspects and Security Considerations,” also submitted in connection with the Georgetown Advanced E-Discovery Institute 2018.

The 1980s brought the emergence of “automated fingerprint identification systems” (AFIS) and today, there are hundreds of such systems. Law enforcement or other government agency use of AFIS eventually expanded beyond crime-solving to other applications such as border control. Fingerprint systems are now commonly used in the private sector; for example, McDonald's uses them for time clock and cash register access and Bank of America for ATM transactions. One recent report predicts that “nearly 90% of businesses will use some type of biometric technology for authentication by the year 2020.” https://community.spiceworks.com/security/articles/2952-data-snapshot-biometrics-in-the-workplace-commonplace-but-are-they-secure. The same report states that 62% are already using it, with another 24% moving to do so within two years (57% fingerprint scanners, 14% facial recognition, 5% hand geometry recognition, 3% iris scanners, 2% voice recognition, 2% palm vein recognition).

AFIS has evolved into “ABIS” (automated biometric identification systems). This movement from “unimodal” systems to those capable of linking different forms of biometric identification involves substantial increases in speed and accuracy. Current active uses of biometrics that measure physical characteristics focus not only on fingers, but also eyes, faces and voices. Palm, vein and heartbeat readers are also coming into use.
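As an illustration of what “multimodal” linkage means in practice, the sketch below (hypothetical Python; the weights and decision threshold are invented for illustration, not drawn from any actual ABIS product) fuses normalized match scores from two modalities into a single decision, which is one common way multimodal systems gain accuracy over unimodal ones.

    def fused_match_score(scores, weights):
        # Weighted score-level fusion: each modality reports a normalized
        # match score in [0, 1]; the fused score combines them.
        total_weight = sum(weights[m] for m in scores)
        return sum(scores[m] * weights[m] for m in scores) / total_weight

    scores = {"fingerprint": 0.93, "face": 0.71}   # per-modality matcher output
    weights = {"fingerprint": 0.6, "face": 0.4}    # weight fingerprints slightly more
    print(fused_match_score(scores, weights) >= 0.80)  # True: 0.93*0.6 + 0.71*0.4 = 0.842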

Another recent study, Rachel L. German and K. Suzanne Barber, University of Texas at Austin, Center for Identity, “Current Biometric Adoption and Trends” (May 2018), ranks the purposes for which biometrics have been adopted from most to least frequent. Perhaps not surprisingly, the financial services sector shows the greatest activity in adopting biometrics for commercial purposes; its adoption is catalyzed by the growing supply of personal devices with biometric capabilities.

Legislation

In the United States, state governments, beginning with Illinois in 2008, have enacted or proposed laws responding to privacy concerns related to biometrics. The enactment of the Illinois law, 740 ILCS 14 (the Biometric Information Privacy Act or “BIPA”), was sparked by a situation where substantial volumes of biometric data from Illinois residents were perceived to be in serious jeopardy due to the bankruptcy filing of a fingerprint scanning system provider, “Pay By Touch.” The debtor company provided its systems to retailers for use in transactions with consumers, but the consumer fingerprints were transmitted to the debtor and stored on its systems. Given the 2007 bankruptcy filing, this data was potentially available for sale to satisfy obligations to creditors. Anxiety over a potential transaction like this is reportedly what gave birth to BIPA.

Texas (Bus. & Com. § 503.001) followed Illinois in 2009, with Washington (19.375 RCW) passing its biometrics law in 2017. The enacted laws have material differences, beginning with how each defines the aspects of biometrics within the scope of coverage. Illinois is currently the only state providing a private right to sue for violation of its biometrics statute.

Scope of Coverage Regarding Biometrics Data

Illinois includes a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry” while specifically excluding:

  • writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color.

  • donated organs, tissues, or parts as defined in the Illinois Anatomical Gift Act or blood or serum stored on behalf of recipients or potential recipients of living or cadaveric transplants and obtained or stored by a federally designated organ procurement agency.

  • biological materials regulated under the Genetic Information Privacy Act. Biometric identifiers do not include information captured from a patient in a health care setting or information collected, used, or stored for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996.

  • an X-ray, roentgen process, computed tomography, MRI, PET scan, mammography, or other image or film of the human anatomy used to diagnose, prognose, or treat an illness or other medical condition or to further validate scientific testing or screening.

Texas similarly, though not quite identically, includes “a retina or iris scan, fingerprint, voiceprint, or record of hand or face geometry”—but without setting forth any exclusions.

Washington also defines the term by specific inclusion and exclusion, but with different verbiage, including:

data generated by automatic measurements of an individual's biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual

and excluding:

a physical or digital photograph, video or audio recording or data generated therefrom, or information collected, used, or stored for health care treatment, payment, or operations under the federal health insurance portability and accountability act of 1996.

Note that Illinois and Texas do not define “biometric identifier” as including the information or data generated by biometrics. Illinois utilizes the term “biometric information” to fill this potential gap, defined as:

any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual. Biometric information does not include information derived from items or procedures excluded under the definition of biometric identifiers.

Texas simply uses the “identifier” term to serve the same purpose.

Notice & Informed Consent

Each of the statutes imposes requirements to notify and obtain consent from individuals whose biometrics data is collected. Again, there are significant differences among the three.

Illinois requires notice and consent before a “private entity” is permitted to “collect, capture, purchase, receive through trade, or otherwise obtain” biometrics data. The notice must be in writing and must include “the specific purpose and length of term for which” the biometrics data is being “collected, stored and used.” Along with notice, affirmatively expressed consent in the form of a “written release executed” by the data subject is required.

Consent, unspecified as to form, is also required if the entity wishes to “disclose, redisclose or otherwise disseminate” biometrics data, subject to exceptions for state or federal law or municipal ordinance or a court-issued warrant or subpoena.

Texas requires notice and consent, without specifying form or mechanism, in order to “capture” biometric data “for a commercial purpose.” Permitted sale, lease or other disclosure is restricted to four specific exceptions. These include: a) the data subject’s consent “to the disclosure for identification purposes” in case of death or disappearance; b) the completion of a financial transaction authorized or requested by the data subject; c) where other law requires or permits; or d) by or to a law enforcement agency in response to a warrant.

Under the Washington statute, “[a] person may not enroll a biometric identifier in a database for a commercial purpose, without first providing notice, obtaining consent, or providing a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.” With the explanation that choice of “[t]he exact notice and type of consent required” depends on context, “[n]otice is a disclosure, that is not considered affirmative consent, that is given through a procedure reasonably designed to be readily available to affected individuals.”

Without consent, sale, lease or other disclosure--for commercial purposes--is limited to seven exceptions, the first of which is where the disclosure “is consistent with” the other subsections of the law. Others involve situations where the data subject has effectively given consent, i.e., in subscribing to or requesting products or services, or initiating or authorizing financial transactions (the latter only where the third party agrees to maintain confidentiality, which is another, independent exception under the sub-section). Finally, Washington makes exceptions for legal reasons, including federal or state statute or court orders requiring or authorizing disclosure, or even “to prepare for litigation or to respond to or participate in judicial process.”

Information Security Standards

Each of the state biometrics statutes requires parties in possession of biometrics data to do something to protect it. Illinois states the standard as requiring protection meeting both a “reasonable standard of care” in the particular “industry” and a level at or greater than that applied to “other confidential and sensitive information.” Texas uses essentially the same standard, while Washington specifies “reasonable care to guard against unauthorized access to and acquisition of biometric identifiers.”

Note that biometric data is explicitly included within the scope of personal information subject to certain state data breach notification statutes.

Data Retention & Limitation

Illinois mandates:

a written policy, made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information when the initial purpose for collecting or obtaining such identifiers or information has been satisfied or within 3 years of the individual's last interaction with the private entity, whichever occurs first.

Showing a recognition that many retention policies tend to gather dust sans implementation, compliance with established retention and destruction guidelines is explicitly required.
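For readers who prefer a worked example, the “whichever occurs first” rule quoted above reduces to taking the earlier of two dates. The sketch below (Python, with hypothetical dates; actual compliance turns on when the “initial purpose” is in fact satisfied) illustrates only that outer timing limit.

    from datetime import date

    def bipa_destruction_deadline(purpose_satisfied, last_interaction):
        # Destroy by the earlier of: the date the initial purpose is satisfied,
        # or three years after the individual's last interaction.
        three_years_after = last_interaction.replace(year=last_interaction.year + 3)
        return min(purpose_satisfied, three_years_after)

    # Hypothetical dates: purpose satisfied mid-2021, last interaction in early 2019.
    print(bipa_destruction_deadline(date(2021, 6, 1), date(2019, 3, 15)))  # 2021-06-01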

Texas focuses on limiting the retention period. Absent any applicable longer retention requirements in other governing law, biometric data is to be destroyed “within a reasonable time, but not later than the first anniversary of the date the purpose for collecting the identifier expires.”

Washington limits retention to a period “no longer than is reasonably necessary” to comply with court orders or other federal, state or local law, to protect against “fraud, criminal activity, claims, security threats or liability,” and to provide the services in connection with which the data was “enrolled.”

Enforcement

Illinois is the only state that provides a private right of action: “Any person aggrieved by a violation of this Act shall have a right of action in a State circuit court or as a supplemental claim in federal district court against an offending party.” For each violation, prevailing parties can recover the greater of $1,000.00 or actual damages where the violation is negligent, or the greater of $5,000.00 or actual damages where the violation is intentional or reckless. In either case, recovery of attorneys’ fees and other litigation costs is available, as well as injunctive relief. Under the Texas statute, only the attorney general may bring an action; the AG can recover a civil penalty of up to $25,000.00 for each violation. Similarly, the Washington statute is enforceable only by the attorney general, under the state consumer protection act.

The private right of action offered in the Illinois biometrics statute has already found plenty of takers, although it took a few years for the trickle of cases to turn into a flood. According to one recent report, “[m]ore than 50 companies are now defending class-action lawsuits under [BIPA].” https://www.workforce.com/2018/05/24/biometric-privacy-lawsuits-rising/. These companies include Google, Facebook and Shutterfly, among others, all of which failed in their bids to dismiss the complaints at the pleading stage. See, e.g., Monroy v. Shutterfly, Inc., 2017 WL 4099846 (N.D. Ill. Sept. 15, 2017); Rivera v. Google Inc., 2017 WL 748590, at *6 (N.D. Ill. Feb. 27, 2017); In re Facebook Biometric Info. Privacy Litig., 185 F. Supp. 3d 1155, 1172 (N.D. Cal. 2016); Patel, et al. v. Facebook, Inc., 2018 WL 1050154 (N.D. Cal. Feb. 26, 2018) (order on renewed motion to dismiss).

Many of the cases involve the use of employee fingerprint scanning to clock working hours. As recently as September 2018, the fast food chain Wendy’s, along with NCR Corporation, the provider of the biometrics technology it uses, was sued in Cook County by a putative class alleging claims based on Wendy’s handling of employee fingerprints. The complaint alleges that Wendy’s uses biometric clocks which scan the fingerprints of employees to track arrival, departure and use of Point-of-Sale and cash register systems, but without following the requirements of the state biometrics act.

According to the plaintiffs, Wendy’s violates the act by failing to provide the required written notice informing the employees/data subjects how the data is to be used and the duration for which it is to be stored. They also claim that employees never provided the required written release indicating consent to use their biometric information. Finally, plaintiffs allege that Wendy’s failed to provide the statutorily required, publicly available retention policy mandating destruction of biometric data within specified timeframes.

Other cases show the wide variety in the kinds of businesses that can become defendants in BIPA cases and the kinds of applications and uses of biometrics data that can arise. In Rosenbach, et al. v. Six Flags Entertainment Corporation, et al., 2017 WL 6523910 (Ill. App. Ct. Dec. 21, 2017), the allegations also involve fingerprinting, but in connection with the purchase of a season pass to a “theme park.” Sekura, et al. v. Krishna Schaumberg Tan, Inc., 2017 WL 1181420 (Ill. Cir. Ct. Feb. 9, 2017) was settled for $1.5 million in the face of allegations that a tanning salon violated the act when it disclosed fingerprint scans to an out-of-state third-party software vendor. See also, Santana, et al. v. Take-Two Interactive Software, Inc., 717 Fed. Appx. 12 (2d Cir. Nov. 21, 2017) (face scan used in video games; remanded with instruction to enter dismissal without prejudice based on failure to plead actual damages); McCullough, et al. v. Smarte Carte, Inc., 2016 WL 4077108 (N.D. Ill. Aug. 1, 2016) (lockers opened via fingerprint scan; motion to dismiss granted based on failure to plead actual damages).

One significant threshold issue that courts have resolved inconsistently is whether actual damages are necessary to confer standing under BIPA. Rosenbach, Santana and McCullough held that actual damages are a prerequisite, while Sekura, Monroy and Patel held the opposite.

Guidance from Federal Agencies

FTC

Although there is no federal law specifically covering biometrics as a unique source of data, the Federal Trade Commission issued a staff report in 2012 recommending certain practices with respect to facial recognition. Apart from a general exhortation to implement reasonable security, the FTC specifically suggests: a) preventing unauthorized scraping of consumers’ images, b) establishing and maintaining “appropriate” retention and destruction practices for such images and biometric data, and c) considering the sensitivity of information (with the example of not putting “digital signs equipped with cameras” in bathrooms or locker rooms).

The FTC also recommends that businesses using facial recognition “provide consumers with simplified choices and increase the transparency of their practices.” Again, this general guidance is supported by the example of the camera-toting digital signs, which the FTC finds “often look no different than digital signs that do not contain cameras.” In addition, “social networks” are supposed to provide notice independently of any general privacy policy and allow consumers to: a) turn off any facial recognition feature at any time and b) delete any previously collected biometric information (specifically with respect to tagged photos).

Two other scenarios are highlighted in the report as demanding prior, affirmative, express consent from consumers. One is where the intention is to use the data in a “materially different manner” than originally represented to the consumer. The other is where facial recognition would be used to “identify anonymous images of a consumer to someone who could not otherwise identify him or her.”

EEOC

Biometrics were featured in a recent Fourth Circuit opinion reviewing a case brought by the Equal Employment Opportunity Commission. EEOC v. Consol Energy, Inc., 860 F.3d 131 (4th Cir. June 12, 2017). A coal miner in West Virginia stated a religious objection to his employer’s use of biometric hand scanning for clocking employees in and out of work. The employer refused to make an exception or “accommodation” and the EEOC sued on the employee’s behalf.

The Fourth Circuit upheld a jury verdict in favor of the employee/EEOC and the U.S. Supreme Court denied certiorari. The employer was held to have violated Title VII by constructively discharging the employee instead of accommodating his religious beliefs. Clearly, employers that want to use biometrics face legal considerations even if they are not in a state with a statute specifically addressing biometric data.

State Legislation Outlook

It appears imminent that other states will enact laws imposing requirements on the collection and use of biometric data, although it is less clear how many will include a private right of action. As of September 2018, at least four states have pending legislative proposals that include such a right. These states are: Michigan, 2017 MI H.B. 5019, New Hampshire, 2017 NH H.B. 523, Alaska, 2017 AK H.B. 72, and Montana, 2017 MT H.B. 518. Bills have also been introduced in Connecticut and Massachusetts (which do not include private rights to sue).

The volume of BIPA litigation experienced by Illinois has given rise to efforts seeking to amend the law in favor of employers, by creating exemptions. For example, a bill introduced in February 2018 would reduce the scope of BIPA’s coverage to exclude cases where: a) the biometrics data is used solely for employment, human resources, fraud prevention or security purposes, b) there is no profit from the sale or lease of the data, or c) the data is secured at least as rigorously as other confidential or sensitive information. Still further amendments in this direction have been proposed, including amendments that would eliminate or restrict the private right of action.

E-Discovery Considerations

General Discoverability

Biometrics data is likely to arise with increasing frequency as a relevant source of ESI in litigation, even in cases which are not about violations of biometrics data protection statutes. The proliferating adoption of biometrics systems, for an increasing number and variety of uses, makes it inevitable that data from these systems will be available to provide additional evidence about the “who, what, where and when” of events underlying disputes. In this regard, biometrics systems are part of a larger phenomenon involving our ever-increasing interaction with, and the connectivity among, ESI generated and stored by systems, services, applications, networks and devices of all kinds. The intersection of this growing number of data points will make it possible to reconstruct past events with greater and greater precision and depth of understanding.

In terms of admissibility, determining issues regarding the accuracy and integrity of biometrics systems and data will be of critical importance, but relevance will be clear in many cases. Now that proportionality has been enshrined with relevance as a primary touchstone of discoverability, the accessibility of biometrics data may be a focus of discovery disputes. Depending on the type of biometrics data involved, litigation adversaries may be arguing about whether access to proprietary software for reviewing and analyzing the data must be provided or whether the data must be converted into formats that can be utilized in commercially available systems. Disputes about what the data means may cause controversy over whether confidential information about how the systems work must be disclosed, or whether there is some other way to test or validate results. The increasing adoption of biometrics will raise all these questions and more.

The Role of Third-Parties

Possession, custody and control issues are also likely to play a role in e-Discovery disputes regarding biometrics data. In many if not most cases, software is licensed and/or hardware leased from third parties. Storage of data generated by biometrics systems is likely to involve third parties, just as it does for other sources of large volumes of data. But as we have seen with other types of ESI, the fact that a third party has some role in effectuating preservation or discovery has no bearing on the litigating party’s obligations to preserve and produce. Rather, businesses using biometrics systems would be well-advised to consider such obligations in developing agreements with vendors and service providers as well as in their choices of technology and implementation architectures.

Privacy Considerations

Finally, the privacy implications of biometrics are likely to play out in the same way as other issues involving sources of “private” ESI, for example, social media account activity. This means that privacy objections are unlikely to overcome discoverability where the information sought is relevant. However, recent cases have exhibited a tendency to include privacy considerations as part of the proportionality prong of the discoverability analysis.

It is easy to imagine how many kinds of so-called “behavioral” biometrics may be highly sensitive, but most physical characteristics are not. As noted above, there are exceptions, such as how retinal scans can indicate pregnancy or health problems. Even in these cases, however, a protective order is more likely to be viewed as an appropriate resolution of concerns than an exemption from disclosure where the data is otherwise relevant and the burden not undue.

Conclusion

Biometrics systems are making their way into every corner of our lives. From our personal devices and information security to our places of work, biometrics systems are being developed and put into use. Laws directly regulating how these systems are used and what happens to the data they generate are developing on a state-by-state level, with the corresponding lack of uniformity and geographic impact on litigation that attorneys in the United States have come to see as either a bug or a feature, depending on which “side of the v” their clients tend to occupy. To the extent there is any certain prediction about the near future of biometrics data in litigation of all types, it is this—biometrics data is coming to an e-Discovery dispute near you, sooner than you think.

[1] Adam Cohen (Managing Director, Berkeley Research Group) advises businesses on managing legal compliance and security risk associated with information technology. An ex-IP litigator turned tech consultant, he holds professional certifications in cyber-security (information systems security, cloud security and ethical hacking) along with a law degree from Duke. His published work on electronic evidence has been cited in several landmark federal court opinions and he has been teaching law school courses continuously for more than a decade.

