States are increasingly passing biometric laws to prevent companies from collecting and disseminating data such as fingerprints and facial scans without permission.
Illinois was the first to pass such legislation – the 2008 Biometric Information Privacy Act (BIPA) – and nine other states, including New York, Massachusetts, and Maryland, have recently introduced biometric legislation that gives individuals a private right of action with statutory damages.
Class actions are on the rise with some eye-watering payouts. In 2021, Facebook (now Meta) agreed to pay out $650 million to settle a lawsuit relating to data privacy claims, while Google paid out $100 million last year, and ByteDance, the parent company of TikTok, $92 million.
But it’s not only social media giants and search engines that are being affected – biometric information is collected in the workplace, by health providers, on mobile apps, and at schools. Other firms in the firing line include McDonald’s; amusement park operator Six Flags Entertainment; school bus service provider First Student; and automobile accessories manufacturer WeatherTech. In fact, Zywave’s data set of more than 1 million historical loss records includes over $1.3 billion in BIPA settlements to date.
Claims are usually brought under the personal and advertising injury insuring agreement of commercial general liability (CGL) policies, and many insurers have gone to war in the courts, filing declaratory judgement actions seeking rulings that they are not obliged to cover the defence costs of clients being sued for breaching BIPA.
Denial arguments include that BIPA claims are not “personal injuries” as there is no “publication” of material violating any right of privacy; that BIPA is a statutory scheme that falls within the statutory exclusion of the general liability policy; that the employment-related practices exclusion is applicable; and that other cyber and data disclosure exclusions apply.
Explosion of Class Action Litigation
Since the enactment of BIPA, there has been an explosion of class action litigation for non-breach privacy violations, with insurers disputing insureds’ claims for defence costs. The legislation came onto the insurance industry’s radar in May 2021, when the Illinois Supreme Court ruled on the seminal Illinois insurance coverage case for underlying actions alleging violations of BIPA.
The Supreme Court held the insurer had a duty to defend because the disclosure of fingerprint data to a third party constituted “publication” as required for “personal injury” coverage under the general liability policy. The Supreme Court also ruled coverage was not excluded by the policy’s distribution of material in violation of statutes exclusion, because that exclusion applied to statutes regulating the method of communication, not the dissemination of information.
In July this year, the US District Court for the Northern District of Illinois denied an Illinois insurer’s motion for judgement on the pleadings, holding that the insurer had a duty to defend its insured in an underlying lawsuit that alleged the insured violated BIPA by requiring employees to scan their fingerprints to track their time and unlawfully collecting and disclosing their fingerprint data.
Following the Illinois Supreme Court findings in May 2021, the District Court found that the underlying lawsuit similarly alleged the improper disclosure of employees’ fingerprint data to at least one third party in violation of BIPA, alleging the plaintiff suffered an injury, including mental anguish. The District Court held that the underlying lawsuit fell within or potentially within the scope of the Illinois insurer’s policy.
The insurer argued that three exclusions precluded coverage: the recording and distribution of material or information in violation of law exclusion; the access or disclosure of confidential or personal information exclusion; and the employment-related practices exclusion. But the District Court concluded that none of the exclusions unambiguously precluded coverage.
In June this year, an insurance coverage dispute arose from two class action lawsuits filed against technology firm Wynndalco Enterprises, alleging that it violated BIPA when it licensed and sold access to Clearview AI’s software to Illinois customers, including the Chicago Police Department. Clearview AI is an artificial intelligence firm that creates facial recognition software and has assembled a database of facial-image scans.
The Seventh Circuit affirmed the District Court’s judgement on the pleadings, holding that an insurer had a duty to defend two class action lawsuits against Wynndalco. The court determined the language of the catch-all provision of the statutory violation exclusion to be intractably ambiguous and ruled the insurer had a duty to defend its insured.
However, not all the judgements have found against insurance companies. In September last year, an Illinois insurer won its case against a food manufacturer when the US District Court held there was no coverage for a former employee’s lawsuit alleging violations of BIPA for using fingerprint biometrics.
CGL policies are not the only area where coverage has been litigated. Both employment practices liability and cyber policies may also provide coverage for BIPA claims. In April, the Appellate Court for Illinois said a cyber policy issued by an insurer created a duty to defend its insured, a technology company, which was sued by a truck driver for violating BIPA by unlawfully collecting fingerprint data as part of a railroad security gate system.
Facial Recognition in Spotlight
Facial recognition is a relatively new controversy that insurers must take into consideration when writing general liability policies. It came under the spotlight in March 2023 when a New York appeals court ruled that Madison Square Garden could ban lawyers suing it from attending events at its venues. MSG Entertainment used facial recognition to stop a lawyer, whose firm was involved in litigation against the company, from entering Radio City Music Hall as she tried to attend the Christmas Spectacular show with her daughter and her daughter’s Girl Scout troop. The ruling overturned a preliminary injunction that had blocked the “lawyer ban.”
Biometric litigation does not stop at facial recognition and fingerprints: DNA has also entered the legal arena. In August 2023, class certification was granted to Illinois consumers whose genetic test results were shared with third parties without their consent by Sequencing, a DNA analysis company. The court found the company had violated the rights of 1,550 people under the Illinois Genetic Information Privacy Act.
Sequencing collects and analyzes its customers’ DNA after they either send the company their DNA directly or upload the results of a DNA test taken by a third party, such as 23andMe or Ancestry.com. Sequencing uses that information to assess the customers’ genetic code and produce reports, most of which are provided by third-party developers. As soon as a customer buys a report, their personal and genetic information is automatically transmitted to the corresponding third-party developer. The plaintiff claimed Sequencing did not inform him that his genetic information would be shared and that he never consented to the disclosure of that information to anyone.
Biometric legislation and litigation now represent the cutting edge of personal protections. The continued expansion of these protections may obligate insurers to assume liabilities they didn’t anticipate when developing and pricing their products.
As the outcome of biometric litigation remains unpredictable, with courts differing in their interpretation of coverage terms, insurance companies need to review their policies and track coverage decisions. While many may attempt to exclude the legal costs of defending BIPA and GIPA violations, the threat of expansive interpretation of policy wording remains a concern.
While the litigation landscape remains uncertain, insurers need to keep up to date with privacy laws and ensure they are compliant to minimize exposure.