After more than four years of litigation, Facebook agreed to resolve a class-action privacy lawsuit stemming from its use of facial recognition technology. As part of the settlement, the social media giant has agreed to pay a total of $550 million to the participants in the case.

The lawsuit has significant ramifications for the right to privacy in the age of social media. Facial data collection and the collection of other forms of biometric data are likely to be front and center in the privacy debate going forward.

This white paper will review the background on this historic lawsuit and the settlement that came with it. We will also consider the Illinois law used to bring about this result, and how other states deal with these privacy issues. Finally, we will review the power of facial recognition data to explain why companies are after this information. A better understanding of your biometric privacy rights could help you ensure that companies—large and small—do not use your identity without your consent.

A Historic Settlement

At $550 million, the settlement in Patel v. Facebook, Inc. is one of the largest privacy rights settlements in history. The lawsuit began in 2015 when several Facebook users filed a class action against the company. The class-action lawsuit alleged a violation of the Illinois Biometric Information Privacy Act (BIPA), the first statute of its kind in the United States.

The complaint alleged that Facebook scanned the photos uploaded to its platform and captured each subject's facial recognition data. According to the company, this data was collected to enable accurate suggestions when "tagging" another person in a photo. The software scans the face of each person in an uploaded photo and generates a "template." Facebook then uses these templates to match your friends' facial scans against the image, giving you simple options to tag them in the picture.

Facebook says it also uses facial recognition for security purposes. Since 2018, the company has steadily adjusted its approach to the technology. For new users, the feature is now opt-in only rather than enabled by default.

The lawsuit wound its way through a California federal court for years, with Facebook fighting it every step of the way. The plaintiffs cleared a major hurdle in 2018 when a federal judge certified the class and allowed the suit to proceed on behalf of a class of Illinois Facebook users.

By resolving the case, Facebook headed off a potentially large jury verdict. BIPA provides statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation. While those figures might sound low, there are a few things to remember:

First, Facebook has more than 2.4 billion users. While most of them are not in Illinois, a growing share could live in jurisdictions that adopt statutes similar to BIPA. Second, the courts have not yet weighed in on how these penalties accrue.

There is little guidance in the statute on what constitutes a single violation. If a court were to find that every photo a person uploads counts as a separate violation, the potential damages could be astronomical: at $1,000 per negligent violation, a single user with 1,000 uploaded photos could, in theory, represent $1 million in statutory exposure.

Other BIPA Cases

The Facebook lawsuit produced the first settlement under BIPA, but it was not the first lawsuit based on the statute. Two earlier lawsuits raised BIPA claims, with mixed results.

Google was the subject of a BIPA lawsuit in 2016. According to the pleadings, a Chicago woman filed suit after a Google Photos user took a series of 11 images of the plaintiff on an Android smartphone. The user then uploaded those images to Google Photos, where Google scanned the plaintiff's face and stored her biometric data. The plaintiff did not use Google Photos and had never consented to have her data collected or stored. In 2018, the court dismissed the lawsuit after finding that the plaintiff had not suffered a concrete injury from the collection of her data.

The other BIPA-related lawsuit involved the Snapchat phone application. In 2016, attorneys for a Chicago man filed suit in a California court alleging a violation of BIPA. According to the pleadings, Snapchat acquired a facial recognition software company in 2015. After the acquisition, the company began measuring users' faces with facial recognition software. The data was collected and stored to let users apply certain "lenses" that could alter images taken with the app. At no point did Snapchat inform users that their data was being collected, and at the time of the lawsuit the company had no guidelines on how long it would store this information. The lawsuit was removed to federal court in 2016 before ultimately going to arbitration.

How Other States Address Facial Recognition Data Collection

To date, Illinois is the only state whose biometric privacy law provides consumers with a robust cause of action. Given the success of this lawsuit, however, other states may well follow.

Other states are making a push to protect consumer privacy rights, though the largest example to date may have missed the mark. California recently adopted the California Consumer Privacy Act (CCPA), widely considered the broadest, most stringent privacy law in American history. It gives consumers valuable insight into what personal data companies collect about them and how that data is used. Where it lags behind the Illinois law is in providing a private right of action for the unauthorized collection of biometric data.

The CCPA does allow consumers to file a lawsuit, but that right is much narrower than under BIPA. The ability to sue is limited to cases where a data breach occurred, and even then a consumer has a cause of action only if the breach resulted from negligent security on the part of the company. A class action similar to the Facebook lawsuit would not be possible under the CCPA as it stands.

For consumers in other states, holding Facebook accountable for the unauthorized collection of biometric data could be challenging. Some states do not allow these types of lawsuits at all, while others require a plaintiff to show that Facebook caused them material harm. It remains to be seen whether this settlement will push other jurisdictions to adopt similar measures.

Why Companies Want Your Biometric Data

Facebook has bigger plans for its enormous database of facial recognition data than powering a photo-tagging feature. As with most things on the platform, the company ultimately intends to monetize this data. In fact, it already holds multiple patents covering the use of facial data to target advertising. While the company says it has this technology ready, it maintains that it does not currently use any biometric data for advertising purposes.

The potential for facial recognition data to impact Facebook’s marketing efforts is immense—especially given the enormous amount of data Facebook already collects about each user. Facial recognition software, coupled with the data the platform collects about user preferences, could dramatically alter the way companies serve ads to consumers.

The use of this data goes far beyond the confines of Facebook. Facial recognition could be a powerful marketing tool in countless settings. Walgreens, for example, is taking early steps to use facial recognition software in its cooler section. These "smart coolers" could identify a shopper who appears overheated from a workout and display an ad for a cold drink. The potential for these ads multiplies if the cooler can identify a shopper's face and access data from Facebook about their product preferences. The value to a company of having its product automatically marketed to its biggest fans based on facial recognition data is difficult to overstate.

Ramifications to Data and Facial Recognition Privacy

Of course, companies and marketing executives are not the only parties interested in facial recognition data. Law enforcement has a keen interest in using facial recognition software to identify suspects or track down witnesses. In January 2020, the Chicago Police Department came under fire for using a facial recognition service that had scraped millions of images from social media sites, including Facebook, to identify suspects. While police in the city claim the system is used only to identify unknown individuals implicated in a specific crime, significant privacy questions remain.

Ultimately, law enforcement’s use of facial recognition could become widespread. Already, many agencies are studying the possibility of large-scale facial recognition through the extensive use of public surveillance. Should that day come, law enforcement will undoubtedly have an interest in obtaining Facebook’s facial recognition data.