
Doan v Clearview Inc and the Identification of Class Members


The Federal Court recently released its decision in Doan v Clearview Inc, 2023 FC 1612, in which it distinguished between a situation where there is no basis in fact for showing that two or more class members can be identified for the purpose of certifying a proceeding as a class action and a situation where identifying those class members is merely difficult. Significantly, the case involved a company that allegedly ingested publicly available photographs from the internet to support its technology, a circumstance that may become more commonplace with the ever-increasing presence of Artificial Intelligence (“AI”) in all facets of everyday life. Continue reading to learn how the Court’s decision in this case could have long-lasting effects on the ability of individual plaintiffs to have actions against AI-related companies certified as class actions.

Background

The case began when the plaintiff filed an action against the defendant, a US-based corporation that provides facial recognition and identification services. She alleged that the defendant collected, copied, stored, used, and in some cases disclosed publicly available photographs found on the internet without her knowledge or consent. The portions of her pleadings relating to privacy were struck, ultimately leaving her with only a copyright infringement claim.

The present decision arose because the plaintiff sought an order certifying the proceeding as a class action, introducing materials showing that other individuals’ photographs had been used by the defendant in the same way as her own.

Court Analysis

The Court focused its analysis on whether there was an identifiable class of two or more persons. It found that there was not enough evidence to show that the defendant could actually identify the individuals in the photographs, or that those individuals could self-identify, such that a class of two or more persons existed.

The Court made it clear that there was no need to actually identify all of the class members at this stage of the process, and that the level of difficulty the defendant would face in retrieving the relevant information was not the issue. Here, however, the problem went beyond mere difficulty: the defendant simply did not have enough information to identify the class members from the photographs at all. Identification would have required copyright ownership data and location data for the photographs, neither of which was shown to be available to the defendant.

The plaintiff had also suggested that people should be able to ask the defendant for a report on themselves or others in order to discover whether they were class members. This suggestion was likewise incompatible with the defendant’s capabilities, but more importantly, the Court pointed out that it would unacceptably transform the Federal Court’s opt-out class action scheme into an opt-in scheme.

The Court ultimately concluded that the class may exist in the abstract, but that the plaintiff had failed to establish that the class members can be known either now or in the future. The motion for certification of the proceeding as a class action was therefore dismissed.

Takeaways

The confirmation that a proceeding cannot be certified as a class action where there is no evidence that class members can be identified could prove meaningful to the future of cases related to AI training. This case specifically involved photographs allegedly ingested to support a technology in a way that resembles how many AI systems currently function and will continue to function. The decision could limit individual plaintiffs’ ability to convert cases involving themselves and other unidentified individuals into class actions where the AI technology using the copyrighted material is not capable of identifying the owners of the works or the subjects of the photographs. This raises the question of whether AI developers could, or should, build safeguards that allow them to better identify information about the works their technology uses.

