Concerns about using facial analysis at events: part two

My January 15, 2024 article “Concerns about using facial analysis at events” generated much discussion. (See, e.g., this thread on LinkedIn, which has, at the time of writing, four thousand impressions.)

Five days later, Panos Moutafis, co-founder & CEO of Zenus, the “ethical facial analysis” company, responded.
I find his response inadequate, and this post explains why. I’ve included portions of Moutafis’s response, quoted in red, together with my comments. I conclude with a summary if you want to skip the details.

Here we go.

After an introduction (“Ignorance can be bliss, but it can also be dangerous.”), Moutafis begins:

Ethical AI by Zenus: A Summary

“Data from our ethical facial analysis service cannot be used to identify individuals. This is not an opinion. It is an indisputable fact.”

If the “Zenus AI” system is, in fact, completely unhackable, this statement may well be true. But it’s misleading because it does not address attendee privacy concerns. Why? Because, as I explained in my original post, combining Zenus facial analysis data with other attendee identification technology allows event owners to associate Zenus data with individual attendees.
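To make the mechanism concrete, here’s a minimal sketch of the kind of join an event owner could run. Everything in it is hypothetical: the schemas, the field names, and the matching rule are mine, not Zenus’s or any vendor’s. The point is only that two feeds, each arguably harmless alone, become per-person data when matched on time and place.

```python
# Hypothetical illustration only; no real vendor's schema or API is implied.
from dataclasses import dataclass

@dataclass
class SentimentReading:       # "anonymous" facial-analysis output
    timestamp: float          # seconds since session start
    zone: str                 # camera zone, e.g. "Room A, front left"
    sentiment: float          # aggregate score in [-1.0, 1.0]

@dataclass
class BeaconPing:             # separate attendee-identification feed
    timestamp: float
    zone: str
    attendee_id: str          # badge/beacon ID tied to a registration record

def attribute(readings: list[SentimentReading],
              pings: list[BeaconPing],
              window: float = 2.0) -> dict[str, list[float]]:
    """Attach 'anonymous' sentiment to named attendees by matching time
    and location. The smaller the camera zone, the closer this gets to
    a per-person emotional record."""
    per_attendee: dict[str, list[float]] = {}
    for r in readings:
        for p in pings:
            if p.zone == r.zone and abs(p.timestamp - r.timestamp) <= window:
                per_attendee.setdefault(p.attendee_id, []).append(r.sentiment)
    return per_attendee

# Toy usage:
readings = [SentimentReading(12.0, "Room A, front left", -0.6)]
pings = [BeaconPing(11.5, "Room A, front left", "attendee-4711")]
print(attribute(readings, pings))
# {'attendee-4711': [-0.6]}  <- "anonymous" sentiment, now tied to a person
```

Nothing in this join requires breaking into Zenus’s system; it only requires a second, off-the-shelf data feed.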

Moutafis now admits this is possible: his response includes statements about how the Zenus system should be used. As far as I know, Zenus has not made these statements publicly before.

“If someone wants to use other technologies to identify individuals and combine the data [emphasis added], they need to obtain explicit consent first.

This is true of hotels, convention centers, event organizers, technology companies, etc. Otherwise, they are exposing themselves to liabilities.

A legal review takes place before starting to use a new service in this manner. People who work in the corporate sector and associations are familiar with these processes. This is not the Wild Wild West.”

The crucial phrase here is “and combine the data”. Moutafis is saying that when combining attendee tracking data with data supplied by the Zenus system, attendees must provide explicit consent. That means attendees must be informed about this in advance. And they must give explicit consent for event owners to use real-time continuous data from Zenus’s system to provide additional information on each attendee.

In my original post, I noted that Moutafis tries to put all the responsibility for such consent on the event owner and/or supplier of the attendee identification technology rather than his company. We’ll see why he needs to do this shortly.

GDPR and Data Privacy Regulations

“Different regions and implementations have different requirements.

The European Data Protection Board, in particular, has clearly noted that facial analysis alone does not fall under Article 9.

See section 80 in the Guidelines adopted on January 29, 2020 [link].”

“However, when the purpose of the processing is for example to distinguish one category of people from another but not to uniquely identify anyone the processing does not fall under Article 9.”

“See section 14 in the Guidelines adopted on April 26, 2023 [link].”

“The mere detection of faces by so-called “smart” cameras does not necessarily constitute a facial recognition system either. […] they may not be considered as biometric systems processing special categories of personal data, provided that they do not aim at uniquely identifying a person […].”

“In simple words. Are you using the service alone? Great.

Are you combining it with identifying information? Obtain consent or face the consequences. The pun is totally intended.”

This section restates that the Zenus technology satisfies European Data Protection Board guidelines only when used in isolation. It confirms that if clients combine Zenus analytics “with identifying information”, “you” must “Obtain consent or face the consequences”. Again, the “you” is any entity but Zenus.

In addition, to bolster his case, Moutafis selectively quotes section 14 of the Guidelines adopted on April 26, 2023. Here’s the entire section 14, with the portions Moutafis omitted in bold:

“The mere detection of faces by so-called “smart” cameras does not necessarily constitute a facial recognition system either. While they also raise important questions in terms of ethics and effectiveness, digital techniques for detecting abnormal behaviours or violent events, or for recognising facial emotions or even silhouettes, they may not be considered as biometric systems processing special categories of personal data, provided that they do not aim at uniquely identifying a person and that the personal data processing involved does not include other special categories of personal data. These examples are not completely unrelated to facial recognition and are still subject to personal data protection rules. Furthermore, this type of detection system may be used in conjunction with other systems aiming at identifying a person and thereby being considered as a facial recognition technology.”

Wow! Moutafis omits the “important questions in terms of ethics and effectiveness” raised by facial analysis. And, tellingly, he cuts the last key sentence entirely:

“Furthermore, this type of detection system may be used in conjunction with other systems aiming at identifying a person and thereby being considered as a facial recognition technology.”

This, of course, is exactly what Moutafis admits happens if clients use Zenus technology with any other tech that identifies individuals.

So the European Data Protection Board guidelines say that, under these circumstances, Zenus’s system effectively becomes a facial recognition system.

That is the opposite of what Moutafis implies. I’d describe this section of his response as deliberately misleading.

Our AI badge scanning reads attendee IDs

I have little to say about this. Badge-scanning tech is common at meetings. If attendees give informed consent and can opt out of badge scanning, I don’t have a problem with it. But this is a good place to point out the significant difference between technology (badge scanning) that identifies attendees only at discrete, attendee-determined points in time, and technology (Zenus plus attendee identification data from a separate system) that continually accumulates attendee data the entire time attendees are within sensor range.
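A back-of-the-envelope calculation shows the scale of that difference. The rates below are my assumptions, not vendor specifications, although Moutafis himself describes “analyzing a room of people multiple times per second”:

```python
# Illustrative comparison; the rates are assumptions, not vendor specs.
badge_scans_per_day = 10        # discrete, attendee-initiated events
samples_per_second = 2          # "multiple times per second"
event_hours = 8

continuous_readings = samples_per_second * event_hours * 3600
print(badge_scans_per_day)      # 10 data points per attendee-day
print(continuous_readings)      # 57600 readings per sensor-day
```

Ten deliberate data points versus tens of thousands of passive ones is not a difference of degree; it is a difference in kind.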

Legal vs Moral Considerations. Consent vs Notice

“People often conflate face recognition (identification) with facial analysis (anonymized data). In a similar way, they conflate legal and moral considerations.”

That’s quite a comparison! It says that confusing the definitions of two types of technology is similar to confusing the legal and moral concerns raised by their use.

“It might not be legally required to provide notice about the use of facial analysis in many settings. But we still think it is morally a good idea to do so in the spirit of transparency and education.

Therefore, we ask our clients to post signage on-site, talk about the use of our service in their marketing communications, and include it on their online terms and conditions.”

According to the people I’ve spoken to who attended the association meetings described in my original post where Zenus technology was used, there was no “signage on-site”, no “talk about the use of our service in their marketing communications”, and no notification in the meetings’ “online terms and conditions”. Perhaps the folks I talked to overlooked this “advance notice”, or perhaps these meetings were the exceptions rather than the rule. But from this limited data, it doesn’t seem that Zenus’s clients pay attention to what Zenus says it asks them to do.

“What about consent versus notice? Advance notice we love. Consent defeats the purpose of anonymity.

How could one exclude a person from the anonymous analysis (if they opt-out) without identifying them? They cannot.”

Finally, we get to why Zenus continues to insist that its technology does not require consent, while avoiding mention of the fact that when it is used in conjunction with attendee identification technology it does require consent. There is no way for Zenus data to remain anonymous if attendees are given the right to withhold consent, i.e. to opt out of being included in Zenus’s aggregated analytics. That would require the identities of attendees who have opted out to be injected into Zenus’s internal systems, which would then need to perfectly exclude them from the data fed to clients. This obviously can’t be done in a way that satisfies privacy laws. Consequently, Zenus’s whole “no consent needed” house of cards collapses! The sketch below makes the paradox explicit.
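Here is a minimal sketch, entirely mine and not a description of any real system, of why honoring an opt-out requires identification. To skip the people who opted out, every detected face must first be compared against a gallery of opted-out faces, and that comparison is exactly the biometric matching an “anonymous” system claims never to perform.

```python
# Hypothetical sketch of the opt-out paradox; no real system or API implied.
import math

def embed(face):
    """Stand-in for a biometric template extractor. A real system would
    compute a face embedding here, i.e. biometric data about the person."""
    return face  # toy: in this sketch a "face" is already an (x, y) vector

def matches_gallery(face, gallery, threshold=0.5):
    """Compare one face's template against every opted-out template.
    This comparison step is facial recognition by definition."""
    t = embed(face)
    return any(math.dist(t, embed(g)) < threshold for g in gallery)

def process_frame(faces, opted_out_gallery, analyze):
    """To honor opt-outs inside an 'anonymous' pipeline, every face must
    first pass through the identification step above."""
    return [analyze(f) for f in faces
            if not matches_gallery(f, opted_out_gallery)]

# Toy usage: two faces in frame, one belongs to someone who opted out.
frame = [(0.1, 0.2), (0.9, 0.8)]
opted_out = [(0.11, 0.19)]
print(process_frame(frame, opted_out, analyze=lambda f: "sentiment"))
# -> ['sentiment']  (only the non-opted-out face was analyzed)
```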

Aggregate vs Individual Analysis

“The chances that one would analyze a person’s face or body language and infer their psychological state are slim.”

This is a strange statement. Human beings have evolved to be exquisitely sensitive to other humans’ psychological states. Most of us do such analysis unconsciously every day, whenever we are together with other people. We look at someone’s face or body language and think “They look upset/happy/worried/tired”. We might well say to them: “Are you OK?”, “Wow, you look happy!”, “You look worried about something”, “Want to take a rest?”, etc. I’d say that inferring the emotional state of someone we’re with is default behavior, rather than a slim probability.

Of course, this statement allows Moutafis to pivot to his marketing pitch:

“…analyzing a room of people multiple times per second and combining this with survey and attendance data can be insightful.”

Because that’s what Zenus has designed its technology to do.

Concluding Remarks

“Our ethical facial analysis brings organizations valuable and actionable data without crossing the line into collecting personally identifiable information.”

One more time. When you don’t include any meaningful safeguards to prevent your data from being combined with data from other systems that clients are free to employ, clients can easily use Zenus technology to “[cross] the line into collecting personally identifiable information”.

“It is a rare example of technology using restraint. It is an example of building proactive privacy safeguards by default. It is an example to follow.”

Sadly, it’s not. While I admire the efforts that Zenus has made to create an “ethical facial analysis service”, as I’ve now outlined in these two posts, the company has not succeeded.

Conclusions

Zenus claims that its system, when used in isolation at an event, doesn’t supply data about individual attendees. Maybe so. But when it is used in conjunction with additional technology (call it XYZ) that identifies individual attendees, event owners can use Zenus data to create a continually updated, real-time dataset of analytics on identified individual attendees. Zenus deflects any legal or ethical responsibility for this surveillance by saying it’s the event owner’s and/or XYZ’s responsibility to inform attendees and obtain their explicit consent to be tracked and to have their facial analysis used.

Crucially, Moutafis says two contradictory things.

  • The use of Zenus technology doesn’t need explicit consent.
  • The combination of Zenus technology with other attendee identification technology does require explicit consent. But that’s the legal and ethical responsibility of the event owner or the tracking technology company. Not Zenus.

Because Zenus does not require its clients to forswear using additional attendee identification technology, the company faces a fatal contradiction. Why? Because, as Moutafis admits, when attendees are allowed to opt out (which is their right under privacy laws), there is no way for the Zenus technology to work without excluding the attendees who have opted out. And to do that, the Zenus system must be able to identify individual attendees! Consequently, Zenus’s whole we-don’t-identify-individuals, no-consent-is-needed house of cards collapses!

Two unanswered criticisms from my original post

First, Moutafis was quoted as saying publicly that “some of his clients…will monitor [using Zenus AI] in real time and if a speaker is killing the mood they will just get him off the stage”. I said I was pretty sure that most event professionals would agree this is a highly inappropriate way to use Zenus’s technology. Or, as the Harvard Business Review put it, “AI Isn’t Ready to Make Unsupervised Decisions”. Moutafis did not respond to this.

Second, it’s important to note that Moutafis didn’t respond to a key critique of Zenus technology that I shared in my original post.

Namely, how useful is Zenus’s technology anyway? Kamprath and I gave examples of how the most impactful sessions at meetings—impactful in the sense of changing future behavior rather than entertaining an audience—are often somewhat uncomfortable for participants at the time. A session is not necessarily a “success” just because people express “positive sentiment”.

One more thing…

OK, that’s two thousand more words from me on this topic, on top of four thousand last week. Hopefully, that’s enough for now. But I’d be happy to meet in a public moderated discussion with Zenus. If anyone would like to host such a discussion, don’t hesitate to get in touch!
