Technology

The Ethics of Facial Analysis for Events


Skift Take

The rise of AI-powered facial analysis at live events has made some uneasy. Is there a reason to be concerned?

Would you feel comfortable with a sanctioned robot stalker at your event? While the situation is not as Orwellian as that, it’s important to consider the potential benefits and pitfalls of using facial analysis technology.

Closed-circuit television (CCTV) cameras are in virtually every event venue. You should assume you’re on camera if you’re at an event. But while CCTV systems are in place for surveillance, the cameras used for facial analysis are watching for a different reason: AI-powered heatmaps and sentiment analysis.

At a time when data privacy concerns are paramount, and AI is imitating people well enough to scam their relatives, some attendees are wary of having AI track their facial expressions on camera.

Should event planners or their attendees be concerned, and what can they do to get people on board?

Facial Analysis or Facial Recognition?

One source of the uneasy feeling some have expressed is confusion around what facial analysis does and how it treats your data. Part of the challenge may be that facial analysis is often conflated with facial recognition.

Facial recognition tries to identify you specifically (recognize you) and may record or transmit your unique facial data to do so. 

Facial analysis, by contrast, pools your facial data with that of everyone else in the group being analyzed, records only the resulting aggregate statistics, and discards the rest, so nothing can be attributed to you later.
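To make the distinction concrete, here is a minimal, hypothetical sketch of the aggregation-only approach in Python. The detect_faces and classify_sentiment helpers are placeholders standing in for real vision models, not Zenus' actual pipeline; the point is what never happens: no embedding is computed, no face crop is stored, and no identity is attached to any count.

```python
from collections import Counter

def detect_faces(frame):
    """Placeholder face detector: returns a list of face crops.
    A real system would run a vision model here; in this sketch,
    a 'frame' is simply a list of face crops."""
    return frame

def classify_sentiment(face):
    """Placeholder classifier mapping a face crop to a coarse label.
    In this sketch, each 'crop' is already a label."""
    return face

def analyze_frame(frame, totals: Counter) -> None:
    """Update aggregate counts for one video frame, then discard it.

    Note what is NOT done: no biometric signature is created, no
    frame is stored or transmitted, and no count is tied to a person.
    """
    for face in detect_faces(frame):
        totals[classify_sentiment(face)] += 1
    # 'frame' and each 'face' go out of scope here; only tallies survive.

if __name__ == "__main__":
    totals = Counter()
    # Simulated frames: each face has been reduced to a sentiment label.
    for frame in [["positive", "neutral"], ["positive", "negative", "positive"]]:
        analyze_frame(frame, totals)
    print(dict(totals))  # e.g. {'positive': 3, 'neutral': 1, 'negative': 1}
```

A recognition system, by contrast, would compute and retain a per-face signature precisely so the same person can be matched later, which is what triggers the consent questions discussed below.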

Stop Watching Me, Please

Facial analysis was recently used at PCMA Convening Leaders 2023, and some participants who didn't realize they were being analyzed were unhappy. Greg Kamprath took to LinkedIn to explain his misgivings, sparking a debate with Zenus, the company that supplied the technology.

At the heart of Kamprath’s discomfort is the idea that event attendees are being surveilled without consent.

“I was surprised a few minutes in when they told us we were being watched at that moment by cameras, which were analyzing our age, gender, and emotions,” writes Kamprath. “That is not something I’m interested in having done to me.”

Legal Consent Not Required

Under most data-privacy legislation, consent is required if the data being collected can be used to identify you personally. This is referred to as "personally identifiable information," or PII. While PII normally includes things like your name, address, and email, it also covers biometric templates or signatures derived from your face.

Facial analysis systems don't require consent because they create no biometric signatures and neither store nor transmit video containing people's faces.

But Kamprath contends that any technology deployed to watch and track you requires consent ethically, if not legally.

“What if the event organizer hired a person to stare at you and write down how you felt from moment to moment?” asks Kamprath. “Even if they said ‘don’t worry, I’m not writing down your name,’ I think many people would have a problem with that.”

“This is not analogous with the way facial analysis works,” says Panos Moutafis, CEO and co-founder of Zenus. “We’re not looking at how one person is reacting; we’re looking at how everyone is reacting.”

Facial analysis is less like hiring someone to stare at you and more like hiring someone to stand at the front of the room to tally how many people are reacting like you.

Assessing Audience Reactions

This is what the IMEX team did before employing Zenus' facial analysis system at IMEX America last year, using the technology to assess the strategic efficacy of new experiential investments in specific locations.

“Traditionally, we would earmark some time in which we would just stand there and observe how these things were being received,” says Oliver Bailey, interaction designer at IMEX. “It wasn’t very efficient, but it was the only way.” Facial analysis solved that problem by allowing Bailey’s team to observe the crowd uniformly and at scale without anyone having to be there.

But whatever the business benefits, Kamprath maintains that event organizers are ethically required to inform people and accommodate those who object to being analyzed by, for example, designating an area where they can sit without being scanned.

A Slippery Slope

Moutafis agrees that attendees should be informed when facial analysis is being used, but more to build trust than to ward off any liability from a privacy or security perspective.

One reason for the lack of trust is the potential for a slippery slope. Kamprath's concern is that facial analysis is a gateway to greater surveillance, and that normalizing the infrastructure creates opportunities for bad actors to abuse it.

“This is a valid concern,” admits Moutafis, but it is a reality for any technology. The real question is how to prevent abuse or misuse, and that is where legislation comes in.

“There is a solid legal framework for what is considered PII and what requires consent,” says Moutafis. Any video service that stores or transmits personal biometric data that can be attributed to an individual requires consent.

The logistical impracticality of obtaining consent from everyone on a tradeshow floor is one reason Zenus has sunset its facial recognition services, despite high demand, as part of its rebrand as an “ethical AI” operator.

Consent is only one issue. Even if everyone opts in, storing those biometric signatures in a database is a security risk. “If there is a breach, those signatures can be compared with images on Facebook, LinkedIn or other social media to find a match,” explains Moutafis.

Zenus’ clients have their own checks and balances as well. This process usually begins with an information security review that evaluates Zenus’ data security certifications, policies, and infrastructure. The next step for many (especially larger) clients is to submit their Zenus deployment to their own ethics boards for approval.

Ethical Considerations

At IMEX, ethics is part of the discussion from inception through securing financing for any new technology deployment. “I believe it’s important for organizers to consider what they want to measure and why, rather than deploying the technology indiscriminately, without a hypothesis,” says Bailey, adding that limiting the data collected to the stated purpose of the deployment made the ethical implications relatively straightforward.

“No attendee’s personally identifiable data, including images, were of interest, and none were collected or captured,” said Meghan Risch, chief of staff and vice president of corporate communications at PCMA. She confirmed that the use of facial analysis technology at Convening Leaders 2023 was limited to aggregated dwell time and reactionary data. “What has been captured is purely to understand attendee behavior — which sessions and experiences resonated with participants — to help us deliver better and more impactful experiences for our global business events community,” said Risch.

Best Practice

Nevertheless, given the scale of events produced by the likes of IMEX and PCMA, it may not be reasonable to expect all attendees to understand how the technology works or what’s really at stake (if anything).

As such, it behooves event owners to develop some best practices around implementing any technology that captures, tracks, or analyzes someone’s likeness:

  1. Be very intentional in implementing technology to avoid collecting data that might put attendees at risk.
  2. Enlist your tech partners to help you determine safe and ethical practices.
  3. Carefully vet the service provider’s security credentials and review their privacy policy.
  4. Educate your attendees on the use of the technology, how it works, and whether there are any privacy implications. Err on the side of over-communication.
  5. When you’re dealing with information attendees may perceive as sensitive, build trust through transparency and choice.