Radio City Music Hall prevented a mother from attending their “Christmas Spectacular” show with her daughter’s Girl Scout troop by using facial recognition technology.
Attorney Kelly Conlon traveled to New York City with her young daughter’s Girl Scout troop on the weekend after Thanksgiving.
She and the troop’s other mothers took the girls to see the Rockettes perform in the “Christmas Spectacular” at Radio City Music Hall, but Conlon didn’t make it past the lobby.
Security guards flagged her as soon as she walked into the building and pulled her out of the crowd of theatergoers.
“It was pretty simultaneous, I think, to me, going through the metal detector, that I heard over an intercom or loudspeaker,” she told NBC New York.
“I heard them say ‘woman with long dark hair and a gray scarf.’”
Security guards asked her for her name and identification, but she said they already knew who she was.
“They knew my name before I told them. They knew the firm I was associated with before I told them,” Conlon detailed. “And they told me I was not allowed to be there.”
Radio City Music Hall uses facial recognition to “ensure safety” at its shows.
As it turns out, Conlon wasn’t flagged because she was a threat to the public, but because of her job.
She works as a lawyer for Davis, Saperstein and Salomon, a New Jersey law firm handling a personal injury case against a restaurant owned by Madison Square Garden Entertainment (MSG), the same parent company that owns Radio City Music Hall.
According to MSG, attorneys at law firms “pursuing active litigation” against any of its companies are banned from attending shows at any of its venues.
Its entertainment properties include the Beacon Theatre, the Chicago Theatre, the Hulu Theater, the Tao Group’s restaurants, and, of course, sporting events and big-ticket concerts at Madison Square Garden.
“While we understand this policy is disappointing to some, we cannot ignore the fact that litigation creates an inherently adverse environment,” a spokesperson said.
“All impacted attorneys were notified of the policy, including Davis, Saperstein and Salomon, which was notified twice,” they explained.
Sam Davis, a partner at Conlon’s firm, is legally sticking it to MSG by challenging their liquor license.
“The liquor license that MSG got requires them to admit members of the public, unless there are people who would be disruptive who constitute a security threat,” Davis remarked smugly.
“Taking a mother, separating a mother from her daughter and Girl Scouts she was watching over — and to do it under the pretext of protecting any disclosure of litigation information — is absolutely absurd,” he continued.
“The fact they’re using facial recognition to do this is frightening. It’s un-American to do this.”
Conlon said that she sent her daughter inside to see the show with the Girl Scouts and accompanying parents, while she waited around outside until it was over.
“It was embarrassing, it was mortifying,” she commented.
As mortified as Conlon is, the internet is equally alarmed at the private application of the technology.
“If you think, ‘I’m not worried about facial recognition technology. I haven’t done anything wrong,’ you might want to reconsider that statement,” one person wrote.
“Ban this s–t yesterday,” someone else said in reaction to Conlon’s story.
“This is exactly why it is NOT ENOUGH to just ban government and law enforcement use of facial recognition and biometric surveillance. There are so many ways private corporations and even individuals can abuse this tech. It should be banned for all commercial use & public use.”
“We abuse the h–l out of the word ‘dystopian’ on this app, but holy h–l this is some Bad Place s–t,” another pointed out.
“Not only can private companies use facial recognition to ID you in real time, they can link you to your employer.”
Nonprofit Surveillance Technology Oversight Project added, “All we want for Christmas is for NY to ban facial recognition.”
Facial recognition is likely about to get a whole lot more expansive in the near future.
The Transportation Security Administration (TSA) has been quietly testing facial recognition technology at a limited number of airports as a faster way to verify travelers’ identities.
The aim is to match a passenger’s photo identification to their actual face at security checkpoints: the passenger places their ID in a machine, which also captures a live image of their face.
“We’re assessing how the technology works, we’re assessing its accuracy, we’re assessing its impact on passengers,” TSA administrator David Pekoske told ABC 7 Chicago.
“The response has been universally very positive. More effective, speedier, more convenient for passengers are the things that I hear.”
The pilot program was rolled out in 2017 and is currently in 16 airports, including Miami, Las Vegas, Denver, and Phoenix.
According to Fox News, the TSA “hopes to expand it across the United States as soon as next year.”
Albert Fox Cahn of the Surveillance Technology Oversight Project is worried about the implications of the technology.
“Quite frankly, it’s not doing anything to help the public. There’s an urgent need for greater transparency,” he said.
“This technology is going to screw it up. And people are going to end up being detained by TSA, they’re going to be faced with even more surveillance and more invasions of their privacy, just because an algorithm gets it wrong,” Cahn cautioned.