Massive Attack Turns Concert Into Facial Recognition Surveillance Experiment
Posted 4 months ago · Active 3 months ago
gadgetreview.com · story · High profile
Key topics
- Surveillance
- Facial Recognition
- Art and Technology
Massive Attack uses facial recognition technology at a concert, sparking discussion about surveillance, art, and data privacy.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 25m after posting
- Peak period: 117 comments in 0-12h
- Avg / period: 36.5 comments
- Comment distribution: 146 data points
Based on 146 loaded comments
Key moments
1. Story posted: Sep 15, 2025 at 5:51 PM EDT (4 months ago)
2. First comment: Sep 15, 2025 at 6:16 PM EDT (25m after posting)
3. Peak activity: 117 comments in 0-12h, the hottest window of the conversation
4. Latest activity: Sep 20, 2025 at 8:50 AM EDT (3 months ago)
ID: 45255400 · Type: story · Last synced: 11/20/2025, 6:56:52 PM
Nothing personal, but you do seem to have a nice education. US?
The headline is perfectly parseable, unlike most of the headlines on HN or the BBC. The fact that it says "Band concert" should be self-explanatory.
https://youtu.be/6IDT3MpSCKI
In fact, I recall many songs from The Matrix being played nonstop back in my teenage gamer IRC days. Maybe even by others than just me
Massive Attack's Teardrop was used in the original US airing (although as a Brit I've somehow heard all three on TV re-runs and Amazon Prime).
God damn those are 12 great songs!
I also hadn’t really clued in to just how political they were until seeing their visuals, which I also thought added a lot. Surely not everyone’s cup of tea though.
But yes. They do need new material dammit.
I attended one in Paris this year. Not only was it exactly the same show as a few years ago, including the decorations and the scrolling text wall, but the sound was horrible. I couldn't hear anything.
I would be better off if I just stayed home and listened to the recording.
Bear in mind Beth made "Out of Season" apart from Portishead several years before the release of "Third." I wouldn't think her recent solo work indicates a split.
The best concerts are breakthrough concerts for new bands (first or second album), and then the greatest-hits type concerts that are 10 years after the last album. Every concert that is a tour with album 3-4-5 is usually pretty meh.
"AI assists in refining our editorial process, ensuring that every article is engaging, clear and succinct."
One thing I hope we'll see in the future on these types of articles is the ability to view the original prompt. If your goal is to be succinct, you can't get much more succinct than that.
https://marketoonist.com/2023/03/ai-written-ai-read.html
How could you possibly tell? I've been playing around with AI detectors, putting in known all-human samples, known all-AI samples, and mixed samples.
The only thing it's gotten right is not marking a human sample as 100% AI (but it marked one of the AI samples as 100% human).
Having such a mark would be a witch-hunt for sure.
What is sadly rather ironic is that the author's first name, "Al", looks like "AI" when stylised in the article's font.
Writing for the last 14 years and for GadgetReview since 2017; Managing Editor since 2018.
a smart person can make ChatGPT sound completely authentic, and a very boring, middle-of-the-road writer who uses em-dashes can make themselves sound completely inauthentic. it's not like LLMs got their style from nowhere
as far as I'm concerned, as long as the factual information has been curated by a human, I don't give a shit
Would it matter if the same prompt gives different output? You couldn't verify it.
How likely are you to just trust, let alone know for sure, whether or not the text I showed you is actually what I fed to the llm?
I think this assumes a very limited scope of how AI gets used for these. As if the article is a one and done output from a single prompt. I can imagine many iterative prompts combined with some copying and pasting to get an hour’s worth of copy in five minutes.
I think more drama has been created around this than is necessary. Based on the video, the visitors' faces projected in real time were not analyzed. They were simply shown with a random descriptive flag attached, such as "energetic," "compassionate," "inspiring," "fitness influencer," or "cloud watcher." It seems to be an artistic provocation showing what real analysis of people could look like.
IDK about shouldn't. Public photography not being a crime comes from a time when one could still generally expect to remain anonymous despite being photographed, just like how you can be seen by strangers in the street while walking and still remain anonymous. Yet stalking is a crime, and facial recognition seems to be the digital equivalent: something that can be done at any point by someone with your picture in their hand.
It would also completely kill any form of street photography. Even if you don't appreciate the art, it would kill documenting times and places for posterity, and for what benefit exactly?
I actually don't find it hard to sacrifice the recreational photography of strangers, but I do have a hard time balancing it with the need to photograph crime and government entities overstepping their authorities.
I don't have a good answer for it all.
We would not only lose an art form but also the recording of the past; a candid photo taken today has a lot more value in 50-100 years. It would be rather absurd to lose this, and it wouldn't even guarantee anything: bad actors would continue to do it covertly.
I find it pretty hard to sacrifice it; it's a freedom. Making society at large less free in order to fight tyranny doesn't seem the way to solve anything, e.g. the EU Chat Control bullshit.
I don't have a good answer either, but I lean toward the camp of seeking solutions that are smarter than a sledgehammer.
In this case, I think it would be interesting to think about the most concerning area: linking a person in a photo to their real-world identity. It seems like there could be restrictions on how face-recognition databases are built and accessed, possibly incorporating intent to harass or intimidate as an aggravating factor, and possibly linking across time and place. If I take a picture of some guys playing basketball or chess as I walk around town, I don’t need to identify them in my art exhibit entry and I certainly don’t need to link one of them to a different time and place without their permission.
The laws regarding this almost always make a distinction between intentional surveillance and by-chance background noise. Taking a picture of the street with people in it doesn't matter; recording the street 24/7 probably does; and purposefully singling someone out and photographing them definitely matters.
We already kind of have this. Think about it - stalking is illegal, but you've walked behind people right? You've glanced into someone's window before, right? You've taken a picture of a random person before, right?
So why aren't you in jail? Because laws aren't algorithms
This feels kind of like the way you could avoid having extensive traffic laws & control systems in 1905 when only a few people had cars.
But now I can point a camera at a crowd and it will:
All this with consumer gear I can carry with me, no government-level spy gadgets needed. All live at 2-20fps depending on how much hardware I throw at it. With some extra work I can then find each of them on social media, grab their real names and other information from public sources, and now I have a surveillance database. (Illegal where I live, but who's gonna check?)
This makes "public photography" a whole different thing from what it used to be.
(David Brin's been beating this drum for about three decades now - I doubt I could say anything he hasn't already said. https://www.davidbrin.com/transparentsociety.html)
That is a strange dichotomy, "government vs everyone". You miss the much more important large private organizations.
Government can at least be held accountable, if voters are willing. What the private orgs do, you don't even have a chance to know about without a (tragically doomed) whistleblower. Even "evil" government actions heavily use those unaccountable private entities for much of the dirty work.
Also "everyone" is useless. What use is any of it to individuals? Weapons or information. The fight is among deep complex organizations. Individuals - unless part of some network - may as well not exist. The individual with a firearm as a protection against government comes to mind, even in groups they'll be blown away anytime the organized large groups even sneeze towards them.
Another example is who uses the law: Any large company or even the government is much much MUCH more effective, no matter how much an individual has law on their side, at least when the large organization is willing to drag out the fight until the individual or small group runs out of resources.
If you want to achieve something, ORGANIZE! Otherwise you just throw yourself into the grinder, at best even providing reasons and justification to the other side.
Totally agreed. Even if that network is as simple as posting something to social media and watching it go viral, it's still a network.
I thought about breaking commercial interests out separately in my post, but didn't want to overcomplicate. An example would be the V888 form in the UK, which allows you to request the details of the registered keeper of a vehicle, as long as you can show "reasonable cause". The reasonable causes are, of course, mostly commercial.
You're tapping and paying and the system stores your purchase under "male, 35-45, hispanic, anxious"...
Creepy as all hell.
See https://www.photrio.com/forum/threads/law-regarding-photogra...
This means you cannot take a photo/video of a person in public without their consent if they are the focus of your image. They also have the right to revoke consent at any time in the future.
The only exception is large gatherings, for example the Street Parade, where there can be no expectation of privacy, especially since the event is televised.
This is also why you cannot put cameras on your home that film public streets etc. They need to be blocked off or facing the other way.
In more sensible countries the law says that it's legal to film, but it's not legal to publish videos and photos of people without their consent.
Are you looking at this from a US perspective where illegally obtained evidence is not admissible in court (fruit of the poisoned tree)? At least in Norway this is not the case, nor is it absolutely forbidden in the UK.
See, for instance, https://www.lawgazette.co.uk/commentary-and-opinion/fruit-fr...
https://www.fedlex.admin.ch/eli/cc/24/233_245_233/en#art_28
Dash-cam footage is a gray area since the video is generally deleted automatically and not publicized. If the crime is severe enough the footage is permitted in court.
Criminals do not just get away with it. There are a lot of public cameras run by, for example, the SBB (national train company). These cameras have strict rules as to how long the footage is stored and who has access. The footage will not be posted publicly except in very rare cases where the severity of the crime outweighs the privacy of the criminal.
How many innocent people have faced the wrath of the public because of false identification in the US when some grand event occurs? Does anyone remember Richard Jewell[1]?
[1] https://en.wikipedia.org/wiki/Richard_Jewell
IIRC some countries recently started experimenting with automagically granting copyright to people for their own likeness. I think it was aimed at AI-generated fakes, but it's probably more widely applicable.
Anyway, don't be a dick, don't take pictures of people without their consent.
Or
"Data shows you hang out in low income areas, we don't think that aligns with our companies goals."
So the "face your principals" is completely fucking arbitrary. That's the fear.
Where I live, a concert is not considered "public", unless it's a government-run event on government property.
Otherwise, a concert is a private event, in which case you have no right to privacy. Just like going into a store.
In that case they should have used descriptions like "gay", "Muslim", "poor", "bipolar", "twice divorced", "low quality hire", "easy to scam", "both parents dead", "rude to staff", "convicted felon", "not sexually active", "takes Metformin", "spends > $60 on alcohol a month", "dishonest", etc.
None of the people who actually take advantage of you or manipulate you using surveillance capitalism care if you're a "cloud watcher" or "inspiring".
That would certainly better demonstrate the scary dimension of mass video surveillance and face recognition. However, not many people would buy tickets for the next Massive Attack show after being lectured like this.
Their images were not being sold, nor were they being used to promote the concert. Plus, nearly everyone who goes to a concert these days agrees that their image will be captured and possibly used in future promotional material.
It’s incredibly common for tickets to big gigs to have fine print along the lines of “by attending you consent to being recorded”. This has been the case for decades. If you’ve ever watched an official recording of a live performance, you’ve seen this in action.
This is just a novel presentation of what is already commonplace recording. And it’s great and it makes a point, but the article is bad.
One season Saturday Night Live did this with its studio audience as a recurring gag.
The one that stuck with me was the couple labeled "Pregnant two hours."
I don't see evidence of facial recognition.
Face recognition means computing which individual from some other database of people a particular face belongs to.
There’s also face tracking — detecting a face in an image and then tracking the same face across subsequent images. Which is often implemented by using a face recognition approach, but without any predefined catalog of people — you just dynamically fill up your face database as faces appear in the image sequence / video source.
It detects objects and gives you the bounding box. Then you draw a square on it and add a label.
No fancy LLM needed, just old-fashioned machine learning models.
Recognition implies associating the faces with an ID.
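The detect-track-label pipeline described above can be sketched without any identity database: match each new bounding box to the previous frame's boxes by overlap (IoU) and keep a persistent random adjective per track. The following is a minimal pure-Python sketch under stated assumptions; the adjective list, the `(x, y, w, h)` box format, and the IoU threshold are illustrative choices, not anything the actual show is known to use:

```python
import random

# Illustrative labels, echoing the ones reportedly shown at the concert.
ADJECTIVES = ["energetic", "compassionate", "inspiring", "cloud watcher"]

def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    ix = max(0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

class FaceTracker:
    """Tracks faces frame-to-frame by box overlap; no identity lookup anywhere."""

    def __init__(self, threshold=0.3, rng=None):
        self.threshold = threshold
        self.rng = rng or random.Random()
        self.tracks = []  # list of (box, label) from the previous frame

    def update(self, boxes):
        """Match detections to existing tracks; unmatched faces get a fresh random label."""
        new_tracks = []
        for box in boxes:
            best = max(self.tracks, key=lambda t: iou(t[0], box), default=None)
            if best and iou(best[0], box) >= self.threshold:
                label = best[1]            # same face as last frame: keep its label
                self.tracks.remove(best)   # each track matches at most one detection
            else:
                label = self.rng.choice(ADJECTIVES)  # face not seen last frame
            new_tracks.append((box, label))
        self.tracks = new_tracks
        return new_tracks
```

Boxes from any off-the-shelf face detector could be fed into `update()` each frame; a face keeps its label for as long as it stays in view, which is all the on-stage effect would need, and nothing is ever matched against a catalog of known people.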
I do, however, also appreciate how strict the community seems to be about recording without consent. Some people go to burns to be able to completely disconnect from their usual lives without fear that there will be any reprisal for legal/maybe-illegal-but-harmless activities they might do there, and the potential of being recorded can put a serious damper on that feeling of freedom.
[0] https://www.portals.org/
Also consider resourcing, the manpower, money, tools, electricity devoted to surveillance back then compared to today
How about today? Where could you venture in secret without being tracked? How could you hold a private conversation? Your face and license plates are constantly tracked, along with your personal phone, laptop, watch, fitness tracker, Tire Pressure Monitoring System, etc.
If you had to assign a logarithmic authoritarian intensity scale to those regimes, and to today's regimes, how would you rank them? Consider the spying capacity, resources, recording capacity, analytic capacity.
I would put today's regimes many orders of magnitude more severe.
what do you think?
Also, the martial forces (police, military, security) are more directly managed, and more broadly deployed. You can no longer reason with an individual because their decisions have to be run up the chain. Individuals no longer have the authority to provide exceptions or help.
'Airlines Sell 5B Ticket Records to Government for Warrantless Searching' https://news.ycombinator.com/item?id=45250703
My head hurts.
[1] https://news.met.police.uk/news/arrest-landmark-for-met-offi...
[2] https://www.bbc.co.uk/news/articles/c62lq580696o
[3] https://www.independent.co.uk/tv/news/met-police-facial-reco...
Authoritarian regimes come up with bogus charges to include political opponents in the "bad guys", painting them as criminals to the rest of the country, and legitimizing their arrests.
These surveillance technologies have two main problems: if you have more data, it's easier to dig dirt on people. And if you don't have data, you can always fake it.
The requirement for it to work though is that you need regular people to believe that political opponents are in fact criminals.
The scary shit is that the US is not too far from being there.
In the past decade we had some examples where some countries had really big anti-government movements and protests... and they simply couldn't achieve anything substantial. Iran comes to mind, but I think we also had some in Eastern Europe.
And then there are countries like Russia and North Korea (and likely many more) where it looks like (at least from the outside) that mass protests are pretty unlikely, because any kind of political opposition is suppressed before it reaches this level.
We can imagine something like 1984, where only the party (middle) class was monitored for its thinking, while the proletariat (lower) class was free to think whatever it liked: they knew the system was bad; they weren't required to pretend.
I guess my point is, a totalitarian system doesn't need widespread surveillance. But it does need a believable ideology, one that enough people from the lower and middle classes believe to keep the communication barrier between these groups sufficiently closed.
Is neoliberalism such an ideology? Is it something that can offer enough positives to sustain/counterpoint the negatives of widespread surveillance? I doubt it.
We can look at recent examples where the UK and US tried to control the narrative and failed: Palestine Action, ICE arrests, troop deployments in cities, the TikTok ban. Despite surveillance, people are not buying the ideology.
And it isn't identifying the people or anything. It's putting some meaningless adjective like "Resourceful" below them.
I've seen this headline a few times and thought it was actually novel and demonstrative of some face database or something, but instead it's just a surveillance gimmick: a bunch of generative AI face loops with bounding boxes and adjectives.
And that's part of the performance. You don't get to choose what companies do with your personal data.
Have they done this again with an updated system?
It’s hard to explain the concept of surveillance and its effects to laypeople. And the corporations absolutely know that.
6 more comments available on Hacker News