Is The Surveillance State Just Lazy?
The dubious ethics of facial recognition technology explained
The year is 2054 and an elite police unit has teamed up with trillionaire Mr Big to utilise his psychic technology. They plan to eliminate crime from the streets of New London, but the only problem is paperwork, tons of it.
The Met are forced to sit back, not intervene and issue future victims with crime numbers. At least they can process an insurance claim ahead of time.
Flash back to 2024, away from the fictional themes of Philip K. Dick, and we find ourselves in a world increasingly embracing surveillance technology.
The imaging machines are meant to keep us safer, but, even with CCTV everywhere, the UK’s capital has still become a haven for low-level crime.
These misdemeanours, from shop-lifting to phone-snatching (link), are making people’s lives more miserable, disrupting small business activity and undermining trust in the police.
How bad is it? An anti-crime campaigner had his bike stolen from outside New Scotland Yard last year, only for the Met to do nothing about it (link).
And even if over-stretched officers are able to collect evidence, record a witness statement and send it up to the prosecutors, the perpetrator’s day in court could be years away thanks to excessive backlogs (link). Justice delayed is often justice denied.
It is in this context that the police have rolled out live facial recognition (LFR) cameras across London. The mobile systems, typically mounted on a van, are placed in high-traffic areas of neighbourhoods. The idea is to catch people who are “wanted by the police or the courts”.
The process is justified with a dubious appeal to the ethical concept of “informed consent”. This phraseology is much more common in the medical sector and was first popularised following World War II and the establishment of The Nuremberg Code (more here).
You have probably filled out a form or been given some useful information before undergoing a check-up or procedure, for example. But law enforcement is using ‘informed consent’ to mean something different.
“There is no legal requirement for you to pass through the LFR system. If you do, then unless an alert is triggered, the system will immediately and automatically delete your biometric data,” read a bold orange sign in my neighbourhood recently.
But what if you were blind, riding a bike or distracted? You wouldn’t have spotted the notice and your face would have been scanned regardless.
This scenario raises further questions, especially for anyone who has dealt with computers before. The Met claims the system “immediately and automatically” deletes the data. What, even the metadata?
And what if the system is wrong? Privacy campaigners Big Brother Watch caught the Met stopping and detaining an anti-knife campaigner last year (link). Here’s how that Kafkaesque episode played out:
“38 year old Londoner Shaun Thompson was returning home from a volunteering shift with Street Fathers, a community organisation that provides a positive male presence to young people and tackles knife crime, when he was wrongly flagged as a person on the Metropolitan Police’s facial recognition database outside of London Bridge station.
“He was held by officers for almost 30 minutes, who repeatedly demanded scans of his fingerprints and threatened him with arrest, despite him showing multiple identity documents further evidencing that he was not the individual on the facial recognition database.”
Away from the UK, LFR-style systems have wrecked lives (link) in the US, prompting some states and cities to ban the technology. San Francisco led the way in May 2019 when it outlawed the systems.
Privacy advocates have long fought against these types of invasive technologies, but perhaps the best argument comes from a more conservative perspective.
That’s because these systems increasingly move law enforcement away from the idea of ‘community policing’, where more officers are present in neighbourhoods, build relationships with the public and can ultimately spot trouble and potentially prevent crime.
The late Ross Anderson made a similar argument (link) when spy agencies wanted to hack into everyone’s WhatsApp messages because it would apparently stop child exploitation. He concluded:
“We do indeed need to spend more money on reducing violent crime, but it should be spent locally on hiring more police officers and social workers to deal with family violence directly. We also need welfare reform to reduce the number of families living in poverty.”
Have no doubts: Tech might be able to help law enforcement, but it certainly can’t replace it.
DeepSeek Questions
A Chinese upstart has rocked the world of generative AI. DeepSeek’s creators have claimed that they used just 2,048 Nvidia H800s and $5.6 million to train a model with 671bn parameters.
Why is that important? Because it’s a fraction of what OpenAI and Google spent to train comparably sized LLMs. OpenAI’s Sam Altman has promised to create “better models” in reaction to the news (link).
If DeepSeek's claims are proven to be true, here are some implications and questions I will be thinking about:
- Biden's export controls on chips have been undermined
- LLM builders will need fewer GPUs than previously thought
- The cost of running and training AI models could be dramatically slashed
- The focus shifts back to start-ups and scale-ups, rather than mega-cap companies
- Where does VC/private equity money go next?
- How does Wall St react to all of this in the medium and long term?
- If you're an international AI researcher (non-US), where should you go next?
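As a back-of-the-envelope sanity check on DeepSeek's headline figures ($5.6 million, 2,048 H800s, both from its announcement), the claimed budget implies a plausible training run. Note the $2-per-GPU-hour rental rate below is my assumption for illustration, not a figure from the source:

```python
# Sanity-check of DeepSeek's claimed training budget, using the
# figures from its announcement ($5.6M, 2,048 Nvidia H800s) plus
# one assumption: a hypothetical rental rate of $2 per GPU-hour.

TOTAL_COST_USD = 5.6e6       # claimed training cost
NUM_GPUS = 2048              # claimed H800 count
RATE_USD_PER_GPU_HOUR = 2.0  # assumed rental rate (not from the source)

gpu_hours = TOTAL_COST_USD / RATE_USD_PER_GPU_HOUR  # ~2.8M GPU-hours
wall_clock_days = gpu_hours / NUM_GPUS / 24         # ~57 days

print(f"{gpu_hours:,.0f} GPU-hours, roughly {wall_clock_days:.0f} days on {NUM_GPUS} GPUs")
```

Under that assumed rate, the budget buys roughly two months on the stated cluster, which is at least internally consistent; the real test will be whether independent replications land in the same ballpark.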
An Explosion of CapEx
Not to be outdone by the $500 billion Stargate AI project, Zuck is promising to spend up to $65 billion on CapEx in 2025, ending the year with 1.3 million GPUs (link). “This is a massive effort, and over the coming years it will drive our core products and business, unlock historic innovation, and extend American technology leadership. Let's go build!”
A16Z Leaves London
After investing much time, money and resources into the UK, Andreessen Horowitz has decided to leave London (link). Have no doubt, this is a blow for the tech community, other VCs and crypto in the UK in particular.
A16Z wanted to help the UK become a global crypto hub. It's disappointing to hear one UK official say "they were never really here" when, in reality, A16Z was very visible and highly engaged.
Perhaps the UK government should reflect on where it stands on crypto. Marc Andreessen, one of the firm's founders, has been heavily involved with the Trump team, and A16Z will now put its full focus on the US as his administration plans to promote crypto.
A new SEC head also means crypto red tape will be cut back in the US, which has already made big strides in the space, including the launch of ETFs last year. Further market liberalisation is expected; Coinbase, usually a cautious platform, listing Trump's meme coin is just one example of how sentiment and actions have shifted.
NYT Turns to Bundles
The New York Times currently has more than 11 million total subscribers, but the company has set itself a goal of reaching 15 million sign-ups by 2027.
Since there isn’t another ‘Trump bump’ in sight, the outlet has signalled that it is open to partnering with other publishers to offer up subscription bundles to consumers (link). If the move is a success, it could grow the news media market.
The Political Press Box
Find my latest long-form audio interviews of political communicators here. Please like, subscribe and listen.
📧 Contact
For high-praise, tips or gripes, please contact the editor at iansilvera@gmail.com or via BlueSky (link). Follow on LinkedIn here.
213 can be found here
212 can be found here
211 can be found here
210 can be found here
209 can be found here