
Beyond the Click Series – Are We Being Watched?
Ever wonder why you see ads for things you only mentioned in conversation? Or why Netflix knows what you’ll like before you do? That’s all thanks to digital surveillance – and it’s happening around us all the time, whether we realise it or not.

Digital surveillance is when companies and governments collect data on what you do online, like the sites you visit, things you buy, or even your location. Companies use this information to sell you products. Ever searched for sneakers and suddenly seen ads for shoes everywhere? That’s your data at work. But it’s not just about selling – governments use digital surveillance to monitor people for security reasons, too.
This sounds reasonable on the surface, especially for catching criminals or terrorists. But there’s more to it. The potential for groups or governments to exclude you or single you out because of your digital identity has some frightening implications. Not everyone is treated equally when it comes to digital surveillance.
The way data is collected and used can depend on your race, gender or age. That means digital surveillance can reinforce harmful stereotypes and make some people feel like they’re being watched more closely than others – simply because of who they are.

A good example of this is facial recognition software, one of the tools used for digital surveillance. It identifies people based on their facial features, but it doesn’t work the same for everyone. These systems are typically very accurate for white male faces but much less so for women and people of colour. This isn’t just an annoying glitch – it’s a serious problem. If facial recognition technology can’t accurately identify people from certain racial or gender groups, it can lead to those groups being over-surveilled or wrongly identified.

Imagine being flagged as suspicious just because the technology made a mistake. People of colour and women are often unfairly watched, monitored or treated differently because these systems can’t ‘see’ them properly. And that’s not just unfair – it’s potentially dangerous.

One researcher, Joy Buolamwini from MIT, dug deep into this issue. She tested software from the four largest facial recognition companies and found it was highly accurate for white men, with an error rate of less than one per cent. But for darker-skinned women, the error rate shot up to nearly 48 per cent. That’s akin to flipping a coin to decide if the software gets it right or not! This research made headlines and forced big tech companies to rethink how they were using this technology.
These errors aren’t just hypothetical. People have been wrongly arrested because facial recognition software misidentified them. This has serious implications for policing, international travel, and even getting a loan or a job. If a system can’t accurately identify you, it can impact your life in ways you might not expect.
Additionally, when certain groups are watched more closely than others, it sends a message that these people are more suspicious or less trustworthy, simply because of their identity. For marginalised communities, who already face discrimination in many areas of life, this kind of surveillance reinforces existing inequalities and can make them feel even more excluded or unsafe.

It’s not just about privacy – it’s about being denied the right to be yourself without fear of being judged or targeted. If facial recognition technology has biases against certain races or genders, it means some people are being watched more closely and treated less fairly. That’s not okay, and it’s something we should all be concerned about.
You might think, “I don’t do anything wrong, so why should I care if I’m being watched?” But that’s not the point. Digital surveillance isn’t just about catching bad behaviour. It’s about fairness and privacy.
Even things that seem harmless, like personalised ads or suggested videos, can influence how you think and act. And because this surveillance isn’t always transparent, you don’t know who’s watching, what they’re collecting, or how it’s being used.
We can all be more aware of what we share online and think twice before clicking ‘accept’ on every app or website. Remember, it’s not just about what you’re looking at. It’s also about who’s looking at you!
Stella Pennell is a PhD Sociologist, NZ karate champion, and a curious human being. Watch her episode on the ALTBAYS Table Talk here, and read more of her articles in Coromind here.
