Jan 11, 2026

The Counter-Arguments

Defenders of school surveillance systems offer several justifications for these practices. Each argument appears reasonable on its surface but collapses under scrutiny.

"It's for Safeguarding"

"Schools need comprehensive surveillance systems to protect children from abuse, neglect, and harm. Every photo, every behavioural observation logged in CPOMS, every data point collected serves the noble goal of keeping children safe."

True safeguarding doesn't require broadcasting children's images on social media. Posting photos of children to Twitter or Facebook serves institutional marketing, not child protection. If anything, it increases risk by making children's images, locations, and routines publicly accessible.

Behavioural surveillance platforms like ClassDojo aren't safeguarding tools - they're behaviour management systems that reward and punish children for compliance. Tracking whether a six-year-old "stays on task" or "demonstrates grit" has nothing to do with protecting them from harm.

Proportionality matters. Not every concern justifies a permanent record. A child who seems "unusually quiet" one day doesn't need that observation logged in CPOMS and retained until age 26. Recording every minor behavioural variation creates massive databases of trivial information that obscure genuine safeguarding concerns.

Other countries protect children without this level of surveillance. Norway's Data Protection Authority has called for a general ban on biometric recognition technology used for surveillance purposes, citing its major interference with the right to privacy. Finland and Norway safeguard children effectively without fingerprinting them for lunch payments or logging their behaviour in commercial tracking systems. If these countries can protect children without comprehensive surveillance, England's approach isn't necessary - it's excessive.

The safeguarding argument is often used to justify surveillance that has nothing to do with safety. Collecting nationality data, sharing educational records with researchers, or allowing commercial companies access to behavioural profiles serves policy goals or business models, not child protection.

"Parents Consent"

"Schools obtain parental consent for surveillance practices. If parents sign the forms agreeing to photographs, biometric data collection, or use of educational platforms, they've given permission. The surveillance is therefore legitimate."

For consent to be valid under GDPR, it must be informed, specific, freely given, and unambiguous. School "consent" mechanisms fail all four requirements.

Consent under pressure isn't free consent. When refusing means your child is excluded from class photos, left out of school trips, or stigmatised as the "difficult family," consent is coerced. The power imbalance between schools and parents makes genuinely voluntary consent impossible in educational settings.

Parents can't consent on their child's behalf for their entire childhood. A four-year-old whose parents signed ClassDojo consent in reception didn't agree to have their behaviour tracked for seven years. A teenager whose parents consented to Google Workspace in Year 7 didn't consent to have five years of school work and communications stored on corporate servers. Children's own views and developing autonomy must be respected as they mature.

Children themselves never consent. They're the ones being surveilled, yet nobody asks their permission. Article 12 of the UN Convention on the Rights of the Child guarantees children's right to express views on matters affecting them. School surveillance decisions are made entirely by adults, without meaningful input from the children whose privacy is violated.

The consent model assumes parents understand what they're consenting to, have genuine choice, and can protect their children's interests. All three assumptions are false.

"It's convenient/efficient"

"Biometric lunch payment systems are faster than cash or PINs. Google Workspace streamlines classroom administration. ClassDojo makes behaviour tracking efficient. CPOMS enables quick information sharing between staff. These technologies save time and improve school operations."

Convenience doesn't override fundamental rights. Privacy is a human right enshrined in Article 8 of the European Convention on Human Rights. Rights don't disappear because violating them is administratively convenient. Speed and efficiency are institutional benefits that come at the cost of children's privacy and autonomy.

Other solutions exist that don't require surveillance. PINs work perfectly well for lunch payments - they're memorable, secure, and don't require storing biometric templates. Many schools worldwide function successfully using cards, cash, or simple account numbers. Fingerprint scanning wasn't adopted because alternatives don't work; it was adopted because it's marginally more convenient.

Public/private key cryptography offers an even better solution. This technology - which secures Bitcoin transactions and decentralised social networks like Nostr - allows individuals to prove their identity without revealing personal information or requiring centralised databases. A student could authenticate themselves for lunch payments, library access, or attendance tracking using cryptographic keys stored on a simple card or device, with no biometric data collection and no central authority storing sensitive information. The technology exists. Schools choose biometric data surveillance over privacy-preserving alternatives because it's easier to administer, not because it's necessary for security. The choice reveals priorities: institutional convenience matters more than children's bodily autonomy.
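The mechanism can be sketched with a toy Schnorr identification round. Everything here is illustrative: the tiny parameters, function names, and "lunch till" framing are assumptions for the sketch, not a real deployment (a real system would use a standard scheme such as Ed25519 on full-sized keys). The point it demonstrates is the one made above: the school registers only a public key, and the card proves identity without any secret or biometric ever leaving it.

```python
# Toy Schnorr identification: a student's card proves knowledge of a private
# key; the school stores only the matching public key -- no biometrics, no
# central store of secrets. Parameters are deliberately tiny and NOT secure.
import secrets

P = 2039   # small prime, P = 2*Q + 1
Q = 1019   # prime order of the subgroup generated by G
G = 4      # generator of the order-Q subgroup of Z_P*

def keygen():
    """Enrolment: the card generates a keypair; only y is registered."""
    x = secrets.randbelow(Q - 1) + 1   # private key, never leaves the card
    y = pow(G, x, P)                   # public key, held by the school
    return x, y

def prove_commit():
    """Step 1 (card -> till): commit to a fresh random nonce."""
    k = secrets.randbelow(Q - 1) + 1
    return k, pow(G, k, P)

def prove_respond(k, x, c):
    """Step 3 (card -> till): answer the till's challenge c."""
    return (k + c * x) % Q

def verify(y, t, c, s):
    """Till checks g^s == t * y^c (mod p) -- accepts without seeing x."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P

# One authentication round at the lunch till:
x, y = keygen()
k, t = prove_commit()
c = secrets.randbelow(Q)          # step 2 (till -> card): random challenge
s = prove_respond(k, x, c)
assert verify(y, t, c, s)
```

Nothing the till learns (t, c, s) reveals the private key, and a breach of the school's database leaks only public keys, which is precisely the property biometric templates lack.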

Efficiency for schools comes at the cost of children's privacy. A teacher might save two minutes per day using ClassDojo instead of traditional behaviour management, but children pay with permanent behaviour records created by subjective adult judgements. This convenience calculation ignores who bears the cost.

Victorian schools educated children successfully without biometric scanning, behavioural tracking apps, or comprehensive digital surveillance. For most of human history, education happened without these technologies. Their absence didn't prevent learning - children thrived. The claim that modern surveillance technologies are necessary for education is historically absurd.

When schools claim they "need" Google Workspace or biometric systems, what they mean is "We've built our operations around these systems and changing would be inconvenient." That's an argument for institutional inertia, not necessity.

"Nothing to hide, nothing to fear"

"If you haven't done anything wrong, you shouldn't worry about surveillance. Only people engaged in wrongdoing need privacy. Law-abiding families have nothing to hide and therefore nothing to fear from school surveillance systems."

Privacy scholar Daniel J. Solove has comprehensively dismantled this argument, which he describes as one of the most persistent "misunderstandings of privacy." The "nothing to hide" argument misunderstands what privacy is and why it matters.

Privacy is a fundamental right, not a cover for wrongdoing. Article 16 of the UN Convention on the Rights of the Child establishes children's right to privacy regardless of whether they've done anything wrong. Article 8 of the European Convention on Human Rights guarantees everyone's right to a private life. Rights exist whether or not the right-holder has "something to hide."

As Edward Snowden observed: "Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." The right to privacy isn't conditional on having secrets - it's inherent in human dignity.

Information can be misused even if innocent. A child photographed playing innocently at school has done nothing wrong, yet if that image appears in search results, gets scraped for facial recognition training, or is seen by an estranged parent with a restraining order, harm occurs despite the child's innocence.

Context collapse causes problems. Information shared appropriately in one context (school) creates issues when it appears in another (social media, employment searches, romantic relationships). A CPOMS note about a difficult period in Year 3, legitimate at the time, becomes problematic when it follows a child through secondary school and potentially into adult life.

Children deserve privacy to develop, make mistakes, and experiment with identity. Childhood is a time for exploration, risk-taking, and learning from failure. Comprehensive surveillance chills this development. Children aware that everything is watched, recorded, and judged become cautious, performative, and afraid to be authentically themselves.

The "nothing to hide" argument also contains an insidious implication: if you do object to surveillance, you must have something to hide. This creates guilt by suspicion and discourages legitimate privacy advocacy.

"Technology is inevitable"

"Digital technology in education is the future. We can't turn back the clock. Children need to learn in technologically advanced environments to prepare them for modern life. Resisting educational technology is Luddite nostalgia that will leave children unprepared."

How technology is used is a choice, not an inevitability. Nobody argues against using computers in education. The question is whether we use technology in ways that respect children's rights or ways that violate them. Technology is a tool - we choose how to deploy it.

Other countries use technology without such surveillance. Finnish schools use digital tools extensively while maintaining strong privacy protections and explicitly prohibiting biometric data collection in schools. It's possible to have technologically advanced education without fingerprinting children or logging their every behaviour.

"Progress" doesn't mean surrendering all privacy. The same fallacy justified every surveillance expansion in history: "You can't stop progress!" But progress is about improving human flourishing, not adopting every available technology regardless of consequences. We can have digital education without digital surveillance - they're not synonymous.

The real inevitability isn't technology - it's function creep. Today's "voluntary" system becomes tomorrow's requirement. Today's safeguarding database becomes tomorrow's comprehensive social control infrastructure. Today's educational data becomes tomorrow's input for predictive systems.

Consider the National Pupil Database in this light. It was established to track educational outcomes. Now it contains 21 million individual profiles covering every aspect of children's lives - ethnicity, language, behaviour, SEN status, poverty indicators, safeguarding flags - data that will never be deleted.

Think 20 years ahead: the state will hold comprehensive profiles on all its citizens, created without consent and retained permanently. This has nothing to do with the stated educational objectives. It's pre-crime infrastructure - databases that could eventually be paired with Digital ID systems to enable unprecedented surveillance and social control.

The UK government announced mandatory digital ID schemes in 2025, with plans for employment verification and potential expansions to other services. Digital rights organisations including Big Brother Watch, Liberty, Privacy International, and the Electronic Frontier Foundation have warned that digital ID systems create "mass surveillance infrastructure" with risks including profiling, tracking, function creep, and disproportionate impact on marginalised communities.

When lifetime educational profiles in the NPD connect with mandatory digital ID systems, the government gains comprehensive surveillance capability over citizens from age 4 until death. Every school interaction, behavioural concern, academic struggle, and safeguarding flag becomes linkable to employment records, housing status, benefit claims, and all other government interactions.

This isn't speculation about dystopian futures - it's the logical extension of existing systems. The infrastructure is being built now. The NPD database exists. Digital ID is coming. The only question is whether society permits these systems to connect.

Treating technology as inevitable serves those building and profiting from surveillance systems. It discourages democratic deliberation about what kind of technological society we want to create. But technology is never inevitable - it's always chosen. We can choose differently.

[1] Biometric Update, "Norway's data privacy watchdog seeks ban on remote biometric identification," October 3, 2025, www.biometricupdate.com/202510/norways-data-privacy-watchdog-seeks-ban-on-remote-biometric-identification

[2] Better Internet for Kids, "Finland - Policy monitor country profile," 2025, better-internet-for-kids.europa.eu/en/knowledge-hub/finland-policy-monitor-country-profile

[3] GDPR Article 4(11) definition of consent: gdpr-info.eu/art-4-gdpr

[4] GDPR Recital 43: gdpr-info.eu/recitals/no-43

[5] UN Convention on the Rights of the Child, Article 12: www.ohchr.org/en/instruments-mechanisms/instruments/convention-rights-child

[6] European Convention on Human Rights, Article 8: www.echr.coe.int/documents/d/echr/convention_ENG

[7] Daniel J. Solove, "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy," San Diego Law Review, Vol. 44, p. 745 (2007), papers.ssrn.com/sol3/papers.cfm?abstract_id=998565

[8] UN Convention on the Rights of the Child, Article 16: www.ohchr.org/en/instruments-mechanisms/instruments/convention-rights-child

[9] European Convention on Human Rights, Article 8: www.echr.coe.int/documents/d/echr/convention_ENG

[10] Wikipedia, "Nothing to hide argument," en.wikipedia.org/wiki/Nothing_to_hide_argument

[11] Defend Digital Me, "An update on National Pupil Data," July 1, 2021, defenddigitalme.org/2021/07/01/an-update-on-national-pupil-data

[12] House of Commons Library, "Digital ID in the UK," January 1, 2026, commonslibrary.parliament.uk/research-briefings/cbp-10369

[13] Electronic Frontier Foundation, "The UK Has It Wrong on Digital ID. Here's Why," November 28, 2025, www.eff.org/deeplinks/2025/11/uk-has-it-wrong-digital-id-heres-why

[14] Liberty, "Liberty's Position on Digital ID," September 26, 2025, www.libertyhumanrights.org.uk/issue/digital-id-liberty-position

[15] Statewatch, "UK: Joint briefing on the 'do not introduce digital ID cards' parliamentary petition debate," December 2025, www.statewatch.org/news/2025/december/uk-joint-briefing-on-the-do-not-introduce-digital-id-cards-parliamentary-petition-debate