Politics

Unveiling the Hidden Power and Politics Shaping Data

Exploring the politics of data – Queen Mary University of London

In an age where algorithms influence elections, social media shapes public opinion, and governments race to harness artificial intelligence, data is no longer a neutral by-product of digital life – it is a source of power. At Queen Mary University of London, researchers are probing this new terrain, asking who controls data, how it is used, and what it means for democracy, justice and everyday citizenship.

“Exploring the politics of data” is not just an academic exercise; it is an attempt to map the shifting balance between states, corporations and individuals in a world built on constant surveillance and relentless data flows. From predictive policing to welfare algorithms, from health apps to border technologies, the work at Queen Mary exposes how data can deepen inequality as easily as it can drive innovation. This article examines that research and the questions it raises about transparency, accountability and the future of digital society.

Unpacking the power structures behind big data at Queen Mary University of London

At this East London campus, data is never treated as neutral or purely technical. Researchers, students and visiting practitioners probe who designs the algorithms, who funds the infrastructures, and who is left out of the datasets that shape public life. Seminar rooms double as critical observatories, where students dissect case studies on predictive policing tools in UK cities, NHS data-sharing deals, and platform governance in higher education. In these discussions, data is reframed as a political terrain, structured by competing interests, regulatory gaps and global inequalities. Through this lens, lecture slides become maps of hidden influence, tracing the route from raw data collection to corporate dashboards and ministerial briefings.

Workshops and reading groups frequently break down key actors in the data ecosystem, asking participants to track accountability across institutions and borders. Students are encouraged to map their findings using simple matrices and visual aids, turning abstract power relations into something concrete and contestable. Core questions guide this critical practice:

  • Who owns the tools and platforms?
  • Whose labor makes data systems function?
  • Which communities carry the greatest risks?
  • What forms of resistance and regulation are emerging?
  Actor         | Power Levers                   | Campus Focus
  Tech Firms    | Infrastructure, IP, lobbying   | Platform governance, AI ethics
  Governments   | Law, procurement, surveillance | Data protection, digital rights
  Universities  | Research agendas, partnerships | Ethical review, funding scrutiny
  Civil Society | Advocacy, watchdog roles       | Campaigns, community data projects

How data governance shapes public trust and accountability in higher education

Behind every dashboard, retention model and “student success” metric lies a web of decisions about who owns information, who can see it, and how it may be used. In universities, these decisions are no longer purely technical; they are increasingly understood as political acts that can either erode or deepen public confidence. Clear rules about consent, data minimisation and secondary use signal that institutions recognise students and staff not as data points, but as partners with rights and agency. When those rules are opaque, or applied inconsistently, suspicions grow that analytics are being driven more by market logic than by educational values, fuelling concerns about surveillance, bias and the quiet sidelining of vulnerable groups.

  • Transparency: publishing what data is collected, why, and for how long
  • Participation: involving students and staff in setting data-use boundaries
  • Redress: offering clear routes to challenge automated or data-informed decisions
  • Independent scrutiny: opening systems and algorithms to external review
  Governance choice       | Public signal
  Opt-in learning analytics | Respect for autonomy
  Open impact reports     | Willingness to be judged
  Shared policy drafting  | Commitment to co-governance

These apparently procedural moves have real political consequence. Where governance is robust and participatory, universities are better positioned to justify decisions on admissions, funding or academic performance in ways that withstand media scrutiny and legal challenge. Conversely, scandals over data breaches or opaque algorithmic tools can rapidly damage institutional credibility, inviting intervention from regulators and lawmakers. In this sense, information practices become a barometer of civic responsibility: they reveal whether higher education treats data as a private asset to be exploited, or as a public good to be stewarded with care and shared responsibility.

Building ethical data practices in research and student services

At Queen Mary, safeguarding the people behind the numbers means designing data workflows that respect autonomy as much as accuracy. Researchers and student services teams are beginning to treat consent as an ongoing conversation rather than a one‑off form, foregrounding transparency about what is collected, why it is stored, and how it might shape decisions about funding, wellbeing interventions, or course design. This shift is supported by practical measures such as privacy‑by‑design templates, ethics reviews that focus on lived impact rather than box‑ticking, and cross‑disciplinary panels that bring legal, technical and pastoral expertise into the same room.

Within student services, ethical data practice is becoming part of everyday operations, not a specialist concern reserved for formal studies. Staff are encouraged to interrogate algorithmic tools used for attendance monitoring, risk flagging and engagement analytics, ensuring that seemingly neutral dashboards do not reproduce bias or penalise students who take non‑linear paths through higher education. Guiding principles include:

  • Proportionality – collecting only what is needed to support learning and wellbeing.
  • Accountability – making decision paths and data access logs visible to staff and students.
  • Co‑creation – involving student representatives in drafting data charters and feedback mechanisms.
  • Care – treating data as a relationship, not a commodity.
  Area              | Ethical Focus     | Example Action
  Research projects | Informed consent  | Plain‑language data summaries for participants
  Student analytics | Bias reduction    | Regular audits of predictive models
  Support services  | Data minimisation | Short retention periods for sensitive records

Policy recommendations for universities navigating the politics of data

As institutions become both generators and custodians of vast datasets, their credibility hinges on clear, enforceable frameworks that anticipate conflict rather than merely reacting to it. Universities should embed data governance into core academic processes, not treat it as a peripheral IT issue. This means establishing cross-disciplinary committees that include legal scholars, computer scientists, ethicists, students and community partners, tasked with regularly reviewing how data is collected, stored, shared and monetised. Key priorities include:

  • Transparency about what data is collected, why, and who can access it.
  • Consent mechanisms that are intelligible, revisitable and easy to withdraw.
  • Safeguards for vulnerable groups whose data may carry heightened political risk.
  • Clear red lines on commercial use, political profiling and law-enforcement access.
  Area              | Risk                | Mitigation
  Student analytics | Surveillance stigma | Opt-in dashboards
  Research data     | Political misuse    | Ethics-by-design
  Partnerships      | Corporate capture   | Public interest tests

Institutions should also treat data disputes as democratic questions, not purely technical glitches. Embedding data politics into curricula, staff training and student representation ensures that decisions about algorithms, platforms and partnerships are subject to robust debate. Practical steps include:

  • Creating data ombudspersons or independent panels to hear complaints and appeals.
  • Publishing impact assessments for major data initiatives, open to public comment.
  • Negotiating contracts that protect academic freedom, safeguard whistleblowers and mandate open publication of methods, not just results.
  • Aligning all data practices with broader institutional commitments to equity, decolonisation and civic engagement, so that technical choices are always read against their social consequences.

In Retrospect

As data seeps further into the fabric of public life, the questions being raised at Queen Mary are unlikely to fade. If anything, they are set to become more urgent. Who designs the systems that order our lives, and in whose interests? How can democratic institutions keep pace with technologies that evolve faster than regulation? And what does accountability look like in a world where decisions can be traced to lines of code as much as to elected officials?

By treating data not as a neutral resource but as a political force, researchers at Queen Mary are helping to reframe the debate. Their work suggests that the future of data-driven societies will not be resolved by engineers alone, but through ongoing negotiation between citizens, states and corporations.

As governments and tech firms race to harness ever larger datasets, the politics of data is no longer a niche academic concern. It is fast becoming one of the central battlegrounds of contemporary democracy – and the questions being asked in Mile End’s lecture halls may yet shape how that contest unfolds.
