US Campuses Have Become the Newest Laboratories for Surveillance Technology


In early June 2025, The Guardian revealed that the University of Michigan had paid over $800,000 to Amerishield, the parent company of City Shield, a Detroit-based private security firm. The payment was part of a broader $3 million public-security budget that included surveillance of pro-Palestinian student activists. The university hired plainclothes agents who trailed students into cafés, harassed them, and even staged confrontations, in one case faking a disability to falsely accuse a student of theft. City Shield then used the supposed evidence its agents collected against these students to have them prosecuted and jailed.

Massive Blue, a company based in New York, has created a surveillance tool called Overwatch that uses AI to monitor online spaces. Although Overwatch is marketed as a public safety tool, it relies on AI-generated virtual characters that infiltrate online groups, take part in conversations, and collect intelligence, particularly targeting college protesters and activists labeled as “radicalized.” These bots are designed to imitate human behavior, making them very hard to detect.

The tech stack these agencies deploy is formidable: geofencing tools, license plate readers, real-time social media surveillance, and predictive analytics. Even Radian6, a Salesforce product, has been linked to student protest monitoring. These tools don’t just observe behavior; they anticipate it, allowing administrators and police to intervene before a rally even begins.

This surveillance is not merely bureaucratic overreach. It is an act of intimidation, one that reflects an ideological alignment with systems of repression abroad. In Gaza, for example, humanitarian aid is increasingly distributed only through biometric registration, leaving starving Palestinians with no choice but to submit to facial and fingerprint scans to access food. Although aid agencies and occupying forces justify this as a form of “efficiency,” it can only be seen as coercive surveillance, stripping Palestinians of dignity and autonomy under the guise of relief.

Within the United States, university officials have increasingly turned to firms like ShadowDragon, Skydio, and Stellar Technologies, whose tools are capable of profiling, analyzing, and geolocating social media posts, drone-mapping encampments, and even identifying masked protesters through AI-enhanced facial recognition. These companies aren’t developing tools for student safety. They’re developing battlefield tech, and students are the new targets.

In April 2024, when students at Columbia University set up a peaceful encampment to protest the genocide in Gaza, few expected the administration to respond with mass arrests. Fewer still understood the extent to which military-grade surveillance technologies were already in place to track them. In the same month, Jewish Currents reported on Yale University’s use of drones, surveillance cameras, and plainclothes officers to monitor pro-Palestine student activists. Surveillance footage was used to identify students who had not violated any laws but had simply been present.

Meanwhile, in May 2025, it was revealed that the LAPD used Dataminr to track the social media activity of students organizing pro-Palestine events. Surveillance reports included screenshots of Instagram stories, private group chats, and Twitter threads. Some of this data was sourced using tools built by Dataminr and Social Sentinel, both of which specialize in identifying “emerging threats” by combing through vast amounts of social media data; these tools were originally developed for use by law enforcement and intelligence agencies. In addition, in March 2024, it was revealed that Elon Musk’s X was also selling user data to Dataminr.

The logic is chillingly consistent: Dissent is pathologized, monitored, and neutralized while capitalists keep making money. And universities have become willing partners in this process.

In my Ph.D. research, I studied surveillance infrastructures, particularly in contexts where settler colonial regimes seek to erase dissent. What we’re seeing on U.S. campuses today mirrors repression models from places like East Turkestan (Xinjiang), where everyday resistance is quelled through predictive monitoring and data extraction. What this tells us is that surveillance isn’t about protection; it’s about power. And American universities are rapidly becoming testbeds for the kind of repressive technology we associate with authoritarian states abroad.

The money behind these technologies flows through a familiar nexus of venture capital firms, defense contractors, and government agencies. ShadowDragon, for example, has been celebrated by the World Economic Forum and boasts contracts with U.S. law enforcement and military branches. Cobwebs Technologies, Xtend, and Oosto are Israeli surveillance companies whose products are now being deployed against student activists in the U.S. These firms have marketed their tools on the strength of their effectiveness in tracking Palestinians and other “high-risk” populations. Now they are being used on American campuses to monitor solidarity.

There is a historical precedent for this kind of repression. During the civil rights and antiwar movements of the 1960s and ’70s, FBI programs like COINTELPRO targeted student activist groups like the Student Nonviolent Coordinating Committee (SNCC) and Students for a Democratic Society (SDS), and organizations like the Black Panther Party, with infiltration, wiretaps, and blackmail. The FBI’s targeting of student and civil rights activists through COINTELPRO is well-documented, as explored in a Yale News interview with historian Beverly Gage, who says J. Edgar Hoover’s tactics were both coercive and widely normalized. Hoover was the first director of the FBI, serving from 1924 until his death in 1972. The difference today is scale and automation. Rather than needing a human informant in every room, university administrators can now rely on machine learning algorithms to scan student emails, monitor group chats, and alert authorities to so-called “escalation indicators.”

Recently, billionaire investor Kevin O’Leary called for permanently destroying the digital lives of student protesters using AI-based doxxing campaigns. While O’Leary was widely condemned, his dystopian vision isn’t far from the reality already taking root. Companies like Babel Street and Stellar Technologies offer tools with the explicit promise of identifying individuals based on minimal data inputs like a partial image or a Twitter handle.

Tools like NesherAI claim to identify masked individuals in a crowd using behavioral and gait analysis. Corsight AI advertises facial recognition that works through masks, hoods, and sunglasses. These are not theoretical tools. They’re being piloted now, often in secret, and often in tandem with increasing police presence on campuses.

These developments must be viewed in the broader context of the militarization of civilian life. The pipeline between Gaza and Georgia Tech is not rhetorical. Technologies field-tested in war zones are refined and normalized on American campuses, where students are reduced to data points in behavioral prediction models. Israeli firms that developed surveillance tools for use in Gaza and the West Bank are now contracted domestically.

Surveillance technologies once reserved for military and foreign operations are now being deployed on U.S. university campuses, targeting student protesters, particularly those demonstrating for Palestine. These tools are not isolated innovations. Many were tested on Palestinians and Uyghurs before being repurposed for U.S. settings. This is not a breakdown of democratic norms but their systematic redirection, with universities serving as testbeds for next-generation policing.

University administrators often claim that surveillance is about safety. But whose safety? Lately, surveillance tools have mostly targeted the most vulnerable in our communities: Muslims, Indigenous people, undocumented immigrants, Black people, and anyone who supports Palestinian liberation and an end to the U.S.-sponsored genocide in Gaza. They claim that these tools are here to protect us, but in truth, they only serve to erase us, discipline us, and punish us.

It doesn’t have to be this way. Campuses have historically been sites of inquiry, dissent, and transformation, not laboratories for digital repression, and we must fight to keep them that way. We must reject the narrative that surveillance is inevitable. It is not. It is a political choice, made behind closed doors, mostly without the consent or oversight of those who are most affected.

Surveillance Watch is one intervention in this landscape, an attempt to shift the power back to those being watched. It is a community‑driven project that deploys interactive maps and databases to reveal the web of surveillance companies, their funders, subsidiaries, affiliations, and global operations in an effort to make opaque surveillance networks transparent. However, we need more. We need more public and legal accountability, more student organizing, more media scrutiny, and way more institutional courage. We need to say, unequivocally, that resistance is not a threat. It is a right.

We are not subjects. We are watching back.
