Securus, Like MINORITY REPORT: The Evolution of Algorithmic Surveillance in American Prisons
What is algorithmic surveillance? In Philip K. Dick’s story that inspired Minority Report, a predictive system allowed authorities to arrest people before any crime was committed.
That dystopia is now reality inside American prisons. Securus Technologies has trained Artificial Intelligence on years of incarcerated people’s communications and, according to public statements, also on seven years of phone calls from the Texas prison system. Kevin Elder, the company’s president, claims the system detects crimes when they are “contemplated.” Meanwhile, communication costs can reach predatory levels: you pay to speak—and by speaking, you fund the development of the technology that surveils your loved ones.
HOW WE GOT HERE
A 15-minute phone call from an American prison can cost more than twenty dollars. During that call, the provider records every word, every pause, every shift in the incarcerated person’s voice. That conversation with a mother, a child, or a partner becomes training material for an Artificial Intelligence system that is then sold back to the very infrastructure that holds the people who generated the data.
Securus Technologies built a $1.2 billion empire on this mechanism, operating an extraction model that penetrates and exploits every layer of the relationship between incarcerated people, their families, and the prison system. First comes systematic data extraction: every voice becomes a biometric resource stored permanently, and every conversation feeds proprietary archives. Then come the profits: call rates that look “manageable” can add up, for a family maintaining weekly contact, to $1,200–$1,500 per year just to communicate. Finally, Securus sells its product, predictive surveillance, back to prisons, while shifting development costs onto incarcerated people. Surveillance can be imposed without meaningful consent. A warning light: if it works there, it can be normalized anywhere.
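The yearly figure follows from simple arithmetic. A minimal sketch, where the per-call rates are illustrative assumptions drawn from the ranges cited in this article, not published tariffs:

```python
def annual_cost(per_call_rate: float, calls_per_week: int = 1, weeks: int = 52) -> float:
    """Annual spend for a family keeping regular phone contact."""
    return per_call_rate * calls_per_week * weeks

# Hypothetical cost of one 15-minute call, at the low and high end
# of the ranges discussed above:
for rate in (23.00, 29.00):
    print(f"${rate:.2f} per call -> ${annual_cost(rate):,.0f} per year")
```

At roughly $23–$29 per weekly call, a single family lands squarely in the $1,200–$1,500 range.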

In December 2024, the company publicly admitted it had developed a predictive surveillance system trained on years of incarcerated people’s communications—phone calls, video calls, emails. Kevin Elder is blunt: their language model can detect “when crimes are being thought about or contemplated, so we can intercept them much earlier in the cycle.”
A detail that matters: when we talk about “seven years of calls,” we’re not talking about a single prison, but the Texas prison system. And the “real-time” pilot has no declared geography: the company does not publicly indicate which facilities it is operating in, across jails, prisons, and detention centers (including ICE). Opacity isn’t a bug—it’s the model.
The key word is “contemplated.” The system doesn’t merely monitor actions already taken: it aims at thoughts, at intentions that have not yet crossed the line between mind and world. The economy of algorithmic prophecy found its perfect laboratory: a population with no alternatives, forced to pay to communicate, unable to refuse consent.
ALGORITHMIC SURVEILLANCE: ANATOMY OF A CRIME THAT DOESN’T EXIST
Welcome to Minority Report: Securus and the “Pre-Crime” Empire
The story begins in 2023, deep inside the Texas prison system.
Securus controls communications in thousands of facilities: state prisons, county jails, ICE detention centers. Millions of daily conversations flow through its infrastructure, leaving permanent traces of voices, words, and relationships.
Before 2023, surveillance often relied on reactive logic: predefined keywords (“drugs,” “weapon,” “escape”) triggered alerts that human operators had to verify by listening to recordings. Slow, costly, inefficient.
Securus’ insight: delegate the work to an AI capable of processing everything, always, searching for patterns invisible to the human ear.
The Texas corpus: seven years of voices
In 2023, Securus obtained seven years of recordings from the Texas prison system. Hundreds of thousands of hours of conversations, millions of words spoken by voices that believed—naively—they were speaking only to their families. Among those recordings, there are also attorney-client communications that should be protected by legal privilege.
This corpus becomes the training archive for a custom-built language model: not the open internet, not public texts. Prison slang, codes, euphemisms, emotional patterns, relationship structures.
The shift: from “after” to “before”
The system operates as an integrated chain: the conversation is transcribed, analyzed, and evaluated as it happens. Investigators can focus the system on specific incarcerated people or enable broader sampling across the general population.
The algorithm highlights segments labeled “high risk” according to opaque criteria. These fragments are delivered to an operator who decides whether to investigate further. “Human oversight” is meant to avoid full automation, but in practice it can turn operators into validators of decisions already produced by the machine.
Securus claims it has prevented trafficking, gang operations, and contraband organized by corrupt guards. It provides no publicly verifiable cases, no independent audits, no evidence. It demands blind trust in a proprietary system while selling that opacity as “security.”
Algorithmic surveillance paid for by the incarcerated
Families pay some of the highest rates in the country to communicate. Securus records every conversation and captures two assets at once: immediate revenue from the calls and training data for its proprietary models. Those data are turned into surveillance tools and sold back to the same institutions that detain the people who generated the data.
The real innovation is political: building a framework where “security” and “AI” become a permanent justification for fees, surcharges, delays, and provisional caps. The device doesn’t have to work perfectly: it has to be paid for, defended, renewed. And the payment is extracted from those who cannot say no.
You pay to speak. By speaking, you train the AI. The AI surveils you. You pay more to fund that surveillance. The cycle feeds itself: poverty into data, data into profit, profit into more extraction.
Bianca Tylek, executive director of Worth Rises, calls it “coerced consent”: incarcerated people have no communication alternatives. Either they surrender their biometric voice data, or they lose contact with their families.
The political turning point: the FCC timeline (2024–2027)
2024 is the year of regulatory conflict. The Martha Wright-Reed Act pushes the FCC to intervene on prison communication rates and ancillary fees. In 2024, the FCC adopts a reform order that promises major cuts and a new cost architecture. But on January 20, 2025, Brendan Carr becomes Chair of the Federal Communications Commission and, months later, the shift arrives: a waiver moves full compliance to April 1, 2027. In October 2025, higher “interim” caps are also voted through. The consequence is simple: the market isn’t broken—it’s extended.
How Securus’ AI works, technically
The system goes beyond simple keyword detection. Neural models analyze meaning, time context, and semantic relationships distributed across weeks and months of conversations.
Each call first goes through speech-to-text. But transcription is not enough: the system must identify speakers, distinguish the incarcerated person from the family member, and recognize other voices through biometric vocal features (tone, cadence, timbre).
From there, the core mechanism relates each word to the others around it, producing context-dependent meaning. This makes it possible to connect seemingly harmless phrases, spoken days or weeks apart, into a sequence the model interprets as planning.
The final layer produces a probabilistic score: not “certainty,” but likelihoods built from statistical correlations learned from the training set.
In parallel, sentiment and prosody analysis evaluates changes in tone, volume, and speaking speed, integrating vocal and textual signals. The system’s power emerges from memory: it does not evaluate a single call in isolation but pulls from the entire communication history of the incarcerated person.
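The chain described above can be sketched as code. This is a deliberately crude toy under stated assumptions: every name here (`CallRecord`, `score_call`, `process_call`) is a hypothetical stand-in, the speech-to-text and biometric stages are placeholders, and the "scoring" is a naive word-overlap heuristic standing in for a learned model; nothing here reflects Securus' actual implementation.

```python
from dataclasses import dataclass

@dataclass
class CallRecord:
    """One monitored call in a person's accumulated history."""
    speaker_id: str          # stand-in for a biometric voice match (tone, cadence, timbre)
    transcript: str          # output of the speech-to-text stage
    risk_score: float = 0.0  # a probability-like score, not a certainty

def score_call(transcript: str, history: list[str]) -> float:
    """Toy stand-in for the scoring layer: flags calls whose wording
    overlaps with earlier calls by the same person -- the cross-call
    'memory' described above. A real system would use learned models."""
    words = set(transcript.lower().split())
    overlap = sum(len(words & set(past.lower().split())) for past in history)
    return min(1.0, overlap / 10)  # squash into a 0..1 score

def process_call(transcript: str, history: list[str]) -> CallRecord:
    # In the real chain, speech-to-text and speaker identification
    # run first; both are placeholders here.
    speaker = "speaker-A"  # hypothetical biometric match
    return CallRecord(speaker, transcript, score_call(transcript, history))
```

Even this toy shows the structural point: the score of any one call depends on the entire stored history, so nothing an incarcerated person has ever said is truly in the past.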
ALGORITHMIC SURVEILLANCE: BIAS IS THE SYSTEM
The idea that algorithms are “neutral” ignores embedded prejudice: language, race, class, territory become operational variables.
The science of coded prejudice
Studies on commercial NLP models have shown that equivalent texts, written in Standard American English and African American Vernacular English (AAVE), can receive dramatically different “toxicity” classifications.
Why? Models are trained on corpora dominated by the standard; legitimate language varieties are treated as “deviations” and therefore associated with aggression or risk.
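The mechanism is easy to demonstrate with a toy model. The sketch below is an invented illustration of the corpus-dominance effect, not any vendor's real classifier: a model whose vocabulary comes only from Standard American English ends up treating unfamiliar but perfectly legitimate dialect tokens as "deviation," and deviation as risk.

```python
# A classifier "trained" only on a Standard American English corpus:
# its entire worldview is this vocabulary (invented for illustration).
TRAINING_VOCAB = {"i", "am", "going", "to", "the", "store", "later"}

def oov_risk(text: str) -> float:
    """Fraction of tokens the model has never seen.
    In this toy, 'never seen' is what gets labeled as risk."""
    tokens = text.lower().split()
    unseen = [t for t in tokens if t not in TRAINING_VOCAB]
    return len(unseen) / len(tokens)

sae  = "i am going to the store later"
aave = "i finna go to the store later"  # equivalent meaning
print(oov_risk(sae), oov_risk(aave))    # the SAE sentence scores 0.0
```

Two sentences with identical meaning receive different scores for no reason other than which variety of English dominated the training data.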

False positives and a self-reinforcing spiral
The system can generate false positives: colloquial or dialectal expressions, out-of-context readings, ambiguous words. When a group is flagged more often, the sanctions that follow produce “official” documentation that cycles back into training, strengthening a spurious correlation: non-standard language = risk.
The opacity problem
If you are flagged, you don’t know why. You can’t see the data used against you. There’s no transparency, no independent audit, no meaningful appeal. The machine isn’t just a sensor—it becomes a standard of reality.
FOLLOW THE MONEY: ALGORITHMIC SURVEILLANCE BACKED BY INDUSTRY GIANTS
To understand Securus is to read the incarceration economy as an industry: captive markets, contracts, regulatory rents, extraction of value from poverty and isolation.
80 billion: the incarceration economy
The United States spends roughly $80 billion every year to incarcerate millions of people. More than half of those budgets flow to private providers: communications, technology, healthcare, transport, surveillance infrastructure. It’s not “just” private prisons: it’s an industrial ecosystem.
The 13th Amendment loophole
The clause in the 13th Amendment allowing involuntary servitude “as punishment for a crime” enabled the historical continuity of forced prison labor. Today extraction is not only about labor: it’s about cognitive data. Not only bodies—thoughts turned into a monetizable surface.
ALGORITHMIC SURVEILLANCE AND PREDICTIVE POLICING AS DESTINY
Securus is not an anomaly: it’s an outpost. Predictive policing programs have shown bias and ineffectiveness in multiple contexts, yet the spread continues because the business model doesn’t depend on results: it sells a perception of control, legitimizes budgets, and produces contracts.
Police departments and security agencies around the world treat Minority Report not as a warning, but as a template. The film ended with the system dismantled: the moral was clear. Twenty years later, many institutions choose the opposite.
FOUR LEGAL FRONTS FOR SECURUS’ ALGORITHMIC SURVEILLANCE
January 2026 marks a turning point: Securus’ business model faces simultaneous pressure on multiple fronts. Four parallel proceedings could redefine the company’s operating limits.
1. FCC and the Martha Wright-Reed Act: the end of inflated rates
The Federal Communications Commission adopted reforms aimed at separating communication costs from “ancillary services” such as surveillance. The rule includes compliance dates phased between January 2025 and April 2026. Some providers, including Securus, obtained temporary waivers through September 2025 to complete billing system changes.
The most visible change: the cost of a 15-minute call could drop from more than $11 in some facilities to under $1. This directly affects the business model that enabled Securus to fund AI development through high rates. The FCC also prohibited “site commissions,” agreements in which providers shared profits with correctional administrations.
Securus and other operators filed appeals in multiple federal circuits, arguing that the new rates would prevent “fair compensation.” Courts have so far rejected requests to pause the order. FCC Commissioner Anna Gomez stressed that the costs of security should not fall on incarcerated people’s families.
2. Recording attorney-client calls: a systematic pattern
Securus has a documented history of unlawful recording. In 2020, the company and CoreCivic agreed to a $3.7 million settlement for recording attorney-client communications at the Leavenworth Detention Center and sharing them with prosecutors. Similar cases have surfaced in at least seven states: California (over 14,000 privileged calls recorded), Kansas, Louisiana, Maine, Missouri, Texas, Wisconsin.
In New York, a 2021 internal audit identified more than 1,500 privileged calls recorded, involving hundreds of defendants. In one Maine case, an attorney was recorded 304 times while speaking with clients. Investigations found that automatic transcription systems did not consistently exclude numbers designated as “protected,” pushing privileged material into surveillance databases.
ACLU’s David Fathi called this “potentially the largest violation of attorney-client privilege in modern American history.” In 2015, a leak of Securus databases exposed 70 million recorded calls, including tens of thousands involving attorneys.
3. Supreme Court and location data: the case under review
On January 13, 2026, the U.S. Supreme Court agreed to review the FCC’s authority to fine telecom companies for selling customer location data. The case directly involves Securus: a Missouri sheriff used location data obtained through Securus (and sourced from AT&T, Verizon, and T-Mobile) to track people without a warrant. The sheriff was convicted for violating constitutional rights.
In 2024, the FCC issued roughly $200 million in fines against the three carriers for selling location data to intermediaries such as Securus. The carriers challenged the FCC’s authority. Federal courts issued conflicting rulings: the 2nd and D.C. Circuits upheld the fines, while the 5th Circuit sided with AT&T. The Supreme Court consolidated the cases and is expected to rule by June 2026.
If the Court upholds the FCC’s authority, Securus and its telecom partners could face stricter limits on buying and selling personal data—undermining a key piece of the surveillance ecosystem.
4. California AB 2013: transparency on training data
As of January 1, 2026, California requires developers of generative AI systems to publish information about the data used for training. Assembly Bill 2013 requires disclosures on dataset sources, the number of data points, whether the dataset includes copyrighted material, whether it contains personal information under the California Consumer Privacy Act, and whether synthetic data were used.
The law does not provide broad exemptions for trade secrets. This triggered immediate industry pushback: xAI filed a lawsuit on December 29, 2025, arguing that the rule violates the Fifth Amendment by compelling disclosure of proprietary information. The law can be enforced under California’s Unfair Competition Law, enabling both public and private actions.
For Securus, this could mean potential obligations to disclose: the volume of conversations used to train models, whether privileged communications were included, whether meaningful consent existed, and whether minors’ communications were used. Forced transparency could expose practices previously protected by proprietary opacity.
THE ALGORITHM THAT CREATES THE FUTURE IT PREDICTS
We have reached a threshold: surveillance shifts from actions to intentions, from punishing crimes committed to anticipating crimes imagined. “Justice” risks being built on opaque probabilities, on correlations extracted from data contaminated by decades of selective policing.
Securus engineered a closed loop: poverty into data, data into profit, profit into more extraction. A device in which those trapped fund their own confinement through every attempt at human contact.
The developments of January 2026 show significant cracks in the model: sharply lowered FCC caps, litigation over privileged recordings, a Supreme Court review of location-data sales, and California forcing disclosure of training datasets. But the question remains: is it enough to stop “pre-crime” from becoming infrastructure?
One question remains: what happens when the last zone of privacy—the space where thought exists before it becomes speech—is treated as territory that profit can colonize?
A call can cost predatory amounts; thought, for now, still moves without a tariff. But the model is already clear: create dependency on communication, turn communication into data, sell data as control, charge the controlled to fund their own control. How long remains before even thinking becomes a market cost?
Sources: algorithmic surveillance
- Benton.org summary of MIT Technology Review: AI trained on prison phone calls to look for planned crimes (December 2024)
- Worth Rises: mapping private-sector players in the prison industry (2024 update)
- ACLU: prison phone company surveilling people who haven’t committed crimes
- Oxford Insights: racial bias in NLP (2019)
- NAACP: AI in predictive policing (2024)
- Blodgett & O’Connor (2017): language bias and AAVE classification
- FCC: public documents on prison telecom reforms (order, waivers, interim caps)
Further reading (2026 update)
- FCC: waiver order related to Securus billing changes (December 2024)
- Broadband Breakfast: litigation over FCC rate caps (July 2025)
- Filter Magazine: court rejects bid to stall rate-cap reforms (November 2024)
- Prison Legal News: settlement over illegal recording of attorney-client calls (2020)
- Investigation: attorney-client call recording across the U.S. prison telecom system (2021)
- Worth Rises: prison call surveillance and rights (2021)
- EPIC: FCC comments on Martha Wright-Reed Act implementation
- MediaPost: Supreme Court review of FCC privacy fine authority (January 13, 2026)
- California: AB 2013 bill text on training data disclosure
- Legal analysis: AB 2013 disclosure obligations (effective January 1, 2026)
- Legal analysis: challenges to training data disclosure law (January 2026)
CYBERMEDIATEINMENT
DECODE > RESIST > RECLAIM
