The Data Brokers Fueling ICE’s Deportation Machine—And the Union Shareholders Fighting Back

“As investors, we are thinking about this as a risk to our investments, but also as a social and ethical issue.”

Maurizio Guerrero

State Police clash with protesters by ICE facility in Broadview, Ill., on October 10. Photo by Jacek Boczarski/Anadolu via Getty Images

In the current political climate, the last bulwark against the abusive deployment of corporate-owned generative artificial intelligence might just come from union shareholders.

“AI tools are constantly evolving, and with no one stepping in to create some guardrails, they will become harder and harder to regulate,” says Emma Pullman, head of shareholder engagement at the British Columbia General Employees’ Union (BCGEU) in Canada. The union, representing more than 95,000 members in the public and private sectors, is a long-term investor in Thomson Reuters and is pressuring the Toronto-based data broker to align its AI products with human rights principles.

“As investors, we are thinking about this as a risk to our investments, but also as a social and ethical issue,” Pullman says.

The union has been calling on Thomson Reuters to stop just thinking like a software vendor, and start acting like a company whose products can shape people’s lives — by following international human rights standards, being transparent about the risks its tools may pose and updating its values to better reflect the power and reach of AI.


Ahead of Thomson Reuters’ annual shareholder meeting in June, BCGEU submitted a proposal raising concerns about the company’s AI product and its ties to President Donald Trump’s terror campaign against immigrants.

“Thomson Reuters’ products have been directly linked to human rights concerns, particularly through their use by U.S. government agencies including Immigration and Customs Enforcement (ICE) for immigration enforcement, raising serious questions about surveillance, due process and the rights of migrants and asylum seekers,” according to a May press release from BCGEU.

Far from an experimental side project, AI is deeply embedded in the core products of Thomson Reuters — best known for its newswire agency (Reuters) and its legal research service (Westlaw) — which makes efforts to set ethical guidelines for how AI is governed a challenging but urgent task.

“It’s like trying to turn the direction of a very massive vehicle, like a ship or a train,” says Sarah Lamdan, author of Data Cartels: The Companies That Control and Monopolize Our Information, with regard to changing the policies of data brokers like Thomson Reuters and LexisNexis.

Still, amid the almost total capitulation of Congress to the Trump administration’s deregulatory crusade, proposals from responsible investors remain one of the few ways to hold these types of companies accountable, as their products operate at the intersection of two woefully unregulated industries in the United States: data brokering and AI.

Thomson Reuters, for example, has provided ICE with access to its Consolidated Lead Evaluation and Reporting (CLEAR) platform. CLEAR creates individual profiles of people by aggregating and connecting thousands of datasets full of typically hard-to-find information, such as court records, business filings, driving records and social media, along with proprietary databases. One of CLEAR’s features is its “risk analysis summary,” which uses generative AI to “identify areas of concern that might merit further investigation,” according to the company’s product description.

Along with Accurint (a similar product from LexisNexis), CLEAR has become central to the U.S. surveillance apparatus and deportation machine. Both tools create these encyclopedic dossiers (Accurint alone uses more than 10,000 sources) to sell to government agencies, law enforcement and private entities without judicial oversight, undermining government privacy protections and bypassing established legal safeguards, such as sanctuary ordinances.

An official statement of work document — outlining how ICE Enforcement and Removal Operations (the ICE division which focuses on noncriminal immigration cases) would use CLEAR — states the platform would provide ICE with “any information that identifies the possible location of the target and changes in the target’s identifiers, such as addresses, phone numbers, email addresses, user names, new aliases, date of birth changes, SSN changes, utility changes, arrests, credit checks, death registry information, employment changes, insurance changes and affiliated organizations through which a location can be derived.”

According to Lamdan, products from Thomson Reuters are “critical to the surveillance infrastructure that ICE relies on.” Like other data broker products, CLEAR and Accurint become more valuable the more they amass the industrial amounts of individuals’ data that AI systems need to operate, she adds.

And they operate with no meaningful regulation.


In 2024, the Consumer Financial Protection Bureau made a move to designate data brokers as “consumer reporting agencies,” which would enable the agency to restrict the sale of people’s sensitive information to marketing companies or law enforcement. The Trump administration, however, canceled the proposed rule before it could be fully finalized and also hindered the Federal Trade Commission’s ability to protect consumers from the unlawful sale of personal or sensitive location data. The administration also rolled back the Biden-era executive order mandating standards for safe and trustworthy AI use by federal agencies, then tried to ban states from regulating AI. When Congress rejected the proposal, Trump released his AI Action Plan restricting states from reining in this technology.

It is in the midst of this deregulatory spree that responsible investors are advocating for responsible corporate AI policies. Campaigns at Google’s parent company, Alphabet, Meta and Microsoft are ongoing, while the AFL-CIO Equity Index Funds — an investment trust for union members’ pension plans, with more than $12 billion in assets as of 2023 — has pressured the entertainment industry to adopt AI principles to counter the “dehumanization” of the U.S. workforce.

In Canada, BCGEU is taking the lead in pressing data brokers to establish guidelines and considering the launch of an investor campaign at the British conglomerate RELX, parent company of LexisNexis, to promote AI principles for its products. It’s part of a multi-year effort to hold companies accountable for their AI practices.

“The people with the least leverage, unfortunately, are the subjects of data collection — which, in a sense, is all of us,” Lamdan says. So pushback from investors who find these tools “unconscionable or problematic or violating human rights” could be one way to curb some of the most outrageous abuse of tech on immigration enforcement.

Multi-year campaigns

The #NoTechForICE campaign, launched in 2018 and spearheaded by the grassroots organization Mijente, highlights the connections between the tech industry and the surveillance of immigrants, with Thomson Reuters being one of its main targets. Mijente also published a report titled “Who’s Behind ICE?” which revealed ICE abductions were powered by data brokers and tech companies — including Palantir, Amazon Web Services and Microsoft.

Jacinta González, co-author of the report and head of programs at MediaJustice, a nonprofit that investigates how corporations and governments abuse media and technology, explains that the information compiled from data brokers would be used to create “basically the target list for ICE when they were conducting raids.”

Data brokers’ tools, such as CLEAR and Accurint, have also been used to gather information on immigrant rights and abolitionist activists, making them targets for harassment and retaliation. 

“What we’re seeing is pure criminalization of organizing and clamping down on anyone who dares resist or have an opposing viewpoint to what Trump and his squad are saying,” González says. “The intention of all of this is to stifle political dissent.”

Consider the case of Cat Brooks, an activist against police violence who has been targeted by white supremacist groups because of her work, according to a class action lawsuit filed against Thomson Reuters in 2020. “[C]oncerned for her safety and that of her family,” Brooks subscribed to a service that routinely scrubbed her personal information from the internet. Still, CLEAR was able to construct, without her consent, a “360-degree view of her life,” according to the complaint, including her address, cell phone number and information about her relatives, neighbors and associates. The same complaint argued that the database violated the privacy rights of 40 million Californians by secretly collecting their data.

Thomson Reuters claims that CLEAR includes billions of records that are updated daily, meaning that even a recent move or new utility sign-up could be reflected in an individual search. For immigrants and activists, this expansive surveillance dragnet is terrifying and can have life-changing consequences. 

As Thomson Reuters was taken to court in California, BCGEU submitted a shareholder proposal for the company to implement human rights practices in line with those of other companies. The proposal received about 30% of the votes from independent investors; by 2021, the following year, 70% of these investors endorsed the proposal.


The union argued that CLEAR provides “the data that ICE uses in its Immigration Enforcement program to track, arrest, detain and deport foreign nationals — including children — on a massive scale.”

The union also stated that family separation and detention, which had become a huge scandal during the first Trump administration, “are illegal under international law,” adding that “we see this as an ethical issue as well as a risk to investors.”

In 2022, as BCGEU filed the proposal again, Thomson Reuters agreed to adopt the Guiding Principles on Business and Human Rights from the United Nations, which urge companies to do their due diligence to “identify, prevent, mitigate and account for” their human rights impacts, and to establish a process to enable “remediation for any adverse human rights impacts they cause or contribute to.”

Data brokers, which nonconsensually traffic information extracted from virtually every individual in the United States, rarely disclose how they collect, analyze or sell data. They give people no chance to see and correct errors or discriminatory assumptions that may be weaponized against them. For example, CLEAR has “risk inform” flags — usually associated with criminal records — that are automatically triggered when a targeted individual changes their name, according to the California complaint.

In another case, filed in a New York court in 2017, a woman argued that a CLEAR dossier had wrongfully stated she had been convicted of a crime, which prevented her from obtaining a job.

Furthermore, data brokers are using generative AI to ramp up the speed of investigative efforts, with algorithms trained on historical law enforcement data.

“I think it’s an important political moment for investors to take a stand because there is a huge crisis in human rights with how immigration enforcement is happening on the streets of cities across the United States,” González says. “It’s important for investors to acknowledge that having contracts with ICE is bad for human rights and reprehensible,” she adds.

Still, the proposal put forth by BCGEU in June received just under 20% of the vote from independent shareholders, or nearly 5% overall, says Pullman.

The Thomson Reuters board of directors recommended shareholders vote against the BCGEU proposal, claiming the company’s AI governance framework is “well-suited for effectively overseeing responsible use and development of AI.”

Almost 70% of Thomson Reuters shares are owned by The Woodbridge Company (the Thomson family holding company), while financial institutions and individual investors control about 27.5%. BCGEU holds less than 1% of the shares, enough to submit resolutions on the company’s governance.

Despite the ultimate results of the proposal vote, investor proposals can spur company leadership to address issues, argues Audrey Mocle, whether through continued engagement with shareholders or concrete actions. “Even 20% sends a pretty strong signal to the board of directors that this is something that matters,” says Mocle, deputy director at Open MIC, an organization working for accountability in the use of digital technologies through shareholder engagement.

From the shareholders’ perspective, the proposal is meaningful, Mocle adds, which can ultimately “begin a dialogue or a process of internal change.” At first, there may not be much knowledge among investors about the impacts of a certain technology, but “it’s a multi-year effort, a multi-year campaign,” and over time, “the impacts become clearer.”

Investors: the last bulwark

How much CLEAR’s operations changed after Thomson Reuters aligned with the UN’s guiding principles, in 2022, remains distinctly unclear. The company claimed, also in 2022, that it was reviewing its cooperation with ICE, while the value of its ICE-awarded contracts declined.

“To what extent we can directly connect this to the human rights framework? I can only speculate, but I think there’s something there,” Pullman says.

The Department of Homeland Security (ICE’s parent agency) currently has an open contract with Thomson Reuters worth $22.8 million that expires in 2026. In May, Thomson Reuters supplied ICE with a “law enforcement investigative database” subscription that includes license plate reader data to “enhance investigations for potential arrest, seizure and forfeiture.”

Thomson Reuters did not reply to specific questions, but in its public “CLEAR FAQ,” the company claims it “does not contain data about a person’s immigration or employment eligibility status” and states it is “not designed for mass illegal immigration inquiries or for deporting non-criminal undocumented persons and non-citizens.”

“This is their way of trying to muddy and confuse the waters as much as possible,” González says. “You don’t need concrete data about people’s immigration status to have that data be used in an immigration enforcement raid.”

Despite the company’s claims, ICE Enforcement and Removal Operations at one point had a contract to use the database and, according to an email referenced in a Washington Post report, an ICE agent claimed CLEAR was used to track a non-citizen who overstayed a visa — a civil, rather than criminal, matter.

BCGEU plans to resubmit its proposal at next year’s shareholder meeting, asking Thomson Reuters to extend its commitment to human rights to its AI governance framework. According to the union, the company is exposed to AI-related legal risks that can be detrimental to investors.

In 2024, for example, Thomson Reuters agreed to pay a $27.5 million settlement to the California plaintiffs who sued it three years prior for its “repeated violations” of consumers’ rights as it collected and sold their personal data without consent.

BCGEU’s proposal is endorsed by Glass Lewis, one of the largest proxy advisory firms in the United States. According to the firm, Thomson Reuters’ commitment is “broad and unspecific” and doesn’t mention the UN’s guiding principles, nor does it allow shareholders to “understand how the company’s framework is being applied or how human rights considerations are incorporated in their execution.”

Kit Walsh, director of AI and access-to-knowledge legal projects at the nonprofit digital rights group Electronic Frontier Foundation, says that shareholder proposals are “one of the tools to create some accountability where there isn’t enough.”

But a corporation’s commitment, Walsh warns, is “not a sufficient alternative to enforcement of international law and domestic laws designed to prevent companies from monitoring individuals and committing human rights violations.” In the “political climate” fostered by the current administration, going after human rights violators could be “an uphill battle.”

Over the past few months, pushing corporations to take greater responsibility has become more difficult, as many corporate boards are rethinking their environmental, social and governance commitments in an effort to avoid accusations of “wokeness.” But without accountability, the use of AI by data brokers collaborating in immigration enforcement will engulf more than undocumented individuals, according to migration policy experts.

“When you combine the use of these systems [data brokers and AI], which are inherently flawed and biased, with policing practices that don’t allow for correction or double-checking to ensure accuracy, that’s a very dangerous situation,” Lamdan says. “And it deprives people of due process, so people are just being taken away, stopped or removed. And they have no opportunity to say, ‘Hey, you’ve got the wrong person. Hey, that data is wrong. Hey, let me explain.’”

Despite the challenges, Mocle thinks pressure within companies should not be dismissed: “I certainly think that investors are one of the last really strong bulwarks against the unconstrained development of this new technology.”

Maurizio Guerrero is a journalist based in New York City. He covers migration, social justice movements and Latin America.
