
Meta sues surveillance company for fake accounts

13 January 2023


Making money off watching users is our job

Meta has sued to block a surveillance company from using Facebook and Instagram.

The social notworking site claims that Voyager Labs partnered with law enforcement and created thousands of fake accounts to harvest user data.

The case comes after a Guardian investigation revealed the company had partnered with the Los Angeles police department (LAPD) in 2019 and claimed that it could use social media information to predict who may commit a future crime.

Public records obtained by the Brennan Center for Justice, a non-profit organisation, and shared with the Guardian in 2021, showed that Voyager’s services enabled police to surveil and investigate people by reconstructing their digital lives and making assumptions about their activity, including their network of friends.

In an internal record, Voyager suggested that it considered using an Instagram name displaying Arab pride or tweeting about Islam as signs of potential extremism.

The lawsuit, filed in federal court in California, details activities that Meta says it uncovered in July 2022, alleging that Voyager used surveillance software that relied on fake accounts to scrape data from Facebook, Instagram, Twitter, YouTube, LinkedIn and Telegram.

According to the complaint, Voyager created and operated more than 38,000 fake Facebook accounts to collect information from more than 600,000 Facebook users, including posts, likes, friends lists, photos, comments and input from groups and pages.

The affected users included employees of non-profits, universities, media organisations, healthcare facilities, the US armed forces and local, state and federal government agencies, along with full-time parents, retirees and union members, Meta said in its filing.

It is unclear who Voyager's clients were then and what entities may have received the data. But Voyager, which has offices in the US, the United Kingdom, Israel, Singapore and the United Arab Emirates, designed its software to hide its presence from Meta and sold and licensed the data it obtained for profit, the suit says.

Meta’s director of platform enforcement and litigation Jessica Romero said: “Some impacted people don’t fit the criminal profile Voyager tries to sell as the focus of their data collection and analysis.”

Some features that Voyager advertised in the records obtained by the Brennan Center posed significant ethical questions, including one the company called an “active persona”, which appeared to facilitate police use of fake profiles to gain access to otherwise private information on Facebook.

In November 2021, after the internal records were revealed, Facebook sent the LAPD a letter demanding that it cease all social media surveillance use of “dummy” accounts, saying fake accounts were a violation of the company’s policy requiring that people use their real names. The Meta-owned platform also noted that using data obtained from the platform for “surveillance, including the processing of platform data about people, groups, or events for law enforcement or national security purposes” was prohibited.

While it is unclear whether the LAPD ultimately used the fake profile feature while working with Voyager, emails showed that officers said it was a “great function” and a “need-to-have” service.

Voyager is part of a broader industry, which includes better-known players such as Palantir, that purports to make crime predictions based on past behaviours and activity, including those shared on social media.

While the practice has been criticised by privacy and civil liberty advocates as a pseudoscience that does little more than perpetuate bias and discrimination in policing, law enforcement agencies remain eager to acquire tools that purport to make their jobs more efficient and validate their decisions. And tech firms working to offset the industry's slowing growth have increasingly answered law enforcement's call for new surveillance and policing products.

 
