As an organization working on corporate accountability in the tech and media sectors, Open MIC’s work is increasingly focused on the ways that companies enable harmful surveillance through the sale of their products and services. Tech companies often market their products and services — which include cloud storage, data analytics services, social media platforms, biometric technologies and more — as “neutral” tools. However, the use of big data is never neutral.
Tech companies are extracting and selling huge amounts of our personal information for profit, even when doing so causes harm. Harvard Business School scholar Shoshana Zuboff refers to this as surveillance capitalism: companies are monetizing our behavioral and biometric data at scale, often without our awareness or our meaningfully informed consent.
The surveillance capitalism business model perpetuates racist surveillance, including when tech companies sell products to government agencies. For example, in 2012, IBM used unauthorized video footage of everyday people to create a product that would help the New York City Police Department search for people by hair color and skin tone, even though the NYPD has a long record of racist policing. As another example, in 2016, Twitter, Facebook, and Instagram shared customers’ location data and social media information that ultimately ended up in the hands of government agencies spying on Black racial justice activists.
Surveillance is not safe, and it is not good business. Tech companies selling dangerous, unregulated products and services face reputational and business risks given the civil and human rights issues posed by these sales. Open MIC is working to hold tech and media companies accountable for perpetuating racism, discrimination and surveillance by promoting policies that protect consumer privacy and community safety, and that increase transparency and oversight.
Partnering with investors who share our values, Open MIC is working to shift this harmful surveillance business model.
Below are some examples of how tech companies’ products and services enable racism and government surveillance, and what Open MIC is doing about it. To learn more, check out our Campaigns Page.
| Social Harm | How Do Tech Companies Enable That Harm? | What Is Open MIC’s Role? |
| --- | --- | --- |
| Racial Profiling | Police use of facial recognition technology led to the false arrest of a Black man in Michigan. Hundreds of local police departments partner with Amazon’s Ring (and its corresponding Neighbors app) to gain access to household camera footage, creating a widespread surveillance system and stoking racial bias and fear in neighborhoods. | Open MIC helped organize shareholder resolutions at Amazon asking the company to assess the civil and human rights risks of facial recognition technology, and to disclose how the company assesses the potential harm caused by its surveillance tech products in the hands of government customers. |
| Voter Suppression, Housing Discrimination | Facebook established a policy that, in practice, allows politicians to spread voting misinformation and incite violence against Black people, among other harms. Facebook’s algorithms previously allowed advertisers to discriminate based on race, age, and gender in housing, employment and credit ads. | Open MIC organized shareholder resolutions at Facebook and Alphabet urging these companies to establish board-level civil and human rights experts as well as stronger oversight regarding civil and human rights issues. |
| Immigration Enforcement | Palantir provides data analytics services — hosted by Amazon cloud — to ICE, enabling the agency to carry out detention, deportation and workplace raids. Companies like Thomson Reuters, Microsoft and Salesforce also have contracts with ICE. | Open MIC educates investors about the business and human rights risks posed by tech companies’ contracts with ICE and other government agencies, as well as the risks posed by direct and indirect investments in Palantir, a privately held company. |