Open MIC in the News
Contact us at media@openmic.org with any press inquiries.
As the Interfaith Center on Corporate Responsibility (ICCR), a coalition of faith-based and values-based investors, and Open MIC, a nonprofit focused on responsible use of digital technologies, explain in their new report, Dehumanization, Discrimination and Deskilling: The Impact of Digital Tech on Low-Wage Workers, a critical element of algorithmic management systems is the monitoring and surveillance of workers in violation of their human rights.
“Generative AI makes up information. It creates information and misinformation that doesn’t exist. So, yes, a wide number of experts are concerned about it, and the shareholder proposal at Alphabet, and ones we’ve raised at other companies, reflect that concern. One survey shows that, because of those concerns about generative AI, public trust in technology generally is eroding, and that’s one of the things that concerns us about the public franchise that these companies have.”
“These AI resolutions are just the beginning – this is not something that we see as a one-off,” said Dheere. “AI is here to stay and it is in the interest of civil society, investors, companies, and government policymakers to maintain focus on how we integrate it constructively into society while protecting human rights – and, frankly, the companies that are creating it.”
Michael Connor of Open MIC: “…we've embarked over the years on a whole range of issues, ranging from the need for federal privacy laws writ large to artificial intelligence, facial recognition, all sorts of issues. And more recently working with both Arjuna and with my colleague Jessica at Open MIC on questions of artificial intelligence and what that means for misinformation and disinformation, as well as dealing with Ekō and Christina. So the three organizations have been involved in a big effort lately about artificial intelligence.”
Arjuna Capital, which specializes in sustainable investing and manages $319 million, partnered with the advocacy group Open MIC to submit proposals calling on Alphabet and Meta to produce in-depth reports on the dangers of generative AI’s deployment in misinformation campaigns and how the companies plan to address them.
“Alphabet and Meta need to assure billions of users and their shareholders that their management and boards are up to the task of responsibly managing [AI] technology,” Open MIC Executive Director Michael Connor said.
“Their desire to maximize the value of TikTok may conflict with their desire to promote the interests of the Jewish people and Israel,” said Michael Connor, the executive director of Open MIC, an advocacy organization focused on corporate accountability in the tech industry. “There may be that conflict and they may not want to wade into it. It’s a complicated situation.”
Without that trust, no system or institution that relies on the accurate communication and assimilation of fact—not democracy, not financial markets, not health or the environment, not small business, not policy advocacy, not human rights—will survive as we know it.
Before we reach the tipping point, which I and many of my colleagues think may well be next year’s elections, it would behoove us to remember that other axiom about moving fast: Speed kills.
Tech firms began reporting details about government interactions a decade ago, under pressure from investors. Takedown demands from private parties, as in the Harvard case, can put companies in the awkward position of choosing sides, said Michael Connor, executive director of Open MIC, which has successfully pressed companies for details about government requests. Connor said of Google that, like other tech companies, "They don't like being in the position of being content moderators, and they'd rather they didn't have to."
"We should not be turning the classroom into a combat zone or a surveillance zone," said Michael Connor, executive director of Open MIC, a nonprofit that is urging Axon's shareholders to vote at a shareholder meeting next month to abandon the plan for drones in schools and other public places.
The lack of information is a problem for shareholders and users alike. Investors play a critical role in holding corporations accountable by ensuring that companies like Amazon are upholding their human rights commitments and avoiding risky behaviors. But without more transparency, investors are left in the dark.
“You've got one person who is the CEO and the chairman and the largest stockholder in the company, and he frankly is responsible for a number of bad business decisions,” Connor says. “The share price took an almighty hit, and the solution was something close to a gimmick to try and bring it back.”
Because IPG has now followed up by publicly stating that Acxiom does not collect personally identifiable information related to sensitive locations such as abortion clinics, nor does it collect granular purchase data, Open MIC has withdrawn its proposal.
“A big part of our concern really comes down to transparency,” Dana Floberg, Open MIC’s advocacy director, told MarketWatch on Thursday. It’s unclear exactly what data Acxiom collects, Floberg added, as well as “what kinds of measures they may or may not be taking to ensure they keep that data safe, and what they turn over to law enforcement.”
"All the shareholder proposal asks is that an independent third-party examiner come in and look at the evidence in an organized manner and try to help the company understand what its products are doing to people," Michael Connor, one of the activists, said in an interview Tuesday with Reuters.
“Alphabet is one of the most influential companies on the planet that shapes people’s attitudes about all sorts of things through search, through YouTube,” said Michael Connor, executive director of Open MIC, a corporate responsibility group that also signed the letters. “The proposal … is not an extraordinary request.”
Record ESG proposals are being fueled by a more shareholder-friendly SEC regime and growing awareness of shareholder engagement tools, according to Michael Connor, the executive director of Open MIC, an organization that works with institutional investors to file social impact shareholder proposals targeting Big Tech. Companies usually try to block shareholder resolutions through legal challenges, but “right now, the folks who make those decisions at the SEC are more favorably inclined toward shareholder resolutions,” Connor told Protocol.
“We need to be realistic about the outcomes, especially at companies like Alphabet and Meta, which have ‘dual class’ shares that give the company founders and other insiders powerful voting rights that make shareholder initiatives extremely challenging,” said Michael Connor, Director at Open MIC, a nonprofit that campaigns for corporate accountability in media and tech.
In response to a shareholder proposal that would require the company to produce a public report on how nondisclosure agreements affect harassment and discrimination claims, Google parent Alphabet said in a proxy statement that its “employment, severance, and settlement agreements do not prohibit the disclosure of facts underlying claims of harassment or discrimination.” A shareholder proposal to study the risks associated with confidentiality clauses also passed a vote.
Despite protests from Meta, the US Securities and Exchange Commission (SEC) ruled that the vote should go ahead, which Michael Connor, Executive Director of Open MIC, noted is “a win for all those who are deeply troubled by Meta’s appalling track record of dodging accountability and failing to address human and civil rights abuses, as well as privacy concerns affecting billions of people globally”.
“For the last year or so we have been filing shareholder proposals and pressing companies on this subject,” said Michael Connor, executive director of Open MIC, one of the members of the Transparency in Employment Agreements, or TEA Coalition, which has been pressuring shareholders to act. Salesforce was facing a proxy vote at an upcoming annual meeting asking it to prepare a report on how mandatory so-called concealment clauses might stifle disclosure. The announcement today withdraws that proposal.
With AI-related risks being inherently global in nature, the hope and anticipation is that there will be a suitably robust regulatory environment wherever the technology is deployed, Michael Connor, Executive Director at Open MIC, tells ESG Investor.
“Many companies have global activities, so we need to ensure that if they are applying a higher standard in Europe due to EU regulations, this should be the case wherever they operate to foster greater corporate accountability in the deployment and use of digital technologies,” he says.