The responsible investment dilemma of big tech

Social media platforms and online retailers were among 2020’s biggest winners. But how are investors working with the firms to manage some of their many ESG risks?

While many companies have struggled to keep their heads above water through the lockdowns and turbulence of Covid-19, there is no denying that the ‘FAANGs’ are on a roll. Facebook’s share price climbed 33% last year, and in the midst of the pandemic Amazon’s share price hit $3,552.25. The forward price-to-earnings multiple on Apple’s stock went from 22 at the end of 2019 to 33 a year later. Netflix shares have continued to surge in recent weeks, off the back of record-breaking growth for the firm during lockdown, while Alphabet (Google’s parent company, the ‘G’ in FAANGs) closed 2020 trading at more than $1,750 per share.
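As a rough aside on the Apple figure: a forward price-to-earnings multiple is simply the share price divided by expected earnings per share for the year ahead, so a move from 22 to 33 is a 50% re-rating before any change in earnings forecasts is considered. The decomposition below is a minimal sketch using only the two multiples quoted above; the forward earnings estimates are not given here and are left symbolic.

% Price = forward multiple x forward earnings estimate.
% Only the multiples (22 and 33) come from the article; E_f is the (unstated) forward earnings estimate.
\[
P \;=\; \frac{P}{E_f}\times E_f
\qquad\Longrightarrow\qquad
\frac{P_{2020}}{P_{2019}} \;=\; \frac{33}{22}\times\frac{E_{f,2020}}{E_{f,2019}} \;=\; 1.5\times\frac{E_{f,2020}}{E_{f,2019}}
\]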

All of this has been good news for the plethora of ESG funds that have heavy exposure to tech. Back in August, S&P Global Market Intelligence analysed 17 ESG-labelled funds, including products from Nuveen, Calvert and BlackRock, and found that 14 had outperformed the S&P 500 in the first months of the pandemic – a result it attributed partly to the fact that at least 20% of each fund comprised tech stocks.

But, while these companies often position themselves as leaders on climate change – having some of the most ambitious Net Zero pledges and channelling money into low-carbon technologies – they also pose some of the biggest ESG risks. From allegations of breaking labour laws and tax scandals to sexual harassment cases and accusations of skewing national elections, the tech giants are constantly accused of doing the wrong thing by society.

‘Tech companies bring a lot of positives, but then again, so do fossil fuel companies, which ESG funds wouldn’t be allowed to have this kind of exposure to’

“The reason they are a problem from an ESG perspective is exactly the same reason they’ve made us so much money,” says one investor at a North American asset manager, who asked not to be named. “Tech companies bring a lot of positives, but then again, so do fossil fuel companies, which ESG funds wouldn’t be allowed to have this kind of exposure to.”

So how are responsible investors striking the balance between reaping the profits of this booming industry and managing its ESG risks?

Many investors complain that tech companies are particularly hard to gain access to when it comes to shareholder engagement. “It’s ironic that a company like Google, which is all about improving the flow of information, is so hard to have a conversation with,” said one investor, while another high-profile engagement specialist told RI that tech was “definitely a harder sector for us to engage with” than others. 

But some efforts are bearing fruit – especially collaborative ones.  

In 2019, two mosques in the city of Christchurch, New Zealand, were the target of terrorist attacks that killed 51 people. The shootings, carried out by a white supremacist, were streamed and distributed across social media platforms, prompting the New Zealand Super Fund (NZSF) to set up a shareholder engagement campaign with fellow state-backed investors the Accident Compensation Corporation, the Government Superannuation Fund, National Provident Fund and Kiwi Wealth. The initiative targeted Facebook, Twitter and Alphabet (the latter owns YouTube), and attracted support from more than 100 global institutional investors, including major players like Aviva, HSBC, Nomura and Northern Trust.

Subsequently, in a 2020 letter to the executives of the three firms, the investors expressed their dissatisfaction with the companies’ response to the footage being posted and shared on their platforms, and urged them “to do more to protect the public from similar events in the future”. The letter called for better senior-level governance and accountability, and for more resources to ensure social media platforms cannot be used to promote objectionable content such as footage of the Christchurch terrorist attack.

In response to the coalition’s engagement – which included shareholder proposals – Facebook, Twitter and Alphabet have all strengthened controls to prevent live-streaming and dissemination of objectionable content (although NZSF’s 2020 report points out that “as the additional mass shootings across the world have shown, the platforms are still open to abuse”). At the end of last year, facing wider pressure, Facebook also strengthened its Audit and Risk Oversight Committee to explicitly include the review of content-related risks that violate its policies.

“All the companies are taking encouraging steps to efficiently assess content and to remove objectionable content from their platforms,” reflects Katie Beith, a Senior Investment Strategist within the Responsible Investment arm of the NZSF. “Technology is developing rapidly and with the help of AI the companies appear more effective at capturing contextual content such as hate speech.” 

She says that the pension fund is now “assessing whether all the incremental changes we’re seeing at technology platforms since Christchurch are sufficient to solve the problem, or whether it is something fundamental that we need to go back to basics on”. 

“It is really difficult to assess whether the changes are strong enough in practice,” she admits. “You could even argue they’re not, with what’s happened in the [US] capital,” she adds, referring to the recent riots at the US Capitol.

Masja Zandbergen-Albers, Head of Sustainability Integration at Robeco and a member of the coalition, wrote an investment note last month, stating: “Recent events in the US have shown that the use of artificial intelligence can bring with it negative social issues. People who are interested in one topic, such as ‘the US elections were stolen’, will only see information confirming this due to the algorithms used by social media companies. They are not exposed to other facts and opinions – and this can be detrimental”.

As well as facing pressure from the NZSF group, Facebook has received a shareholder resolution from US shareholder advocacy group As You Sow, asking it to remove political ads containing “lies and mistruths” – as well as images of child pornography and torture – and ban accounts associated with such content.

As You Sow’s CEO, Andrew Behar, notes that Facebook committed to introducing an algorithm that stopped the spread of hate speech on the platform ahead of the 2020 US presidential election, but that this appears to have been a temporary measure, prompting As You Sow to file a proposal for the 2021 AGM asking why it isn’t permanent.

In Europe, Sweden’s Council of Ethics – the body that advises the country’s public pension funds on ethical considerations – published a set of expectations on tech companies and human rights at the end of last year. The demands, which call on big firms in the industry to align their practices to the UN Guiding Principles on Business and Human Rights, cover a wide range of topics including discrimination related to algorithmic biases and the commercialisation and use of data. 

“We need to build a responsible tech community, not only for today and tomorrow, but for the rest of mankind’s history – that’s the key thing,” explains John Howchin, Secretary General of the Council. “And if it's done in 2021, or if the foundation is looking better in 2025, that’s okay, but we need to start having that discussion. And I think the responsible investment community is going to pick up on that.” 

The expectations are now being adopted by big names like APG, AXA Investment Managers, the Church of England Pension Fund and Legal & General Investment Management to kick off engagement programmes with some of the tech heavyweights.

Earlier this month, Swedish investors Folksam and Öhman Fonder convened more than 70 peers, managing some $6trn, to submit a letter to Amazon demanding it cease “all anti-union communications, including public statements, captive audience meetings, texts, websites, on-site billboards, and any other form of contact with workers regarding their freedom of association”. The action follows a report by the Washington Post that the firm is persistently deploying anti-union tactics against workers trying to form a union in Bessemer, Alabama.

The signatories are demanding that Amazon implement its commitment to the International Labour Organisation’s Declaration on Fundamental Principles and Rights at Work and the UN Universal Declaration of Human Rights.

And it isn’t just investors tightening the thumbscrews. In December, EU regulators unveiled two proposed legislative initiatives, the Digital Services Act and the Digital Markets Act, which would impose stronger obligations on digital services regarding how they tackle illegal and harmful content, their use of customers’ data, and the liability of online providers for third-party content. Large fines are threatened for non-compliance.

In the US, social media companies have been protected by the controversial Communications Decency Act, Section 230 of which states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”. In other words, Twitter, Facebook, YouTube and the like are not responsible for the content they host, and cannot be punished for it.

“We have the laws in place, but for some reason these companies are just allowed to operate outside of them,” says As You Sow’s Behar. “If they all had to follow the political advertising laws that [television] networks follow, we would be in a much better place. So step one is to redefine Facebook, Google, Twitter, et al as communications companies that fall under the same legal statutes as CBS, NBC, ABC.”

Donald Trump was a vocal critic of Section 230 during his US presidency, but the hostility towards the rule extends across the political aisle: President Joe Biden said before he was elected that he would consider revoking it too. Indeed, last week Congressional Democrats began discussions with the White House on ways to crack down on Big Tech, including how to deal with the contentious Section 230, which is now 25 years old. If greater accountability regarding content distribution is forced upon social media giants, it could open the floodgates for lawsuits – in a way that could prove financially material for the firms and their shareholders. 

“We expect regulation, it’s inevitable,” says Beith, acknowledging that currently social media firms “often fall between pieces of regulation”. As well as transparency and accountability, she wants rules to be widened to address the human rights role of such companies. But the ball is rolling, she notes. “We are seeing a swell of often-competing laws and regulations which the companies need to navigate.”