Bloomberg — By Saijel Kishan and Jeff Green
An Apple Inc. shareholder that has long chided the company for not doing enough to fight the spread of photos depicting sexual violence against young people is now upping pressure on the iPhone maker to crack down on videos too.

Money manager Jeff McCroy, who runs Christian Brothers Investment Services — one of the oldest socially responsible investment firms — has pressed Apple, AT&T Inc. and Verizon Communications Inc. over the last several years to block child pornography more effectively. Reports of videos containing child sexual abuse material surpassed reports of still images for the first time in 2019, and videos accounted for nearly half of the 65 million files flagged last year, according to the National Center for Missing and Exploited Children. McCroy’s firm, which owned $271 million in Apple shares and debt as of Aug. 31, wrote to Apple last month in a letter that hasn’t previously been made public, encouraging it to take a tougher stance on videos.
“The technology sector as a whole has been slow to address this important issue,” McCroy said in an interview. Abusive imagery in videos has the potential to spread rapidly across the internet, so companies should share artificial intelligence technologies and other tools with one another to help address the issue, he said.
The role of investors in curbing child exploitation has growing potential now that shareholders and regulators have shown more interest in holding companies accountable for their environmental and social claims, said Yiota Souras, general counsel for NCMEC. Tying the issue to earnings and reputational risk makes it more likely that shareholders will support proxy proposals, she said. “It’s not just a feel-good ‘We should think about the children’ sort of perspective,” Souras said. “It’s investors realizing these things are liabilities for companies.”

Apple tried to meet demand from lawmakers to clamp down on the spread of child sexual abuse material, or CSAM, only to get shot down by privacy advocates. The tech giant announced in August a trio of new tools designed to help fight CSAM. One feature, designed to scan a user’s iCloud photo library for explicit images of children in the NCMEC database and report them to relevant authorities, drew backlash from privacy advocates, who feared it could set the stage for other forms of tracking. Apple shelved the plan earlier this month and said it would take time to “collect input and make improvements” before attempting to roll out the system. Some privacy advocates are urging the company to scrap the plan altogether.
“It’s disappointing to learn that Apple is delaying their efforts for change,” CBIS’s McCroy said. “The longer it takes for action, the more children that are at risk of exposure and harm. We hope that Apple will expedite these planned improvements and take action sooner than later.”
NCMEC has a carefully vetted database of about 5 million known child-exploitation images, roughly 500,000 of which are videos, Souras said. Known CSAM has been tagged using a digital signature system since 2001, and in 2009 advocates added technology called PhotoDNA to allow matching of known images with those being shared on the internet. The number of known CSAM images has increased by more than 200% since 2019, and videos make up a growing share of new complaints, Souras said.

Investors seem particularly keen now to weigh in on social and environmental issues. In the past few years, they’ve brought a near-record number of such proposals to votes at public companies. Proposals that garner more than 30% of shareholder votes often prompt a company to meet with investors to try to allay concerns. The victory of the Engine No. 1 fund in replacing three directors on Exxon Mobil Corp.’s board over climate-change risks also demonstrated the potential pitfalls of ignoring shareholder activists.
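The matching technology described above relies on perceptual hashing: an image is reduced to a compact fingerprint that stays nearly identical when the file is re-encoded or resized, so it can be compared against a database of fingerprints of known material. PhotoDNA itself is proprietary, so the sketch below only illustrates the general idea with a simple "average hash"; the function names and the toy 8x8 grayscale input format are hypothetical, not part of any real screening system.

```python
# Illustrative sketch of perceptual-hash matching. This is NOT PhotoDNA,
# which is proprietary; it only demonstrates the underlying concept of
# comparing compact image fingerprints against a database of known hashes.

def average_hash(pixels):
    """Hash an 8x8 grayscale grid (64 ints, 0-255) into a 64-bit integer.
    Each bit is 1 if that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(candidate_hash, known_hashes, max_distance=5):
    """Flag a candidate if it is within a small Hamming distance of any
    hash in the known database -- tolerant of minor re-encoding noise."""
    return any(hamming(candidate_hash, k) <= max_distance for k in known_hashes)

# A slightly noisy copy of a known image still matches; an unrelated
# image does not.
known = average_hash([10] * 32 + [200] * 32)
print(matches_known(average_hash([12] * 32 + [198] * 32), [known]))  # True
print(matches_known(average_hash([200] * 32 + [10] * 32), [known]))  # False
```

The threshold (`max_distance` here) sets the trade-off between catching altered copies and avoiding false matches; production systems use far more robust hash designs than this toy example.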
Still, while investors seem willing to engage in some ESG issues, they have been largely reluctant to approve proxy proposals on child imagery, mostly due to concerns over risks to privacy. Fewer than one-fifth of Facebook Inc. shareholders supported a proposal in May from CBIS and other investors to ensure that new encryption policies don’t make it more difficult to identify the exchange of CSAM. A similar 2019 proposal drew 13% support. Facebook said in response to the proposals that it already has sufficient protections in place. For example, Facebook said its WhatsApp chat app bans about 300,000 accounts a month suspected of sharing child exploitative material using non-encrypted data.
The balance between protecting users’ privacy and detecting abuse remains a challenge, said NCMEC’s Souras. The focus should be on developing industry standards that run across all companies, since the pace of technology likely means new platforms will develop to supplant the current market leaders, she said. Calls by the U.S. Securities and Exchange Commission to develop transparency on environmental, social and governance criteria and specific areas such as human capital are a good starting point, she said. McCroy said his firm, which manages $10.7 billion from Chicago, is currently reviewing whether Apple and other companies, including AT&T and Verizon, are taking sufficient steps to combat the spread of child abuse content on videos. CBIS, which has been targeting Apple since 2016 and has engaged with the telecom companies more recently, plans to work with them to identify specific actions they can take to mitigate the problem, McCroy said.
AT&T and Verizon both said that they are cooperating with CBIS and have taken extensive measures to train employees and add safety protections to their systems to limit abuse. Apple declined to comment.
Other tech companies are also seeking ways to limit CSAM. Facebook has long had algorithms to detect such images uploaded to its social networks. Google’s YouTube analyzes videos on its service for explicit or abusive content involving children. Adobe Inc. has similar protections for its online services.
Facebook said it also flags groups that add inappropriate comments to non-pornographic images of children and limits interactions between children and non-related adults on apps such as Instagram.
When CBIS first broached the child pornography issue with Apple about five years ago, McCroy said he felt like he was the “only investor in the room.” That has changed as more investors such as the Sisters of St. Dominic of Caldwell, New Jersey, and School Sisters of Notre Dame Cooperative Investment Fund have engaged with Apple. Others, such as Proxy Impact, Maryknoll Sisters and members of The Interfaith Center on Corporate Responsibility have raised the issues with companies such as Facebook. McCroy said he would welcome more investors speaking up.
CBIS submitted a proxy proposal in 2018 asking Apple to publish a report assessing whether its practices were sufficient to prevent material impacts to the company’s finances, brand reputation, or product demand in light of public concern over sexual exploitation of children online. “It wasn’t clear to investors what they were doing,” McCroy said.
The firm withdrew the proposal after Apple outlined ongoing efforts. The following year, CBIS said it met with Apple management to further discuss areas for improvement, including setting goals and publicizing its efforts. The money manager said it had meetings with Apple’s director of global security investigations and child safety, as well as its law enforcement compliance and app development teams.
CBIS said it also urged Apple to attend a 2019 meeting at the Vatican on fighting child exploitation. Apple did attend the meeting, but had no comment on whether CBIS played a role in the decision.
McCroy said investors should look beyond technology companies to the finance, toy and gaming industries as well as domain providers to press firms on child pornography prevention. With a new proxy season approaching, he said he’s hoping even more investors will take the opportunity to encourage companies to do more.