Foreign Interference Persists And Techniques Are Evolving, Big Tech Tells Hill

Facebook's Like logo is shown on a sign at the company's headquarters in Menlo Park, Calif. (Jeff Chiu/AP)

Foreign influence-mongers are altering their tactics in response to changes in the practices of the big social media platforms since the 2016 election, three Big Tech representatives told House Democrats on Thursday.

Leaders from Facebook, Twitter and Google told the House Intelligence Committee that their practices have prompted hostile nations to make some of their information operations less clandestine and more overt than they have been in recent years.

"It reminds us the challenges we faced in 2016 are constant and this is an evolving security challenge and we have to keep constantly watching out for bad actors who change their behavior," said Nicholas Pickles, Twitter's director of Public Policy Strategy and Development.

He and his colleagues told Democrats in a videoconference that they are seeing fewer coordinated efforts to use social media to spread false information or inflame tensions among Americans, and more of that work being done via official government channels — some of which, however, have a presence on social media.

The tech companies say they're willing to label such material and, in some cases, demote it or remove it. That is where much of the war has moved, they said, although covert work continues too.

Facebook has taken down nearly 2 billion fake accounts this year alone, and it has dismantled 18 coordinated influence networks, said Nathaniel Gleicher, its head of security policy. They included three Russian and two Iranian networks and two based in the United States.

"We're proud of the progress we've made to protect authentic discourse on our platforms but there's obviously more work to do," he said.

The witnesses told members of Congress they so far haven't seen foreign actors using the old covert techniques to exploit divisions in the U.S. over race and law enforcement or target voting or the presidential election.

Instead, those references are being made semi-officially or officially, through state media and government accounts.

"Those media entities and those government accounts are engaging in the geopolitical conversation; for example, Chinese actors comparing the police response in the U.S. with the police response in Hong Kong," Pickles said. "That's a shift from platform manipulation over to state assets is something we've observed."

Americans support some limits

Americans are in favor of some limitations on what can be posted online, according to a new Gallup/Knight Foundation poll out this week. More than 80% of Americans think false health information and false information about voting and politics should be prohibited, for instance.

But it's also clear that Americans don't trust the companies to oversee that content themselves.

A wide majority, 84% of Americans, said they had little or no confidence in the social media companies' ability to make "the right decisions" about what can be posted on their platforms.

Mostly, people think the companies haven't been tough enough when it comes to removing content up to this point, with only about a fifth of Americans saying the companies are "too tough." Men, whites and less educated Americans are all more likely to say content oversight is "too tough."

Democrats only

No Republicans participated in the videoconference hearing on Thursday.

The majority members who did take part peppered the Big Tech witnesses with questions about individual issues, often growing exasperated by what they described as responses that lag behind the biggest problems rather than getting ahead of them.

Alabama Rep. Terri Sewell wanted to know why the platforms aren't doing more to quash what she called attempted voter suppression targeting black people.

Sewell cited a CNN story about a Russian election interference network operating out of Ghana — likely to avoid efforts to detect traffic originating from Europe — which resulted in Facebook and Twitter deactivating a number of false accounts.

Sewell implored the witnesses to do more to keep black Americans from being targeted and divided in the way they were during the interference in the 2016 election.

"My community has been the victim of misleading information about voting for decades," she said.

Other members of the committee exhorted the Big Tech witnesses to redouble their efforts to turn down the volume on partisanship in the United States; in this metaphor, the nation's high resting temperature means it is in constant danger of boiling over.

"We don't recognize each other," said Rep. Jim Himes of Connecticut. "If every American house is full of toxic gas, as I think it is today ... all it takes is a spark from Russia to set off a conflagration."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Miles Parks is a reporter on NPR's Washington Desk. He covers voting and elections, and also reports on breaking news.
Philip Ewing is an election security editor with NPR's Washington Desk. He helps oversee coverage of election security, voting, disinformation, active measures and other issues. Ewing joined the Washington Desk from his previous role as NPR's national security editor, in which he helped direct coverage of the military, intelligence community, counterterrorism, veterans and more. He came to NPR in 2015 from Politico, where he was a Pentagon correspondent and defense editor. Previously, he served as managing editor of Military.com, and before that he covered the U.S. Navy for the Military Times newspapers.