Last week, after what must have been a series of extremely grim meetings in Menlo Park, Facebook admitted publicly that part of its revenue includes what appears to be politically motivated fraud undertaken by a shady Russian company. The social network, perhaps motivated by a Washington Post scoop on the matter, released a statement outlining the issues at hand, but leaving the most important questions unanswered. Only Facebook knows these answers, and we should assume they won’t be eager to volunteer them.

After last week’s reports, Facebook received a round of emails and calls from reporters asking for clarifications on the many glaring gaps in the social network’s disclosure:

  • What was the content of the Russian-backed ads in question?
  • How many people saw these ads? How many people clicked them?
  • What were the Facebook pages associated with the ads? How many members did they have?
  • What specific targeting criteria (race, age, and most importantly, location) did the Russian advertisers choose?

Given that Facebook reaches a little under 30 percent of the entire population of our planet, the answers to these questions matter.

The response I received from Facebook PR (“We are not commenting beyond the blog post at this time”) is typical.

But even when Facebook does decide to talk to journalists, it has the tenor of an occult priest discussing something from beyond an eerie void: Just last week, faced with a report that its advertising tools promised an American audience that, in certain demographics, well exceeded the number of such people the U.S. Census Bureau says actually exist, Facebook told the Wall Street Journal that its numbers “are not designed to match population or census estimates. We are always working to improve our estimates.” Facebook’s dealings with the public, apparently, need not adhere to the so-called norms of so-called reality.

But beyond the technology sector’s penchant for PR non-statements and Facebook’s understandable reluctance to further insinuate itself into the political controversy of the decade is the fact that complete inscrutability is just the way the company works. Facebook, even more than Apple, which has taken corporate secrecy to quasi-military lengths, operates as a black box. No one outside the company knows exactly how the site’s algorithms, by which the media and advertising industries now live and die, function. Academics and other researchers must go on what can be learned of Facebook merely by using Facebook. “Many of us wish we could study Facebook,” professor Philip Howard of the Oxford Internet Institute told The Guardian in May, “but we can’t, because they really don’t share anything.” When details of how Facebook suppresses certain content and amplifies other information leak out, they become a media sensation.

At the same time as it operates in near-total opacity, Facebook trumpets just how well the black box works; its advertising case study library is ample, including stories boasting of how Facebook can swing political elections. Crucially, Facebook never makes it clear exactly how it will help you win an election (or sell more fried chicken, or bracelets, or subscriptions). It just does. This magical efficacy, the company’s apparently unparalleled power to make people look at and maybe even click on things, is helping Facebook reach quarter after quarter of mammoth profits and swallow larger and larger chunks of advertising and media around the world. This is disturbing enough if you happen to work at a newspaper, but it ought to be something beyond disturbing if you happen to be a citizen of any country in the world.

Facebook prides itself on its ability to create successful influence campaigns at the same time that a foreign influence campaign against the U.S. is pretty much the only story anyone cares about (and lest we forget, just last November, Facebook founder and CEO Mark Zuckerberg wrote the whole thing off as “pretty crazy”). Today, we have reason to believe the two are connected. And while $100,000 is a pittance to Facebook, the bottom line is that even Facebook is willing to admit to the existence of bad-faith efforts to persuade its users by entities unknown and for purposes unknown. Shouldn’t the roughly 80 percent of Americans who use Facebook know as much as possible about how they’re being remotely manipulated?

It’s good and worthwhile that Congress and Special Counsel Robert Mueller are probing the company’s ability to influence political thought and action, and that Facebook staffers have briefed the House and Senate intelligence committees, but if this is truly a democratic crisis, it can’t be a closed-door crisis.

Zuckerberg should publicly testify under oath before Congress on his company’s capabilities to influence the political process, be it Russian meddling or anything else.

If the company is as powerful as it promises advertisers, it should be held accountable.

And if it’s not, then we need to stop fretting so much about it.

Either way, threats to entire societies should be reckoned with publicly by those very societies and not confined to R&D labs and closed-door briefings. If democracy can be gamed from a laptop, that shouldn’t be considered a trade secret.

Nearly 80 years ago, then-junior Sen. Harry Truman began a seven-year inquiry into American military waste, largely targeting the vast scale at which private corporations were duping and defrauding the public to enrich themselves. Military contractors aren’t known for their transparency or willingness to level with the public, so the truth had to be pried from them; the process saved an estimated $230 billion in today’s dollars. In 1994, tobacco executives were brought before Congress to testify under oath as to the threat their ubiquitous social product posed to the public.

David Carroll, a professor of media design at the New School and director of its Design and Technology MFA program, has been a vocal critic of Facebook’s business practices through the 2016 election. Carroll told The Intercept he believes “the public needs to hear Zuckerberg respond to crucial questions in his voice and in his words as the custodian of most of the public’s personal data, social relationships, and media consumption.” In particular, Carroll thinks Zuckerberg ought to explain the fact that bad actors are good business: “How is he going to protect future elections while also fulfilling his fiduciary duty, which is going to involve ‘leaving money on the table’ — how is he going to do it?”

Ron Fein is the legal director of Free Speech For People, an advocacy group targeting corporate influence in democratic processes. Last year, FSFP filed a formal complaint against the Russian government and President Donald Trump with the Federal Election Commission over what it described as “violations of the federal campaign finance law prohibiting foreign nationals from spending money in U.S. elections, including through paid social media.” The group is not getting any less angry after last week’s Facebook news, and Fein is among those who’d like to see Zuckerberg testify before Congress. He told The Intercept:

In the past, advertising was all public. If the Russian government had bought advertising in an election in the 1980s or 90s, it would have been on television, radio, or newspaper ads that anyone could see. We would know what the ads said, and where they were shown.

Facebook ads are only visible to the users that the advertiser, using Facebook’s proprietary algorithm, chooses to target. We don’t know what the ads said, or who saw them. The public can’t understand the scope of the problem of foreign election interference if Facebook continues to insist on secrecy. We can’t fully protect ourselves from future election interference if the only information we have is what Facebook has chosen to share to this point. Right now, our laws prohibit foreign nationals from spending money to influence U.S. elections, but that is difficult to prove if the only evidence is what Facebook chooses to disclose.

Mark Zuckerberg’s open testimony could be very helpful in helping Congress and the public understand what happened in the 2016 campaign, and what risks we face in the 2018 and 2020 elections.

Daniel Kreiss, a professor at the University of North Carolina who studies the effect of technology on political communications and practices, is one of the many researchers around the world frustrated by Facebook’s unwillingness to explain how the platform may or may not affect over 2 billion people, many of them potential voters. Kreiss admitted he’s unsure “whether there’s the political will to bring Zuckerberg in and make him answer questions about Facebook’s role in the election,” but said doing so could answer long-overdue questions. “I just find it stunning that we’re learning about these ad buys 10 months after the election is over as opposed to when they would’ve been consequential,” he said. But, Kreiss says, we also can’t be certain the ads were ever consequential: Facebook’s crucial advertising and algorithmic data is “not public,” meaning “academics can’t go in and really clearly analyze the effects of advertising, let’s say, on the members of the electorate, in a rigorous way. … That data isn’t independently verifiable, therefore it’s just what fuels speculation. The point is we can’t really know.” And that’s by design.

It’s reassuring that Facebook is cooperating with the ongoing Russia-related probes. But this is bigger than Russia, bigger than Hillary Clinton, and bigger than 2016. Should Facebook continue to simply allude to its ominous potential rather than share it in full, there’s only one good option left: Bring in Mark Zuckerberg and have him sworn in live on C-SPAN. No spokespeople required.