This change, combined with the algorithms that limit the distribution of certain content people post, has made Facebook a powerful gatekeeper that can decide which stories go viral and which remain virtually unknown. Facebook also poses a danger to free speech by policing and censoring what people post: content deemed ‘too politically incorrect’ is automatically deleted, and users may have their accounts shut down entirely.
Most news websites now rely on Facebook for the majority of their traffic, which comes from users posting links to their articles. An Internet analytics firm found that Facebook was responsible for driving 43% of web traffic to over 400 major sites in 2016. 400
According to the study, Facebook was responsible for 20% of all traffic to news sites in 2014, and in just two years that figure more than doubled. People had become accustomed to scrolling through their Facebook feeds to see what articles their friends had posted, and many were now ‘following’ news websites on Facebook instead of bookmarking them in their browsers and visiting them directly. 401
CEO Mark Zuckerberg has said one of his goals is, “To build the perfect personalized newspaper for every person in the world.” 402 Facebook even began hosting articles from major publishers so users who clicked on a link wouldn’t leave the Facebook ecosystem and could view the content within Facebook’s app. 403
The company wants to be the primary hub of the Internet, bypassing search engines and web browsers altogether. 404 Those who were using the Internet in the late 1990s and early 2000s will recall most companies ending their commercials by encouraging people to visit their websites, but those calls to action have since been replaced by invitations to follow the companies on Facebook instead, making Mark Zuckerberg one of the most powerful (and unnecessary) middlemen in the history of the Internet.
As the 2016 election approached, many media analysts and tech bloggers began to realize that, with so many people relying on Facebook as their primary news aggregator, the site could leverage its power to influence the election. New York Magazine published an article which asked, “Could Facebook help prevent President Trump?” and went on to say, “Not through lobbying or donations or political action committees, but simply by exploiting the enormous reach and power of its core products? Could Facebook, a private corporation with over a billion active users, swing an election just by adjusting its News Feed?” 405
Paul Brewer, a communications professor at the University of Delaware, said, “Facebook would, like any campaign, want to encourage turnout among the supporters of its preferred candidate, persuade the small number of genuinely uncommitted likely voters, and target apathetic voters who could be convinced to get out to the polls.” 406
Josh Wright, the executive director of a behavioral science lab, also admitted, “There’s lots of opportunity, I think, to manipulate based on what they know about people.” 407 Wright pointed out how the site could fill people’s news feeds with photos or stories showing a particular candidate engaged in activities that Facebook knows they like in order to use “in-group psychology” to get people to identify with a candidate who shares some of their interests.
We tend to judge someone by what other people we like are saying about them, and so Facebook could highlight statements made by celebrities that people follow, or even our own friends, about a candidate in order to influence our opinion of that person. If you think Facebook wouldn’t engage in this kind of personalized high-tech manipulation, you would be wrong, because they already have.
A secret study Facebook conducted during the 2010 midterm elections, with help from researchers at the University of California, San Diego, investigated what’s called social contagion, the phenomenon by which behaviors or emotions spread from person to person. Facebook included over 60 million of its users in the experiment and found that it could get people to actually go out and vote by showing them that their friends had voted, which in turn influenced others to vote as well. “Our study suggests that social influence may be the best way to increase voter turnout,” said James Fowler, a UCSD political science professor who conducted the study. “Just as importantly, we show that what happens online matters a lot for the ‘real world.’” 408 Their experiment increased voter turnout by 340,000 people. 409
Facebook obviously has a political agenda. The company has hosted a Q & A for Barack Obama, 410 hung a huge Black Lives Matter banner at its headquarters, 411 and Mark Zuckerberg has been very outspoken in his support of illegal immigration, 412 gay marriage, 413 and other liberal causes. The company conducts internal polls in which employees submit questions and vote on them in hopes of getting Zuckerberg to answer, and one poll in March of 2016 showed that a number of employees had asked whether the company should be used to help prevent Donald Trump from winning the election. 414
UCLA law professor Eugene Volokh told Gizmodo, “Facebook can promote or block any material that it wants. Facebook has the same First Amendment right as the New York Times. They can completely block Trump if they want. They can block him or promote him.” 415 Technically, the First Amendment only prevents the U.S. government, not a private corporation, from suppressing someone’s speech.
Gizmodo’s report on the political bias of Facebook pointed out, “Most people don’t see Facebook as a media company — an outlet designed to inform us. It doesn’t look like a newspaper, magazine, or news website. But if Facebook decides to tamper with its algorithm — altering what we see — it’s akin to an editor deciding what to run big with on the front page, or what to take a stand on.” 416 Whether they are legally allowed to do such a thing is one issue; whether such favoritism and censorship is deceptive and immoral is another.
“If Facebook decided to,” professor Volokh says, “it could gradually remove any pro-Trump stories or media off its site — devastating for a campaign that runs on memes and publicity. Facebook wouldn’t have to disclose it was doing this, and would be protected by the First Amendment.” 417
“If Facebook was actively coordinating with the Sanders or Clinton campaign, and suppressing Donald Trump news, it would turn an independent expenditure (protected by the First Amendment) into a campaign contribution because it would be coordinated — and that could be restricted,” he said. “But if they’re just saying, ‘We don’t want Trump material on our site,’ they have every right to do that. It’s protected by the First Amendment.” 418
Censorship of Trending Topics
In May of 2016, tech blog Gizmodo confirmed what many had suspected and what was obvious to those with common sense — that Facebook was systematically suppressing news stories from conservative outlets and those which presented a positive conservative message. 419 “Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential ‘trending’ news section, according to a former journalist who worked on the project,” reported Gizmodo. 420
The whistleblower revealed that the company suppressed stories about CPAC (the Conservative Political Action Conference), Mitt Romney, Rand Paul, and other topics from the trending module, even though they would have appeared there organically from so many people posting about them.
It wasn’t just one whistleblower, but several, and they also revealed that employees would manually insert topics into the trending list that they wanted to get more attention. One former employee said that positive stories about Black Lives Matter were often inserted into the trending box to help them go viral when they didn’t organically trend from people posting about them. 421