
How the Internet Fosters Hate Speech

Contrary to popular belief, there is no hate speech exception to the First Amendment.

The social networking site Gab has been taken offline since it was confirmed that the Pittsburgh synagogue gunman used it to post anti-Semitic hate speech and to threaten Jews. The site is popular with the far right and describes itself as "an ad-free social network for creators who believe in free speech, individual liberty, and the free flow of information online. All are welcome." Gab was originally created by conservative businessman Andrew Torba in response to Twitter clamping down on hate speech in 2016.

Robert Bowers logged onto the platform shortly before killing 11 people at the Tree of Life synagogue on Saturday to post a final message announcing his intentions.

Consequently, the site has been abandoned by payment processing firms PayPal and Stripe, as well as hosting service Joyent and domain registrar GoDaddy. A statement on Gab's website Monday read that the platform would be "inaccessible for a period of time" as it switches to a new web host. It said the issue was being worked on "around the clock." The statement went on to defend the website, saying, "We have been systematically no-platformed [and] smeared by the mainstream media for defending free expression and individual liberty for all people."

Regarding Bowers' use of the site, Torba wrote, "Because he was on Gab, law enforcement now have definitive evidence for a motive. They would not have had this evidence without Gab. We are proud to work with and support law enforcement in order to bring justice to this alleged terrorist."

But companies associated with Gab were not satisfied by the site's cooperation with law enforcement and continued to abandon it. PayPal, the platform Gab used to manage donations from users, said in a statement, "When a site is explicitly allowing the perpetuation of hate, violence or discriminatory intolerance, we take immediate and decisive action."

A tweet from Gab on Monday morning implied that the people behind the site believe themselves to be victims of intentional defamation.

Set aside the questionable intent of the decidedly tone-deaf tweet: legally, Gab did not do anything wrong. Contrary to popular belief, there is no hate speech exception to the First Amendment. The Supreme Court reaffirmed this in 2017 in Matal v. Tam, writing, "Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability, or any other similar ground is hateful...the proudest boast of our free speech jurisprudence is that we protect the freedom to express 'the thought that we hate.'" Despite this, many people are calling for the permanent removal of the site; as Wired points out, "Momentary political rage can blind people into abandoning sacred values."

However, the internet inarguably contributes to the creation of extremists, as we have seen in the case of terrorists, rapists, school shooters, and now the synagogue shooter in Pittsburgh. Sites like Gab allow users to easily find other people who share their most extreme viewpoints, inevitably normalizing disturbing rhetoric the user may have otherwise suppressed or self-corrected in time. Therefore, sites like Gab become polarizing spaces that can help to sow the kinds of ideas that lead to violent acts. But, if there's no legal action to be taken against a site like Gab without damaging free speech, what can be done?


Justice Anthony Kennedy wrote in his concurring opinion in Matal v. Tam, "A law that can be directed against speech found offensive to some portion of the public can be turned against minority and dissenting views to the detriment of all. The First Amendment does not entrust that power to the government's benevolence. Instead, our reliance must be on the substantial safeguards of free and open discussion in a democratic society."

While what exactly those safeguards are remains unclear, one can speculate that what Kennedy meant is exactly what Gab is now calling unjust. As previously mentioned, the site has been abandoned by all of the companies whose services were needed for the site to remain online. And just as Gab has the right to allow freedom of expression on its site as it sees fit, these companies are also free to express themselves by refusing to work with websites that allow hateful rhetoric.

Indeed, the conversation surrounding the fate of Gab has revealed that freedom of speech online is not decided by the government, but by social media platforms, servers, and domain registrars, which are free to decide what kinds of opinions they want their companies associated with. This also means that, on some level, what is seen as acceptable online is driven by consumer outrage and approval.


For example, after facing criticism for allowing users to post prejudiced content, larger social networking sites like Twitter and Facebook have been actively fighting against hateful rhetoric with varying degrees of success. In 2016, a code of conduct was established by the European Union in collaboration with Facebook, Twitter, YouTube, and Microsoft. The code is aimed at fighting racism and xenophobia and encourages the social media companies to remove hate speech from their platforms.

So, instead of outraged Americans calling for the legal suppression of sites like Gab — an impossibility if the First Amendment is to remain intact — the real power of the individual to fight hate speech is in one's ability to support or boycott companies based on how they handle free expression.


Brooke Ivey Johnson is a Brooklyn-based writer, playwright, and human woman. To read more of her work, visit her blog or follow her on Twitter @BrookeIJohnson.

Myanmar Military Used Facebook to Incite Genocide, Ethnic Cleansing

700,000 Muslims were forced to flee to neighboring Bangladesh in 2017.

On Monday, Facebook said it removed 13 pages and 10 accounts controlled by the Myanmar military in connection with the Rohingya refugee crisis.

The accounts were masquerading as independent entertainment, beauty, and information pages, such as Burmese popstars, wounded war heroes, and "Young Female Teachers." Fake postings reached 1.35 million followers, spreading anti-Muslim messages to social media users across the Buddhist-majority country.

Facebook's move comes a year after 700,000 Rohingya, a Muslim minority group in Myanmar, were forced to flee to neighboring Bangladesh amid widely-documented acts of mob violence and rape perpetrated by Myanmar soldiers and Buddhist mobs. The United Nations Human Rights Council denounced the crisis as "a textbook case of ethnic cleansing and possibly even genocide."

Rohingya children rummaging through the ruins of a village market that was set on fire. (Reuters)

Last month, the social media giant announced a similar purge, removing Facebook and Instagram accounts followed by a whopping 12 million users. Senior General Min Aung Hlaing, commander-in-chief of the Myanmar armed forces, was banned from the platform, as was the military's Myawady television network.

Over the last few years, Facebook has been in the hot seat for its tendency to spread misinformation. In the 2016 U.S. presidential election, inauthentic Facebook accounts run by Russian hackers created 80,000 posts that reached 126 million Americans through liking, sharing, and following. The problem has persisted into the 2018 midterm elections, ahead of which Facebook removed 559 pages that broke the company's policies against spreading spam and coordinating influence efforts. Recent campaigns originating in Iran and Russia target not only the U.S., but also Latin America, the U.K., and the Middle East.

The situation in Myanmar is particularly troubling—it's not an effort by foreign powers to stoke hate and prejudice in a rival, but rather an authoritarian government using social media to control its own people. According to the New York Times, the military Facebook operation began several years ago with as many as 700 people working on the project.

Screen shots from the account of Myanmar Senior General Min Aung Hlaing, whose pages were removed in August. (Facebook)

Claiming to show evidence of conflict in Myanmar's Rakhine State in the 1940s, the images are in fact from Bangladesh's 1971 war for independence from Pakistan. (Facebook)


Fake pages of pop stars and national heroes would be used to distribute shocking photos, false stories, and provocative posts aimed at the country's Muslim population. They often posted photos of corpses from made-up massacres committed by the Rohingya, or spread rumors about people who were potential threats to the government, such as Nobel laureate Daw Aung San Suu Kyi, to hurt their credibility. On the anniversary of the September 11, 2001 attacks, fake news sites and celebrity fan pages sent warnings through Facebook Messenger to both Muslim and Buddhist groups that an attack from the other side was impending.

Facebook admitted to being "too slow to prevent misinformation and hate" on its sites. To prevent misuse in the future, they plan on investing heavily in artificial intelligence to proactively flag abusive posts, making reporting tools easier and more intuitive for users, and continuing education campaigns in Myanmar to introduce tips on recognizing false news.

The company called the work it is doing to identify and remove the misleading network of accounts in the country "some of the most important work being done [here]."

Joshua Smalley is a New York-based writer, editor, and playwright. Find Josh at his website and on Twitter: @smalleywrites.

Big Data and Our Elections

Sites like Facebook will have more and more influence over our elections in the future.

America's favorite uncorroborated news story of the moment is that the Russian government masterminded Trump's rise to power. It's easy to understand why. Introspection after a loss is difficult, and rather than face themselves, the DNC decided to hold a seance, evoking a Cold War ghost to explain their defeat. It's somewhat comforting to assume an international conspiracy was behind Hillary Clinton's failure in the 2016 election. It absolves the DNC of any responsibility to change their conduct or adjust their political strategy. That said, there is no hard evidence of collusion, but rather a string of awkward encounters by Trump's largely inexperienced, and frankly stupid, staff. The meat of Russia's "interference" came in the form of social media bots, fake accounts that would automatically repost sensationalist headlines to drum up support for Trump. These accounts are pretty easy to spot, however, as they don't even come close to passing a Turing test.


The Real Reason Millennials Get a Bad Rap

Even though some Millennials are almost forty, people are still bashing them.

Last year the New York Post ran an article about Millennials making up the largest portion of the American workforce, ignoring a glaringly obvious point: of course 22- to 37-year-olds are the largest portion of the labor market; they're adults. In an effort to make a distinctly un-newsworthy article newsworthy, the Post settled on an old trope: pick on the Millennials. For its part, this article wasn't as bad as most. The author refrained from using words like "entitled" and "coddled" and "irresponsible," but there's still a certain connotation attached to the term Millennial, particularly in the way it pertains to work ethic and maturity. Repudiating a stereotype often doesn't have the desired effect; in fact, it has a tendency to validate the stereotyper.* That said, my editor's asked me to dissect the maelstrom of insults and unfair generalizations that surround my generation, so here it goes.
