Social networks face legal threats, censorship

By Jacob Vaughn
Contributing Writer

Photo Illustration by Eriana Ruiz
Social media companies are increasingly being held accountable in the public eye for online terrorist activity.

The increasing threat of terrorism has the masses surrendering their rights in an aimless search for safety and security.

A new German law, the Network Enforcement Act, or NetzDG, and recent lawsuits in the U.S. could set a precedent for social media censorship for years to come.

Facebook, Twitter and Google were sued in recent months over the alleged roles they played in acts of terrorism around the world, including the San Bernardino, Dallas and Paris attacks.

SOCIAL MEDIA SUED

According to HuffPost, families of victims of the December 2015 terrorist attack in San Bernardino, California, filed a lawsuit against the three tech companies for knowingly providing “material support” to terrorist groups, which enabled them to carry out attacks.

Another lawsuit claims the social media companies created a hostile atmosphere that led to a gunman shooting and killing five Dallas-area police officers in July 2016, according to Fox 4.

Families of Americans killed by the Islamic State group in the Paris attacks sued Twitter under the Anti-Terrorism Act, alleging the company failed to keep terrorist organizations off its platform, according to Business Insider.

“The tech companies have made significant progress on this issue, but we need to go further and faster to reduce the time it takes to remove terrorist content online, and to increase significantly their efforts to stop it from being uploaded in the first place,” U.K. Prime Minister Theresa May said during the U.N. General Assembly Sept. 21.

Hazel Carlos, a Brookhaven College English professor, said she believes social media corporations have a social responsibility to keep harmful content off their platforms. “I still think, however, that it’s more than punishing the person who owns the platform,” Carlos said. “It’s going after the perpetrators, those who misuse the system.”

The issue of “fake news” and racist content was prevalent in Germany ahead of its national elections. Some politicians worried such content might sway public opinion. On June 30, the German Parliament approved NetzDG, which will fine social networks up to 50 million euros if they fail to remove illegal hate speech within 24 hours, according to the BBC.

CONTENT MODERATION

Miguel Teran, a student, said he does not agree with NetzDG. “Social media is merely the platform. It’s the sources that create fake news that should be held responsible,” Teran said. “It is our responsibility as consumers to think for ourselves and take time to verify whether the information is accurate or not.”

The increased emphasis on the threat of extremism online requires increased moderation of content on internet platforms. In a statement regarding the bill, a spokesperson for Facebook said the company will employ 700 staff members in Berlin for content moderation through a partner company, Arvato, by the end of the year, according to Reuters. This mirrors Mark Zuckerberg’s recent announcement of 3,000 new jobs to help improve Facebook’s content review process in the U.S., according to Forbes.

FREE SPEECH ONLINE

Though Facebook has declined to comment on whether employees or contractors would moderate content, a spokesperson for the company said the problem with the law is that it would allow private companies, rather than the courts, to judge the illegality of content.

The recent moves by France, Germany and the U.K., as well as the lawsuits in the U.S., could, if successful, affect free speech online for years.

Bernhard Rohleder, managing director of Bitkom, a German digital industry association, expressed concerns about the bill.

“Given the short deadlines and the severe penalties, providers will be forced to delete doubtful statements as a precaution,” Rohleder said. “That would have a serious impact on free speech on the internet.”

According to Reuters, Rohleder also said the government should operate specialist teams to monitor internet content for potential infringements, rather than expect social networks to do it themselves.
