Big Tech Faces Big Challenge With Supreme Court

One can only hope that the latest court battle may finally swing against Big Tech, considering that the massive companies seem to get away with … well, literally everything.

They have just admitted to engaging in a mass censorship campaign that undoubtedly influenced some voters in the 2020 election, yet they apparently will face no culpability whatsoever despite their clear interference with democratic processes.

Moreover, their platforms oftentimes serve as a conduit for various militant organizations, from the Democrat-endorsed BLM to a broad array of terrorist groups that have exploited social media for radicalization.

And, predictably, every single tech giant that has come under fire has immediately thrown up the Section 230 defense, which contains the following provision: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

In other words, neither Google nor Facebook nor any other left-leaning tech company is apparently responsible for effectively supporting terrorists by letting them operate unchecked.

Meanwhile, a former American president remains banned.

However, Big Tech’s arrogance may actually be tested for once, depending upon the outcome of a few key Supreme Court cases. As reported by The Hill, the family of a victim of terrorism has filed suit against Google for not only permitting ISIS videos to flourish but also allowing its own algorithms to recommend those videos to already radicalized individuals.

“The family of Nohemi Gonzalez, a 23-year-old U.S. citizen killed during a 2015 series of Islamic State terror attacks in Paris, sued YouTube parent company Google, arguing the video sharing site not only provided a platform for videos containing terrorist content, but also recommended the videos to users,” the outlet reported.

Indeed, the Reynaldo Gonzalez, et al. v. Google LLC case should be quite interesting, considering the significant legal question that it poses.

The petition filed with the Supreme Court frames the issue at stake as whether companies can be held liable for the havoc their algorithms can wreak upon general stability and social safety.

“Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?” the petition inquired.

Does it indeed? One thing is for sure: If Big Tech were 10 percent as zealous about restricting actual terrorist content as it was about censoring Trump, then perhaps some terrorist recruitment efforts would not have been nearly as successful.

“[O]ver the last two decades, many interactive computer services have in a variety of ways sought to recommend to users that they view particular other-party materials, such as written matter or videos,” the petition continued. “[T]hose recommendations are implemented through automated algorithms, which select the specific material to be recommended to a particular user based on information about that user that is known to the interactive computer service.”

And, lo and behold, some “recommended” videos turned out to be terrorist promotional videos launched by none other than ISIS.

“The public has only recently begun to understand the enormous prevalence and increasing sophistication of these algorithm-based recommendation practices,” the petition added.

No kidding. Not to mention the degree to which such algorithms demonstrate obvious political bias.

The petition also notes questionable rulings in other cases, such as Dyroff v. Ultimate Software Group, Inc.

“The nature of the panel decision below presents this Court with an extraordinary situation. A majority of the panel, regarding itself bound by Ninth Circuit precedent in Dyroff, held that section 230 immunizes an interactive computer service from liability for recommending other-party content, in this case for recommending ISIS proselytizing and recruitment videos, at least so long as the provider is dispensing recommendations even-handedly to terrorists and non-terrorists alike,” the petition noted.

Such “immunity” is ridiculous considering the damage that can occur to society as a whole.

“[W]hether section 230 applies to these algorithm-generated recommendations is of enormous practical importance,” the petition continued.

Also of “enormous practical importance” is what Big Tech considers to be a real danger to the public.

If Trump is a so-called “danger to the public,” one that must be banned from social media platforms, what exactly is ISIS?

“To be sure, many of those recommendations, such as of a clever TikTok dance, are benign. But other recommendations suggest that users look at materials inciting dangerous, criminal, or self-destructive behavior,” the petition continued.

Yep. Too bad the FBI and the other so-called intelligence agencies appear more fixated on January 6 than on actual terrorist groups.

“In this case, the defendants are alleged to have recommended that users view inflammatory videos created by ISIS, videos which played a key role in recruiting fighters to join ISIS in its subjugation of a large area of the Middle East, and to commit terrorist acts in their home countries,” the petition added.

Something suggests that this type of behavior is a whole lot more “destructive” than daring to hold a different political opinion or (gasp!) exercising one’s right to free speech.

A different petition, Twitter, Inc. v. Taamneh, raises a similar issue: the degree to which Big Tech should be held responsible for the potential havoc brought about by its algorithms.

It will be quite interesting to see if Big Tech’s power is finally reined in … at least to a degree.

Author: Ofelia Thornton

