The father of a woman killed by ISIS is asking the U.S. Supreme Court to take up a case concerning algorithms, terrorism, and free speech. Happy 17th anniversary to YouTube. Once known mostly for funny pet videos, the game-changing user-generated video platform has since been accused of playing a role in encouraging extremism. Evidence for the claim that social media sites are radicalizing American youth is dubious, but such allegations are difficult to counter when high-profile stories blame YouTube, Facebook, and other websites for someone's dark turn. Now one such story may come before the Supreme Court, and with it a threat to a foundational internet speech law.
The case, Gonzalez v. Google LLC, concerns a man whose daughter was killed in the 2015 ISIS attacks in Paris. The grieving father, Reynaldo Gonzalez, sued YouTube and its parent company, Google, for allegedly violating the U.S. Anti-Terrorism Act. Gonzalez alleges that ISIS posted recruitment videos to YouTube, that YouTube recommended those videos to users, and that this ultimately contributed to his daughter's death.
Although Gonzalez's situation is tragic, his claims are convoluted: you have to buy the notion that the terror group could not recruit members without YouTube, and also that, without it, ISIS would stop using its existing networks to carry out attacks.
The Supreme Court has not yet agreed to hear Gonzalez's case. If it does, it will likely consider Section 230, the federal law that protects internet companies and users from being held responsible for speech created by third parties.
SCOTUSblog notes that Justice Clarence Thomas suggested in 2020 that "in an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms." Thomas made a similar suggestion earlier this year.
Writing about the petition in Gonzalez v. Google LLC, SCOTUSblog's Andrew Hamm explains:

The district court dismissed Gonzalez's claim on the ground that Section 230 protected Google for its recommendations, because ISIS, not Google, created the videos. The U.S. Court of Appeals for the 9th Circuit affirmed that Section 230 protects such recommendations, at least when the provider's algorithm treats content on its website the same way. The panel itself disagreed with that conclusion, but considered itself bound by a recent 9th Circuit decision on the same issue that came down while Gonzalez was pending. Had it been free to resolve the question itself, the Gonzalez panel reasoned, it would have held that Section 230 does not cover providers' content recommendations.
Gonzalez concedes that the circuits are not split on this question, but he suggests that if his case had been decided before that other 9th Circuit decision, Google would be the one seeking review. He maintains that providers' financial dependence on advertising, and thus on targeting algorithms, makes the issue important to resolve.
Gonzalez isn't the first to sue social media companies over terrorist acts. As Tim Cushing pointed out at Techdirt last fall, such plaintiffs "have yet to convince a court to accept their arguments."
"These cases are invariably horrendous," constitutional lawyer Cathy Gellis wrote back in 2017. Victims have often been brutally killed, and their loved ones seek redress. It is natural and reasonable to want to offer them some kind of relief from somebody; however, as she argued, that somebody cannot be an online platform.
There are many reasons for this, some of which have nothing to do with Section 230. Even without Section 230, platforms would be liable for harms arising from people's use of their services only if plaintiffs could establish a direct connection between the harm and that use.
But Section 230 should stop courts from even reaching that tort-law analysis, Gellis argued: a platform shouldn't be forced to defend itself against liability for harms that might have flowed from how it was used.
A recent study supports what advocates of Section 230 have long argued: the law protects smaller companies and creators even more than it protects tech giants.
"The key to understanding how Section 230 actually works is that it gets frivolous and wasteful lawsuits kicked out of court much earlier, when the costs are still painful but more bearable," points out Mike Masnick in a Techdirt post about the study. "Without Section 230, the end result of the court case may be the same, with the company winning, but the costs would be much, much higher. That's not going to make a big difference for Google and Facebook. It can mean the difference between life and death for small companies."
"Larger enterprises are less impacted by lawsuits," the study's authors point out in a report titled "Understanding Section 230 & the Impact of Litigation on Small Providers." Large companies may already have: 1) attorneys available to handle lawsuits, allowing the company to carry on its business without disruption; 2) insurance that covers the costs of defense, settlements, or damages awards; and 3) enough revenue and cash reserves that cases can be viewed as an expense of doing business rather than a catastrophic event.
Smaller companies are less likely to be able to afford these things, and so less likely to be in a position to defend against lawsuits.
Alas, politicians have made Section 230 a scapegoat for anything they don't like about the internet. And lawyers keep getting creative with anti–Section 230 arguments, suggesting that while the law may broadly protect internet platforms from liability for what users post, it does not cover various other facets of those platforms: their design, their algorithms, and so on.
Sometimes judges are sympathetic. In Twitter v. Taamneh, which the 9th Circuit decided last year, Judge Marsha Berzon wrote that Section 230 should protect "only traditional activities of publication and distribution," such as deciding whether to publish, withdraw, or alter content, and should not include activities that promote or recommend content or connect users to one another. She urged the court to reconsider its en banc precedent and hold that Section 230 does not cover the use of machine-learning algorithms to recommend content or connections.
This is a strange and dangerous place to draw the line in Section 230 lawsuits, as Tim Cushing wrote last summer. Algorithms are shaped by user inputs. If YouTube is not responsible for the videos its users upload, it should likewise be immunized for suggesting content based on users' own preferences and actions. Without input from viewers, the algorithm can do almost nothing; it takes a user to set it in motion.
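That point can be made concrete with a toy sketch (this is not anything resembling YouTube's actual system; the function, video IDs, and tags are all hypothetical): a recommender driven purely by a user's watch history has literally nothing to suggest when that history is empty.

```python
from collections import Counter

def recommend(watch_history, catalog, k=2):
    """Score each catalog video by how often its tags appear in the
    user's watch history; with no history, every score is zero."""
    seen_tags = Counter(tag for video in watch_history for tag in video["tags"])
    scored = [(sum(seen_tags[t] for t in video["tags"]), video["id"])
              for video in catalog]
    # Stable sort by descending score keeps catalog order among ties.
    scored.sort(key=lambda pair: -pair[0])
    return [vid for score, vid in scored[:k] if score > 0]

catalog = [
    {"id": "cats1", "tags": ["pets", "funny"]},
    {"id": "cook1", "tags": ["cooking"]},
    {"id": "cats2", "tags": ["pets"]},
]

# A user who has watched a pet video gets more pet videos...
print(recommend([{"id": "dogs1", "tags": ["pets"]}], catalog))  # ['cats1', 'cats2']
# ...while an empty history yields no recommendations at all.
print(recommend([], catalog))  # []
```

The algorithm here is inert until a user supplies input, which is the crux of Cushing's argument: the recommendations are a function of user behavior, not editorial judgment by the platform.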
Federal appeals court rejects university ban on "discriminatory harassment." According to the 11th Circuit Court of Appeals, the University of Central Florida's "discriminatory harassment" policy likely violates the First Amendment. The case is Speech First, Inc. v. Cartwright, and the full decision can be read here. The court ruled that the school's policy was far too expansive:
[T]he policy (1) bans a wide swath of "verbal," physical, and electronic expression concerning any of a long list of characteristics (depending on how you count them); (2) states that prohibited speech "may take many forms," including name-calling and graphic or written statements that could be humiliating; (3) uses a gestaltish "totality" approach to decide whether particular speech "unreasonably alters the educational experience" of another student; and (4) reaches not only a student's own speech but also her "encouraging," "condoning," or "failing to intervene" to stop another student's speech.
The policy, in short, is staggeringly broad, and any number of statements, some of them undoubtedly protected by the First Amendment, could qualify for prohibition under its sweeping standards. Just to name a few, the policy prohibits "verbal and physical conduct" that is "racially, ethnically, religiously," or otherwise based, including on "non-religion," "sex," or "political affiliation." Speech First's members say they want to argue that abortion is wrong, that the government shouldn't be able to force religious organizations to recognize marriages they oppose, that affirmative action is unfair, and that a man cannot become a woman simply because he feels like one.
Judge Stanley Marcus, concurring in the opinion, warned of the grave danger posed by a policy that effectively polices intellectual dogma.
History, Marcus wrote, is full of warnings about times when universities and colleges have abandoned the pursuit of truth in favor of becoming cathedrals for the worship of certain dogma. A society without academic institutions committed to truth risks descending into ignorance, for the human mind cannot generate ideas that are never challenged and debated.
The Volokh Conspiracy has more on the ruling (including the charming line "the state is really big boy") here.
A private space mission returns to Earth. The first all-private mission to the International Space Station began its trip back to Earth today. CNN has more:
Axiom Space is a Houston-based company that arranges rocket rides for people who have the means to pay, coordinates flights to the ISS, and provides training.
The four crew members—Michael López-Alegría, a former NASA astronaut-turned-Axiom employee who is commanding the mission; Israeli businessman Eytan Stibbe; Canadian investor Mark Pathy; and Ohio-based real estate magnate Larry Connor—left the space station aboard their SpaceX Crew Dragon capsule on Sunday at 9:10 pm EST….They will spend about one day free flying through orbit before plummeting back into the atmosphere and parachuting to a splashdown landing off the coast of Florida around 1 pm ET Monday.
AX-1 launched on April 8 and was originally scheduled as a 10-day mission, but delays extended it by roughly a week.
This isn't the first time private citizens have paid for seats on a spacecraft. But as CNN reported, AX-1's crew was entirely private: no government astronauts accompanied them aboard the capsule on the journey to and from the ISS. And it's the first time private citizens have traveled to the ISS in a U.S.-made spacecraft.
It's a strong reminder that space exploration doesn't have to be government-controlled and government-funded.
• Air Force Maj. Gen. William T. Cooley has been found guilty of abusive sexual contact, in the first court-martial prosecution and conviction of a general in the Air Force's 75-year history, The New York Times reports.
• Twitter is “reportedly prepared to accept Elon Musk’s offer to buy the company.”
• Wynn Bruce, a climate activist who set himself on fire in front of the Supreme Court on Friday, has died.
• The European Union is demanding that tech companies crack down on “hate speech.”
• What happened to CNN+?
• Actor Johnny Depp's defamation trial against his ex-wife Amber Heard is underway. Depp testified last week and is scheduled to be cross-examined today by Heard's legal team.
• Shanghai is fencing in COVID-19 quarantine areas. Al Jazeera reports that images went viral on social media of government workers in hazmat suits sealing the entrances to residential blocks and closing off entire streets with green fencing, prompting questions and concern from residents. Many of the fences were placed around designated "sealed areas": residential buildings where at least one person has tested positive for COVID-19, whose residents are prohibited from stepping outside their doors.