Supreme Court rules in favor of Twitter and Google, avoiding the issue of Section 230 for now
On Thursday, the Supreme Court resolved two adjacent cases aiming to hold social platforms liable for dangerous content. The pair of cases, Twitter v. Taamneh and Gonzalez v. Google, both sought to hold tech platforms accountable for hosting content from the Islamic State that promoted the terrorist organization in connection to violent attacks. The Supreme […]
Supreme Court rules in favor of Twitter and Google, avoiding the issue of Section 230 for now by Taylor Hatmaker originally published on TechCrunch
How to Accelerate Weight Loss after C section
Welcoming a new life into the world through pregnancy and childbirth is a momentous occasion for every woman. Sometimes, circumstances call for a C-section rather than a vaginal delivery. However, the road to weight loss after a C-section can be more challenging than after a vaginal birth. Factors such as an extended recovery period, hormonal fluctuations, […]
The post How to Accelerate Weight Loss after C section appeared first on Blog – HealthifyMe.
Instagram Reels adds a series of creator-focused updates, including a dedicated ‘trends’ section
As governments around the world express growing concerns about TikTok, its rival Instagram Reels is getting a series of updates aimed at creators. Meta announced today that it’s adding a dedicated destination for trending audio and hashtags on Reels, expanding gifts on Reels to more countries, enhancing Reels editing tools, adding new metrics and more. […]
Instagram Reels adds a series of creator-focused updates, including a dedicated ‘trends’ section by Aisha Malik originally published on TechCrunch
Senators Are Largely United in Desire to Rewrite Section 230
During a combative Senate Judiciary Committee hearing Wednesday, Senate lawmakers from both sides of the political aisle doubled down on calls to gut major provisions of the internet’s most important legal liability shield. The senators slammed tech companies for allegedly putting profits over user safety and…
Don’t leave developers behind in the Section 230 debate
Policymakers should recognize the critical role of developers and work to support them, not stifle innovation.
Don’t leave developers behind in the Section 230 debate by Walter Thompson originally published on TechCrunch
ChatGPT won’t enjoy same protection under Section 230 as social media, expert says
A quick guide to the Section 230 hearings
The Supreme Court is currently reviewing the cases of Gonzalez vs. Google and Twitter vs. Taamneh to determine if YouTube and Twitter are liable for terrorism-related content hosted on their platforms.
Of course it’s abhorrent that terrorists use YouTube and Twitter to recruit and plan their activities. But those sites are used by millions (and in YouTube’s case, billions) of people, and host billions of pieces of content, most of which have nothing to do with terrorism. Because of that, the law says YouTube and Twitter are not responsible for bad actors on their platforms. Here’s how the Gonzalez vs. Google and Twitter vs. Taamneh cases are attempting to change the Supreme Court’s mind.
What is Section 230?
Section 230 preserves a free and open internet. In 1996, just as the then-new internet was gaining widespread acceptance, Congress committed to supporting that development in Section 230 of the Communications Decency Act of 1996.
In less than 800 words, Section 230 recognizes that the internet and the services on it give Americans access to a “diversity of political discourse…cultural development [and] intellectual activity.” It states that the internet should remain free from government regulation so that it, and free speech, can flourish. Services like YouTube and Twitter are free to moderate user content and speech according to their own guidelines.
Why are YouTube and Twitter in hot water?
Supreme Court cases Gonzalez vs. Google and Twitter vs. Taamneh allege that YouTube and Twitter should be liable for aiding and abetting terrorism because they recommended terrorism-related content (in the case of Gonzalez vs. Google) and hosted terrorism-related content (in the case of Twitter vs. Taamneh).
As of now, YouTube and Twitter are shielded from that liability by Section 230, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Basically, you are responsible for what you do online. Services like YouTube and Twitter cannot be held responsible for the content posted to their platform, and neither can fellow users of the platform. In simpler terms: when someone posts something hateful online, the speaker is responsible, not the service that hosts the post.
Gonzalez vs. Google and Twitter vs. Taamneh allege that YouTube and Twitter should not be protected under Section 230 and are liable for promoting terrorism-related content, not just hosting it.
What does the Supreme Court have to decide?
The Supreme Court must parse Section 230 word by word to determine whether it protects YouTube and Twitter in these cases.
Justices have quibbled over the definition of “aiding and abetting” and whether either platform could be considered as having aided and abetted terrorist organizations. They also discussed whether or not YouTube’s recommendation algorithm and the platform’s suggestions for what to “watch next” could be considered an endorsement of a piece of content or just a “neutral” tool for cataloguing YouTube’s massive library.
The Supreme Court is also weighing the long-term implications of its decision. Should it find YouTube and Twitter liable, and thereby open parts of big tech that have previously been left untouched to regulation? Or would that expose every internet service to liability and almost certainly flood the court system with thousands, if not millions, of new lawsuits?
And what about free speech? Would finding YouTube and Twitter liable stifle a free and open internet and put individuals at risk for legal action every time they share a video or post in an online forum? Or would it be better to hold YouTube, Twitter, and other open platforms responsible for any terrorism-related activity on their sites?
What would the internet look like if Twitter and YouTube became responsible for the content on their sites?
The internet as we know it was shaped in the image of free speech. Making platforms responsible for what is said or hosted on their sites would open those platforms to countless lawsuits. It would also mean that you, as a user, could be liable for anything you say on those platforms that upsets somebody enough to pursue legal action under an amended Section 230.
To avoid being buried in legal fees, platforms would resort to significant, if not complete, censorship to restrict how individuals interact online. That could hinder innovation and communication, and generally make the world a much smaller place.
Section 230 liability protections on trial in Google Supreme Court case
The US Supreme Court today heard oral arguments from lawyers representing Google, the Department of Justice, and the family of a 23-year-old woman killed in Paris by terrorists in 2015. The case, Gonzalez v. Google, represents a crucial legal landmark in how the US legal system holds large technology platforms like Google responsible for the content they host.
The family of Nohemi Gonzalez argues that Google acted as a recruiting platform for the Islamic State group, which the US State Department describes as a terrorist organization. By recommending Islamic State-related videos on YouTube, Google violated US laws against providing aid to terrorist groups, the family argues.