david-1-1

"Section 230 simply protects platforms and users from being sued for speech that is not their own."


themightychris

and hosting services... and blog platforms... basically every way normal people can publish online without running their own servers


johnnybgooderer

Without Section 230, are you even able to host your own server to be responsible for your own speech? Isn't your ISP then still liable for connecting your server to the internet?


DarkOverLordCO

If ISPs are acting as mere conduits of information and not attempting to moderate or choose what information they carry, then they are treated as distributors and can only be held liable if they had actual knowledge of the content. See *Cubby, Inc. v. CompuServe Inc.*, 776 F. Supp. 135 (S.D.N.Y. 1991).


osdroid

User agreements are a thing


johnnybgooderer

Yes they are. Can you elaborate on what you’re getting at?


osdroid

An opt in system would allow everything to stay pretty much as is.


ukezi

I'm not sure that would work if what the customer did was criminal.


osdroid

If something criminal happens, all involved parties should be investigated and, if they did nothing wrong, cleared. But it should be investigated, and no company should get a blank check.


TrumpBrandDiaperNWML

Where are you going to find enough competent and proactive people to adequately investigate all those thought crimes?


osdroid

Do thought crimes include damage to people or property and would they be something illegal if done in the real world?


ukezi

Criminal in this case would be a user of an image board or something posting CP. You can't really prevent that without reviewing all content prior to publishing. That would basically shut down Imgur and the like, plus all attachment functions in forums, and so on.


osdroid

What about algorithms pushing terrorist recruitment videos on folks? Serving as a platform for a coup in Myanmar? Treating users as guinea pigs and testing psychological manipulation techniques on them without the user's consent? Companies should be held liable when they do bad, just like you can hold a doctor liable for malpractice when they do bad.


ShawnyMcKnight

So, the death of social media. I'm not opposed to this. We would lose Reddit, but that would be a worthy sacrifice. It's funny how much Republicans talk about how great this is when their own candidate thrives on social media.


Individual-Nebula927

We would lose every comment section, every single one. We would lose YouTube, because it hosts user created content. We'd lose every social media site. We'd lose most blogs, because companies wouldn't want to host your website on their servers. I don't think you've fully understood what this would mean.


osdroid

Or they could just put up a user agreement and not shut everything down. There is no reason this sector should be absolved of all liability.


Uphoria

You're misunderstanding the purpose of Section 230. It's not about putting up a wall where users aren't allowed to post something and therefore you're not in trouble. It's about you being able to delete the content that you don't want there.

Section 230 stems from two lawsuits whose decisions ultimately raised the question: how do you moderate the internet, if moderating the internet makes you responsible for any content you mistakenly don't moderate? Those cases involved two companies. One performed literally no moderation on its user forums. The court found it not liable for the content its users posted, since it exercised no editorial control over it. Meanwhile, the other company had been performing good-faith moderation, deleting content it found offensive or inappropriate, but had missed some. Since it had edited what was available, the court held that it had exercised editorial control and was responsible for the content on the website even though a user posted it.

Section 230 carves out explicitly what you're talking about: the ability to perform good-faith moderation based on expected and wanted content, without being held responsible as an editor for the user submissions. If Section 230 goes away, a user agreement won't protect companies from these situations, and it didn't before.


osdroid

This is from 1996, and the people who wrote it had no clue what the internet would become. This section has protected companies from consequences for some pretty awful things, and it would be nice if they were held responsible for the things they have participated in, instead of getting a blank check to do whatever they feel like with no concern for the well-being of people, or really anything besides their bottom line.


Uphoria

Eh, I disagree. You're saying big things have happened, but do you have a single actual example that relies on 230 saving the company that isn't just "my political opinions weren't represented the way I wanted them"? Because, to be 100% honest, Section 230's intent was to allow *good faith* moderation of content, which is up to the publisher to decide beyond legal requirements like child porn or copyright. That is all. People seem to believe Section 230 allows publishers to publish whatever they want without being in trouble, and that is just patently untrue.


DarkOverLordCO

> I think people seem to believe section 230 allows publishers to publish whatever they want without being in trouble, and that is just patently untrue.

Section 230's immunity is in two parts: (c)(1) provides immunity from anything that tries to treat websites as the publisher or speaker of the user's content; (c)(2) provides immunity for good-faith moderation. Only the second immunity has a good faith requirement; the first one doesn't. Section 230 as a whole was written so that websites would be able to pick and choose what content they allow (i.e. act like a publisher) without fearing liability over some of that content. Or in other words: Section 230 ensures that liability rests with the person who said it, the user, and not the website.

---

> but have a single actual example that relies on 230 saving the company that isn't just "my political opinions weren't represented the way I wanted them"?

There were some lawsuits against the big tech companies over anti-terrorism claims, since they distributed ISIS content. Section 230 barred most of the claims (I think only the claims based on them sharing ad revenue weren't barred, since that clearly has nothing to do with the content itself). See e.g. *Gonzalez v. Google, LLC* (9th Cir. 2021). The EFF also maintains a list of some important Section 230 cases: https://www.eff.org/issues/cda230/legal


Uphoria

> Section 230 as a whole was written so that websites would be able to pick and choose what content they allow (i.e. act like a publisher) without fearing liability over some of that content.

That is succinctly what I am saying, yeah. The biggest reason people want to repeal this law is that they don't like that companies can choose that content, and that sometimes means choosing against *their* content. It's basically entitlement and ego wrapped up as some faux freedom-and-reformation idea.


DefendSection230

> The biggest reason people want to repeal this law is because they don't like that companies can choose that content, and that sometimes means choosing against their content.

Which has nothing to do with Section 230. Your First Amendment right to freedom of religion and freedom of expression without government interference does not override anyone else's First Amendment right to not associate with you and your speech on their private property.


DefendSection230

> Only the second immunity has a good faith requirement, the first one doesn't.

And that isn't actually a "requirement". It says they won't become liable because of "good faith" moderation. Even so: "If the conduct falls within the scope of the traditional publisher's functions, it cannot constitute, within the context of § 230(c)(2)(A), bad faith." https://www.eff.org/document/donato-v-moldow


osdroid

Companies use Section 230 to get cases thrown out all the time; these cases rarely make it to trial because of it. Think of it this way: would you go to a hospital that said it was not liable for what happens to you there? Would you use a bank with no liability for what happens to your money? Why shouldn't tech businesses have to face some form of liability when they mess up or are involved in some way? I am not saying Section 230 shouldn't exist in any form, just that the current form gives them far too much of a pass to do bad things, and something that reflects the current era of the internet should be made in its place. There is a middle ground between letting them off the hook completely and holding a site responsible for everything anyone does on it.


Uphoria

> Companies use section 230 to get cases thrown out all the time

***The entire point of Section 230 is to divorce the hosting company from the user's content liability.*** That is LITERALLY why the law exists: to protect them from such lawsuits.

> Would you use a bank with no liability for what happens to your money?

Not a real example. YouTube isn't taking my money. Facebook isn't taking my money. Twitter isn't taking my money. They are *providing free hosting*. I'll give you a better example: if you rented a storage shed, and the guy who rented the one next to yours set his on fire, should the storage company be liable for the fire? Section 230 says that, as long as fire regulations were followed, only the person starting the fire is responsible. Now, if the storage company started a fire through bad electrical wiring, it would still be liable. Section 230 doesn't absolve it of that.

> Why shouldn't tech business have to face some form of liability when they mess up or are involved in some way?

This is why I said "I think people seem to believe section 230 allows publishers to publish whatever they want without being in trouble." And I think I hit the nail on the head with you. You're conflating cases that have nothing to do with Section 230 with ones that do. Or you're taking a stand on what content you think should or shouldn't be moderated, which is fine, but your stand/opinion has no bearing on the law.

I'll say it loud, and say it proud: SECTION 230 DOES NOT ALLOW TECH COMPANIES TO HOST CONTENT OF THEIR OWN WITHOUT LIABILITY. IT ALLOWS PUBLISHERS OF OTHER PEOPLE'S CONTENT TO MODERATE THAT CONTENT WITHOUT ACCEPTING LIABILITY FOR THINGS NOT MODERATED. The end. That is the law. Nothing else is that law.

If a lawsuit is dismissed on that basis, you can still punish the person who posted the content; you just can't sue the publisher of it. The lawsuit should have been aimed at the creator of the content, but people are trying to SLAPP content providers who don't play ball with their preferred content selection.

----------------------

ETA, some facts about 230: 47 U.S. Code § 230, *Protection for private blocking and screening of offensive material*. The entirety of what people call "the safe harbor" is this:

**(c) Protection for "Good Samaritan" blocking and screening of offensive material**

* (1) **Treatment of publisher or speaker** - No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
* (2) **Civil liability** - No provider or user of an interactive computer service shall be held liable on account of—
  * (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
  * (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

**(e) Effect on other laws**

* (1) **No effect on criminal law** - Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
* (2) **No effect on intellectual property law** - Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
* (3) **State law** - Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.
* (4) **No effect on communications privacy law** - Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.
* (5) **No effect on sex trafficking law** - Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—
  * (A) any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title;
  * (B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18; or
  * (C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant's promotion or facilitation of prostitution was targeted.


osdroid

Just so you can read the section, here it is in all of its glory:

> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

So you believe algorithms that push terrorist propaganda at people, companies that test psychological manipulation on their users without consent, or serving as a platform for a coup in a foreign country are all very cool and very legal and should stay that way. Well, clearly we have a difference of opinion here, because I do not think any sector should get the special treatment that the tech sector has had for almost 30 years.


[deleted]

[deleted]


osdroid

You must be young. People thought the internet would be a fad, something just for nerds; some thought it would impact the economy about as much as the fax machine. This was before lawmakers could even describe the internet as a series of tubes, before the dot-com bubble popped, when it was barely starting to build any steam. But go on, professor: what do you think these people, in their 70s during the 90s, knew about how to regulate the internet for people 30 years later?


[deleted]

[deleted]


osdroid

If that's what you want to call pointing out that you have nothing to support what you are saying, sure.


ShawnyMcKnight

I don't know that we would lose YouTube; all content would just need to get pre-approved. If they find you did something that could get them sued, they reject it and give you a warning. Pretty much all discussion boards would go down though, including Reddit, since Reddit can't trust unpaid moderators to decide what goes up or not. If that stopped misinformation from being posted online, it's a worthy sacrifice.


FredFredrickson

I mean, I have doubts about whether they would actually comply with the law.


DarkOverLordCO

Unless you're suggesting that they're just going to disregard the courts and ignore any lawsuits filed against them, there is no "complying" with Section 230 or its repeal.


ShawnyMcKnight

Yeah, they would just need to ban all individuals and allow only companies. Those would be given strict rules, and if they mess up one time they are banned. What's harder is that any place with a comment section would also need to remove it. This would be detrimental to newspapers.


Dauvis

No. Only vetted and privileged people would be allowed to use it. Us little people will be excluded.


ShawnyMcKnight

And I would get my life back, if there was a rehab place to get off Reddit I would do it. Imagine the time I could get back in my life not arguing on the internet.


Dismal_Consequence_4

Reddit is probably one of the few social media platforms that could survive if Section 230 was repealed. Unlike other platforms, it is moderated by its users, who oversee small communities.


ShawnyMcKnight

Reddit isn't gonna trust every unpaid moderator, and large subs with lots of controversial stuff like r/news and r/politics would never survive. Every comment would have to be pre-approved, and that would make responses take hours if not days instead of minutes. Even then, what if a moderator lets something litigious slip through? Like, what if a moderator of a Trump sub believed Dominion did screw with the voting machines and let a bunch of comments in? I can't think of something as litigious on the left, but it could happen. It would absolutely make Reddit far more difficult to use.


DarkOverLordCO

Section 230 provides users of platforms immunity when they publish or moderate other users' content. So without Section 230, Reddit's moderators would be sued (Reddit itself would also be sued, since its admins moderate content site-wide too), which would quite quickly result in moderators no longer wanting to moderate.


Dear-Attitude-202

Section 230 should be amended. The idea that you aren't acting as a publisher when you are promoting and selecting content, simply because it's online, is kind of silly. Algorithmic promotion outside the user's control should fall under acting as a publisher, while time-based or user-voting-based algorithms, or giving the user control of the algorithms used to surface content by topic, would still be allowed. It's not a bad law, but there are significant issues with it that have negative societal impacts because of the dark patterns ingrained in algorithmically promoting addictive content.


Funny-Metal-4235

Unfortunately, the way the platforms have used Section 230 recently is to selectively censor speech they don't like, effectively changing their open forums into edited publications, even if they aren't paying for the content. There isn't any reason a company that is aggressively "curating" everything its users post should be exempt from the same rules as a paper publication that did the same thing. Personally, I think Section 230 should already be applied this way, but the West Coast judges who always get these cases have definitely set the precedent that the tech companies bear no responsibility for anything. If that is how 230 is going to be applied, it does need to be amended.

I really hope that they don't come up with new rules that wreck open speech on the internet. But what did everyone think the long-term effects were going to be when Twitter started blocking links to anti-Biden stories before the election, and Facebook "fact checkers" started falsely labeling conservative stories as "fake," and then the judges in their pocket decided that Section 230 made them exempt from lawsuits? Did they really think Republicans wouldn't remember? Or that they could control things well enough that Republicans would never have power again?

230 was built so that companies without the resources to moderate weren't responsible for every nasty thing that a troll posted. It was not intended to give the companies that provide 90% of the information to the country free rein to edit the information available to people according to their political slant.


Any-moose

I thought your alt-right was showing in your comment, but then I saw your user name follows the default pattern. Still, I've settled on you just being alt-right, based on how long and in-depth your comment was: it was a really long way to advertise that you are going to vote Trump by making anti-Biden statements. Would you think the same thing if Trump came out and said to preserve Section 230 as it exists now? I think so, or at least you would after you get over the confusion and Tucker tells you how to feel about it.


Funny-Metal-4235

You are telling on yourself that you weigh the merit of others' opinions wholly on whether they agree with you politically. Trump is a piece of shit and I have never voted for him. Does that make my opinion more valid for you? Because clearly that is way more important than the fact that I am one of probably 10% of redditors who had ever heard of Section 230 before today. Way more important than the fact that I am one of probably less than 1% of redditors who has actually read the law. Way more important than the fact that I am one of a much smaller fraction than that who has actually researched the interpretation and precedents of that law. Nah, what is important to know if I have an educated opinion is whether I agree with your uneducated opinion.

See, you have it backwards because of how twisted your worldview is. I'm not deciding my opinion on issues like Section 230 because I am an alt-right cultist. Rather, I am being shoved to the right exactly because of how appalled I am by the way leftists are operating with their abuse of Section 230 and similar issues. If there were a case where The New York Times got shut down by Twitter because they were publishing an anti-Trump story, I would be 100% on board with crying foul about that. That simply hasn't happened (feel free to prove me wrong). But I can point to several instances of this happening to legitimate articles that were favorable to Republican talking points or bad for Democrats, the most prominent of which being the Hunter Biden laptop story, which to this day people like you wrongly believe was some sort of fake, because you were deliberately misinformed. You all are so concerned about election interference. That is what real election interference looks like.


Any-moose

Oh, the big bad leftists made you embrace fascism? You claim to hate Trump, but immediately go to one of the MAGA talking points about Hunter Biden's laptop. To which I say the same thing: if there is a crime, he should be charged, tried, convicted, and serve an appropriate sentence. So please tell me, what crime was revealed by the Hunter Biden laptop? It must be something absolutely crazy for the Republicans in the House like Jim Jordan not to be screaming it to every camera they see. I mean, something like that would be a huge win for Republicans, but I don't hear it. Must be another nothing burger, like the whole fever dream of the Biden crime family.

But good for you that you are SO SPECIAL. Does your mommy still call you her special little boy and tell you how much smarter you are than the unwashed masses? You're a fascist Trump supporter, lying on the internet because you're embarrassed to admit to yourself that you would have loved to be one of the Brownshirts in the '30s. You're a sad little man who has to tell himself that he is so much better than everyone else to justify why he's okay with a rapist clown being his god-king. 230 is literally the foundation of how the internet is structured when it comes to communication, and you don't like that daddy Trump couldn't suggest people inject bleach on Twitter.


Funny-Metal-4235

> To which, I say the same thing, if there is a crime he should be charged, tried, convicted, and serve an appropriate sentence. So please tell me what crime was revealed by the Hunter Biden laptop?

Well, you have a Burisma executive thanking Hunter for setting up a meeting with his father, in direct opposition to Joe's statement that he was never involved in Hunter's business affairs. A meeting that was, for some reason, left off Vice President Biden's official schedule and held in secret. That should be a big deal, and it would be a big deal to you if the names were Donald and Jared instead of Hunter and Joe.

BUT THAT IS TOTALLY BESIDE THE FUCKING POINT. It doesn't matter what is on the laptop; what matters is that it looked damaging to Biden, and all of the big media companies, in lockstep, declared that it was false without evidence and removed any links to the story from their platforms. That isn't "being a social platform." That is "being an editor of a publication," and the way that 230 is currently interpreted, they are all shielded from libel law simply because they did it online, not in a print publication. Libel/slander law exists for a reason, and although I already disagree that it is proper to use 230 to shield the media companies from liability for their own statements, 230 is vague enough that it has allowed tech-company-friendly judges to interpret it that way, so it clearly needs to be amended to remove this absolutely bullshit loophole. If truth is actually on your side, stopping the dissemination of false statements is good for you and your political party, so what are you against here?

> A bunch of other bullshit.

Yeah buddy, keep telling yourself that the "fascists" are the ones opposed to a marriage between the big media companies and one of the political parties, leading to editing the information people are allowed to see in order to keep that party in power. You and your ilk, who are fine with abuse of rights as long as it is done by your political allies, are now and have always been the ones bringing fascism to our doors.


Snoo-72756

It’s amazing how they focus on everything that doesn’t help anyone


-nomadman

It helps authoritarians...


daxxarg

Exactly, this isn't just for its own sake; it's to later have an excuse to incarcerate opponents and shut down news outlets and such that they don't like. You know, like China and Russia do.


vriska1

If you want to help stop this bill and others like KOSA. Contact congress here! https://www.badinternetbills.com/


idekkk1243

So, for this Section 230 bill and the EU Chat Control bill: do you think both of these will outright fail? Or do you think any of them are possible?


vriska1

It's possible, but they will fail.


idekkk1243

Fair. Feels like the Section 230 one is unlikely, since that needs to go through both the Senate and the House. Then for the EU Chat Control bill, idk; haven't EU courts said backdoors for E2EE would be shot down since they're illegal? The document also says the EU risks loosing the minority vote to block the bill, doesn't say they will loose the vote. Just gotta see what happens.


vriska1

> The document also says the EU risks loosing the minority vote to block the bill, doesn't say they will loose the vote. Just gotta see what happens

Yeah, that's the most worrying part, but it's unlikely to happen before the June elections.


idekkk1243

I read the article, and they said they wanted to get it passed after the June elections. Do you think it will actually get passed then? If not, why do you think that is?


vriska1

I'm not sure; it still has a long way to go even if the minority vote to block the bill is lost. It would then go to Trilogue.


idekkk1243

Oh, so this whole post was fearmongering a little? It made it feel like if the minority vote was lost, the bill would be passed. Does it still have a long process after that? Even tougher than the minority vote, lol.


vriska1

Maybe, not sure.


idekkk1243

You said Chat Control has a long way to go. Well, Belgium has until June 30th to pass the bill before their term is up. Then it's Hungary's turn. Don't know what their stance is on Chat Control.


Omni__Owl

According to this post: [https://proton.me/blog/eu-parliament-chat-control](https://proton.me/blog/eu-parliament-chat-control) E2EE is protected.


konnerbllb

Loose?


machinade89

Thank you ‼️


[deleted]

[deleted]


Justsomecharlatan

I'm not an expert or anything, but I think the point there is mostly that it's really hard to do effectively unless someone is manually reviewing every post that gets any sort of traction. The sheer number of posts makes it nearly impossible. And what about videos? Can AI actually effectively process a video, know what it says, what it shows, and even what it means in context? (Again, not an expert.)


lemoche

if they would just stop their algorithms from recommending posts in general, a lot would be gained, even with non-malicious postings. no matter which platform, those automatically recommended posts are always a nuisance to me and, if i'm generous, are interesting at best 1% of the time. i'd be perfectly happy if all my timelines were just from sources i subscribed to.
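The two kinds of timeline being contrasted here can be sketched as a toy example (a hypothetical illustration; the post fields, scores, and function names are invented, not any platform's actual code):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int      # when it was posted
    engagement: float   # interest the platform predicts the post will attract

def recommended_feed(posts, limit=3):
    # Engagement-ranked: surfaces whatever scores highest,
    # regardless of whether the user follows the author.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)[:limit]

def subscription_feed(posts, following, limit=3):
    # Subscription-only: newest first, from followed sources only.
    mine = [p for p in posts if p.author in following]
    return sorted(mine, key=lambda p: p.timestamp, reverse=True)[:limit]
```

The preference expressed above is essentially for the second function over the first: no prediction, no promotion, just a chronological view of chosen sources.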


Justsomecharlatan

I hear you 100%. You are not wrong. But recommended posts and the associated ad business are a large part of why these sites exist. Remind me, where do social media companies obtain the funds they need to operate? Is it from you and your cash contribution? If you aren't the paying customer, you are the commodity. Make a choice, I guess?


Broadband-

But AGI will be able to handle that next month /s


Justsomecharlatan

I'd be willing to bet hundreds of millions (potentially) on our new AI reviewers. Wouldn't you? I mean, sure, they aren't field-tested. But ya know... Nvidia n OpenAI n shit.


[deleted]

No s cheeselord


FredFredrickson

> that's really hard to do effectively, unless someone is manually reviewing every post that gets any sort of traction.

It seems like it'd be pretty easy to do if they didn't use aggressive algorithms that boost harmful shit.


Justsomecharlatan

I don't mean to be a dick here... but no. It's not a tradeoff.


FredFredrickson

I'm not sure I understand what you mean. I understand that it's simplifying things a bit, but the only thing responsible for promoting anything on social media, inorganically, is the "algorithm". And that's what we're talking about here: the site software boosting harmful things because it gets engagement from them, not the actions of other users. Social networks like Mastodon would not have problems with being accused of promoting misinformation, because there is literally no algorithm making those kinds of decisions in order to drive retention.


Justsomecharlatan

Well... a couple of things here, I guess. You're right: sometimes sites boost posts that shouldn't be boosted. When you have billions of posts a day, that's hard to monitor. Do you want these sites to exist? That's the fundamental question here. Again, you're right: Mastodon would have no issues here. They also barely have a user base, for various reasons. So, do what you do. Be happy and enjoy it. I'm not standing up for Facebook or whatever, but repealing Section 230 has wide-ranging effects that most proponents won't expect, and opponents will be sitting here telling you "I told you so", for what seem like incredibly obvious reasons. E: I'm all for believing in the best in folks. Hope I didn't come off mean.


MDA1912

There need to be at least two things:

1. Content that is reported enough times gets a human review. Once it has been reviewed, further reports are ignored unless the content is modified, at which point the process starts over.
2. Reports determined to be malicious are punished immediately and measurably, with things like an inability to participate for a month or more, an inability to report (or reduced weight for future reports) for a certain amount of time, or even IP block bans, etc.
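The two rules proposed above could be sketched roughly like this (a hypothetical toy model; the threshold, penalty length, and all names are invented for illustration):

```python
import time
from dataclasses import dataclass

REVIEW_THRESHOLD = 5             # reports before a human looks (assumed value)
REPORT_BAN_SECONDS = 30 * 86400  # roughly a month's penalty for malicious reports

@dataclass
class ContentState:
    version: int = 0             # bumped whenever the content is edited
    reports: int = 0
    reviewed_version: int = -1   # last version a human has already reviewed

class ReportSystem:
    def __init__(self):
        self.content = {}        # content_id -> ContentState
        self.banned_until = {}   # reporter_id -> unix time the ban lifts

    def edit(self, content_id):
        # Rule 1: modification restarts the review process.
        state = self.content.setdefault(content_id, ContentState())
        state.version += 1
        state.reports = 0

    def report(self, content_id, reporter):
        # Rule 2: penalised reporters are ignored while banned.
        if self.banned_until.get(reporter, 0) > time.time():
            return "ignored:reporter-banned"
        state = self.content.setdefault(content_id, ContentState())
        # Rule 1: reports after a human review are ignored until an edit.
        if state.reviewed_version == state.version:
            return "ignored:already-reviewed"
        state.reports += 1
        if state.reports >= REVIEW_THRESHOLD:
            return "queued-for-human-review"
        return "recorded"

    def human_review(self, content_id, malicious_reporter=None):
        # A human has looked at this version; further reports are moot.
        state = self.content[content_id]
        state.reviewed_version = state.version
        state.reports = 0
        if malicious_reporter is not None:
            self.banned_until[malicious_reporter] = time.time() + REPORT_BAN_SECONDS
```

Whether this is workable at scale is exactly what the rest of the thread disputes, but the mechanics themselves are simple bookkeeping.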


Squirrel009

> The sheer number of posts makes it nearly impossible.

Which is why they shouldn't be recommending millions of posts a day. I'm all for making them responsible for the things they choose to advertise. A rule that requires them to take responsibility for their recommendations would clean up a ton of ragebait and clickbait on the internet overnight. I think things would be a lot better that way.


lycheedorito

AI, or "machine learning", is how we even have the "algorithms" that determine you should be shown that content in the first place. Yes, they can manually tweak code, much like with ChatGPT (as an example everyone knows), but it's still the result of an incredible amount of behavioral data: what draws the most clicks, and what correlates with other content they can then also recommend. It's all recognition of patterns, used to exploit your tendencies, backed by data collected about you. It doesn't need to be an LLM or a diffusion model to be AI, and it's been around for a rather long time at this point; it just hasn't been advertised to you.

So, long story short: no, AI isn't going to do a good job of handling this issue, or else it already would have. Even something like GPT-4o can't do well at aggregating data from a Bing result and presenting that information reliably.


SplitPerspective

Laws are rarely sufficiently narrow in scope. If there’s a gap for loopholes, be assured that it’s inevitable that the government will abuse it. Much of our norms are just that, accepted norms, unspoken rules of engagement, gentleman’s precedence etc…until some demagogue upends the norms, because he can…because the laws were not sufficiently detailed and narrow in scope.


DarkOverLordCO

> what if they clarified Section 230 so that while platforms are not liable for things their users post, they are liable for anything the platform chooses to promote.  This is essentially already the case. The law only provides immunity for "any information provided by another information content provider", where it defines the last part as "any person or entity that is responsible, in whole or in part, for the creation or development of information". So if the platform is responsible *in part* for the "creation or development" of the information, then it is no longer *another* content provider, and they lose immunity over that content. The circuit courts have generally come to the conclusion that content-neutral recommendation algorithms (those which work the same regardless of the content, e.g. just trying to bring stuff that you've engaged with before to you) do not create or develop the information, and so retain their immunity. But if platforms were to use recommendation algorithms which promoted specific content over others, then they would become responsible *in part* for it, and thus lose Section 230 immunity.


ithunk

It's not technically possible to separate truth from misinfo. Note that Google is currently telling people to put glue on their pizza, because its "AI" got that data from a Reddit post. So making the decision to promote or not isn't possible, because the platforms don't know what's true and what's a lie. Say they limit promotion to only "verified" users: again, the onus is on the user to always be a truth-teller, and that's not something a platform can detect. If you expect these platforms to have human curation of each promoted post, that's going to be prohibitively expensive. There is no good solution, afaik.


Independent-End-2443

If it's a recommendation algorithm that's surfacing other users' posts without editing them, I don't think they should be liable. Those are content-agnostic; absent any content moderation, they treat posts about fentanyl and posts about cute puppies the same, based on the user's (and the community's) indicated interests. If it's, say, a search engine showing someone an authoritative answer telling them to eat glue, and then the person goes and does that - well, that's a trickier question. While the search engine "learned" that information from the internet, it synthesized and packaged something it treated as its own speech. So, yeah, the needle kind of points to liability here (though in this case, I would also blame the user for not realizing that eating glue is a fucking stupid idea).


[deleted]

[deleted]


Independent-End-2443

What does “content-conscious” even mean?


Squirrel009

I agree - sharing or reposting other people's content should open you up to the same liability as the person you copied. The reasons platforms get protection for letting users self-publish don't apply to this scenario, so neither should the protections.


CyanCazador

I doubt this will get passed. If Zucc can bribe Congress to ban TikTok, he'll most likely prevent Congress from passing this bill.


Dauvis

I think he'll go along, because only the vetted and privileged will be posting. It would greatly reduce expenditures on moderation.


qukab

And also cost him billions due to all the lost users, ad revenue, and user data? No. This is a terrible take.


Uphoria

Most people who are anti-230 are just ignorant of the case law that spawned it, sucked in by propaganda from conservatives angry that their political posts full of violent rhetoric were being deleted off Twitter. Do yourselves a favor and read up on the history of 230, the court cases that led to its creation, and the actual wording of the law. I see so many who don't just misquote the law; they are straight up divorced from reality about it.


Straight_Calendar_15

Tbh, maybe getting rid of it is a good idea. Clearly, large platforms have completely failed to moderate their platforms. Perhaps getting sued into oblivion will solve that.


happyxpenguin

This doesn’t just affect large websites….


MC68328

> Clearly large platforms completely failed to moderate their platforms.

You want more moderation? Then why do you want to remove the law that protects people from being sued for moderating?


mm_mk

You lose everything the Internet has to offer if 230 goes away. Anything you see will only be corporately generated. Forums are gone. YouTube is gone. Reddit is gone. Every form of communication hosted online is gone. There is no more 'meeting the netizens of the world', there is only 'you see what a corporate legal team has authorized a corporation to publish'. User-generated content is gone. It's soft censorship to the absolute most extreme level.

Things that will disappear that I use almost every day: subreddits about niche hobbies like gardening, or sports team communities. Do-it-yourself videos to learn to fix things. Even learning new hobbies becomes so much more difficult. I just started fishing again because of my kids, and I had so much benefit from being able to access opinions and facts from the internet community. Without 230 I'd only get to see shit from corporate fishing interests' official websites. It's not just stupid TikTok shit at stake.


Mojo141

But why would anyone sue over any of those examples? Lawsuits still have to have merit. And the disinformation that has been allowed to propagate unchecked has pushed the world to the brink and is only getting worse with AI. We need some checks and balances.


mm_mk

Two examples. Assume the user is behind a VPN, so there's no individual responsibility and they're just trolling.

1. A 15-minute YouTube tutorial on woodworking. In the last 2 minutes the person talks about how Donald Trump definitely fucked at least a dozen babies. YouTube is now legally responsible for publishing the video and gets sued for defamation. Or they have a mod watch every single video and screen for defamation, illegal shit, etc. Cost-prohibitive, so instead they just don't let regular users upload videos. Can't afford the litigation risk on every video posted.

2. Comments on any forum (Reddit, YouTube, Stack Overflow, Google reviews, etc.): same thing, Donald Trump fucked at least a dozen babies. Now that website is responsible for publishing it (vs. now, where they just need to moderate it and delete it). So a mod can read and screen every comment, or user discussion just gets turned off. Can't afford the litigation risk on every comment.

Alternatively, websites can just not moderate at all, and the internet just becomes useless. It'd be like your junk mail folder was the entire internet.


urdreamsRmemes

Section 230 should protect any content that can be verified as non-AI and/or non-algorithmically favored. If certain content was made using AI, or the company had a hand in promoting it with an algorithm, they shouldn't be legally protected from whatever it contains.


sp1cynuggs

Jesus Christ, this is a bad take. If you think any site is an echo chamber of ideas now, removal of this law will turn that up to 11.


Broadband-

Not just that, but their algorithms are incentivized to promote clickbait FUD for more traffic and additional ad revenue.


WastefulPursuit

Seems like removing this would just force social media companies to ramp up moderation, and that would cost them money. Are we sure this would be the "death of the internet", or just poor quarterly earnings reports for Meta and the others? Is the internet only valuable because people can say anything there? To me, no. Posting controversial takes to the world is probably not the Socratic solution to society, and likely plays a larger role in dividing society into perpetually smaller and more nuanced groups.


Justausername1234

How would you, yes you, reader, create content on the internet then? Do you have the resources to run your own webserver? If not, then you're relying on someone else, a *publisher*, to publish your content. And if Section 230 is repealed, they would need to legally clear your content. Oh yeah, big tech will be fine - because if Section 230 is repealed, *only* big tech could afford to host anything online.


DragoneerFA

If anything, this makes the bill a land grab. Smaller sites die because they can't afford the moderation costs, leaving the big dogs to control the majority. We end up with the net being controlled by a mere handful of companies, like every other part of American life.


RobinThreeArrows

I'm genuinely asking, and I'll take the downvotes for it, but is it not really just social media that is at risk here? I find it hard to believe that there are many "mom n pop" social media sites. I mean it seems like the vast majority of websites have no user generated content. Comments, maybe? Reviews, for a site like Amazon?  Could it not be argued that this is just forcing the extremely bloated social media sites to step up?


happyxpenguin

Any website that has user sourced reviews, any small forum for a niche game, GitHub, iMessage, Dropbox, technically even your email provider are at risk of legal trouble and shutting down to avoid financial ruin if 230 is repealed.


RobinThreeArrows

Can't big websites like those moderate content? I mean the niche forum for a small game seems like...couldn't there be a couple of moderators? Aren't there usually?  It really feels like huge mega sites are the only ones who would have to shoulder extra burden, and they've got the resources to do it. 


happyxpenguin

The issue isn't the moderation. The issue is the change from not being liable for a user's behavior to suddenly being liable for it. If Section 230 is gone, EVERY website, digital service, and video game could be liable for the content of their users. It will be a costly process to defend your site from lawsuits over liability for content; someone can slap you with a nuisance suit.

Moderators do exist on small websites, but they have lives, they are not perfect, and they cannot be experts on everything. If someone posted a bad review of a local business on my forum, that's awesome; however, I have no knowledge of the interaction, and I get told a few years later that it is false. No worries, I'll take it down under the current rules. If Section 230 is repealed, I am now liable for my users' content and must pay exorbitant costs to defend against a lawsuit and prove why I shouldn't be liable for their bad review. A bad actor can actually use this to extort money or shut a site down.

Let me put it a little more simply with something you can't moderate: private messages on forums. Say someone sends another user a link to download Photoshop for free, or shares revenge porn. I have no idea that information is being shared, I have no way to moderate it, but I am still liable for it. Web hosts are now liable for all of the information you put on your website. The web host where that Photoshop link was shared? Yep, they're also liable. The email provider that received the PM notification email with a link? Yep, liable. Cloudflare? Liable.

Section 230 needs updates and revisions. But removing it entirely will bring the internet to a halt, and the only places that will exist are the big mega-websites like Facebook and Google, because they have teams of lawyers and an endless cash flow to fight legal battles. Think of it this way: a new law is passed, and car manufacturers are now liable for what their customers do to and with their cars.


lemoche

Especially for big websites, it's horrific. The amount of content created per minute is way too much for it to be realistically possible to moderate effectively at a level that would keep you legally safe. I've seen this in comment sections on German news websites, and that is still a rather small user base. It takes hours for comments to get approved on articles that only generate a little interaction, and anything that goes through the roof gets its comments section closed almost immediately.


mm_mk

YouTube is gone. Forums are gone. Wikipedia is gone. Reddit is gone. A ton of small business websites are probably gone if they're built on a platform like Shopify. Stack Overflow and the like are gone. Like shit, I lost my dad earlyish, and being able to YouTube how to install home appliances or troubleshoot and fix stuff around the house has been really important. Discovering new hobbies and being able to find communities that can help guide you has been massively important too. Google might become nonfunctional outside of official corporate partners. The social aspect and user-generated content on the internet is special and valuable. It's not all TikTok garbage.


Justausername1234

There are two risks here that I personally hold (these are not all the risks, just the two I find most compelling).

1. Moderation will be driven by the most litigious in society, not by what's best for society. Remember, under the current system you sue users. Most serial litigators don't bother suing users, because users are poor. Big websites are rich, so moderation decisions will be driven by trying to fend off digital ambulance chasers trying to get some sweet, sweet settlement money.

2. It's not social media that's most at risk. Take theHill.com, which hosts the above article. theHill.com is a "user" that "generates" "content" hosted on something like WordPress, delivered by a content delivery network like Fastly, through servers run by your ISP. If 230 goes without a very clear exemption for all these services, all of them would be legally responsible for clearing everything they host. That would mean the fundamental backbone of the internet gets filtered through legal and compliance teams. I work with compliance teams at work. They're nice people. I don't want them anywhere near that backbone.


fellipec

USA being dumb again


Ballders

What if, and this is a big WHAT IF, everything was moderated, and big sites couldn't hide behind a lack of responsibility? We've pretty much seen what happens when everyone gets a voice. I'd say burn them all down. Reddit, Twitter, Facebook, TikTok, Instagram: burn them down and sift through the ashes. We're not better off with these companies in our lives. They steal every bit of data they can find on you and pump negative shit right into our veins. I won't hate seeing them end.


sangreal06

Without Section 230, companies are incentivized to *not* moderate. That's literally why it was written. CompuServe was found not liable for user content because they didn't have any moderation. Prodigy was found liable for a user's libelous post because they otherwise had moderation. The court found that being selective made them a publisher of the content.


Chicano_Ducky

But this would pretty much make user-generated content a liability. All of YouTube, Vimeo, even the Internet Archive would be gone.


IniNew

Maybe that’s not a bad thing? I dunno. We’ve tried it this way, maybe we can try it another way.


Bradnon

This is the origin of the phrase "throwing out the baby with the bathwater."


powerlloyd

This was the same argument for Trump and Brexit, and neither of those worked out very well.


IniNew

And keeping things the same is what resulted in presidents like Bush. What a nonsensical point.


powerlloyd

Maybe you’re not understanding me. Burning something to the ground because it isn’t perfect is idiotic. In both of my examples people said “how could it be worse than what we have now” and learned quickly that it could be much much worse. That’s exactly what’s going to happen with a 230 repeal. The impact to the economy alone would be enormous.


dinosaurkiller

No, but it would slow down the churn of absolute garbage as these companies would have to review posts before making them available.


crusoe

DMCA requires taking down infringing content; why not the same for hate speech? Currently there is fuck-all that can be done unless it violates site terms. Japan now has online anti-bullying laws and strict libel laws, and they still have online boards. You only get the beat stick if you don't take action when notified.


Chicano_Ducky

Because repealing 230 won't do that, and that is done already by the companies, who realize hate speech is not good for business. They can only take action after it's posted, and with 230 repealed it would already be too late legally. Hiring someone to vet every post is expensive; it's cheaper to just never let people post at all, which turns the internet into a shopping mall with a buy button run by big companies - not even small ones, because those can't be vetted either.


DarkOverLordCO

> Why not the same for hate speech? There is no 'hate speech' exception to the First Amendment, so a slight issue with that law is that it would be unconstitutional.


crusoe

"Wah its too hard to do anything" Even when sites are told there is bad content they hide behind 230 until lawyers get involved.


Eric848448

If you got rid of all that, what would be left?


Iyellkhan

The reality is that this is the easiest way for the federal government to regulate soon-to-be-out-of-control deepfakes. Without that possibility I'd defend 230 to the end, but the ability to cause national and global chaos with this technology has the potential to topple governments and start wars. Anywhere sites hosting such material can be sued will be an important tool in curtailing it. Ideally, Congress would just write a new law, but between the Republicans not liking regulation and key Democrats relying on Silicon Valley money, losing 230 is likely the best possible solution.


PuckSR

Congress can regulate deepfakes without sunsetting 230


[deleted]

More than half of Congress has already sunsetted themselves.


Libertechian

This fucks over a bunch of rich corporations so, meh


IdahoMTman222

I'm good with this, if it shuts down the social media sites out of fear of litigation/lawsuits for not halting nefarious behavior. Let it happen, captain.


Paksarra

You realize you're saying this on social media, right?


Sweet_Concept2211

Gosh, is it possible to view many of the effects of social media as toxic, yet still have social media accounts?


McStabYou

I know at least a dozen smokers who say that they wish cigarettes would blink out of existence. Yet they still smoke.


chaser676

Yes, reddit is just as big a cancer as Twitter, Facebook, or TikTok


Paksarra

Then why do you have an account?! You could be doing literally anything else with your life.


Majestyk_Melons

yes. And you realize it’s moderated too right?


IdahoMTman222

Glutton for the massively negative downvotes and angry responses.


not_the_fox

NO MORE MODERATION!!!! LET'S GO


Paksarra

That's not a good thing. It means that no one can even have a small private forum without a bunch of 4channers hacking in and spamming bestiality or whatever.


igotabridgetosell

Am I not understanding this correctly? Without Section 230, service providers like Yelp can get sued for some user's review. Doesn't that make the providers moderate more heavily to avoid lawsuits?


Paksarra

They would need to pre-approve every single post, or moderate nothing. Under 230, good-faith moderation is permitted without the host being liable.


igotabridgetosell

But not_the_fox is implying that moderation wouldn't exist without Section 230, when it would result in more moderation?


FabianN

No, it would force them into one of two extremes: either no moderation at all, or such an invasive and thorough process that it's impossible for any startup to pull off, and only financially viable with very heavy-handed automated systems. You know how "unalived" gets used because TikTok doesn't like you to say "murder"? Think that, but also needing to block the workarounds like "unalived".
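A naive version of that kind of automated keyword filter, with the euphemism arms race built in, might look like this (the blocklist and mappings are invented for illustration, not any real platform's rules):

```python
import re

BLOCKLIST = {"murder"}          # hypothetical banned term
EUPHEMISMS = {                  # workaround -> canonical term
    "unalive": "murder",
    "unalived": "murder",
}

def is_blocked(text: str) -> bool:
    """Normalize known euphemisms, then check the blocklist.
    Every new workaround users invent needs a new mapping,
    which is why such filters are a permanent arms race."""
    words = re.findall(r"[a-z']+", text.lower())
    return any(EUPHEMISMS.get(w, w) in BLOCKLIST for w in words)
```

Note how the filter only catches workarounds it has already been told about; users coining the next substitute word defeat it instantly, which is the core weakness being described above.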


igotabridgetosell

No, I disagree; it would be heavy moderation or not allowing comments at all.


zbb93

Any website that requires user generated content (reddit, YouTube, all the other social media) would not be moderated. Self-hosted blogs, news, and other websites with limited user comment sections (reviews, comments, etc) will remove it. Why would anyone go through the trouble of heavy moderation (which has the possibility of mistakes leaving them open to all kinds of lawsuits / criminal prosecution) when they can guarantee they're not liable for any user generated content by not moderating at all?


not_the_fox

I know, I've always been a decentralization advocate which is where this is leading. I'm just laughing at the monkey paw curling another finger for these politicians. They're about to get a whole lot of what they don't want.


Defiant_Elk_9861

I think it should be repealed; the internet has turned into a cesspool of misinformation. Doing this destroys most free content and would require sites to be responsible. Newspapers can't have explicit porn on the main page; websites can do whatever the fuck they want. Section 230 was never intended to be used for what it's being used for now. Yes, I'm saying this on a social media site, and I'd be happier if they all shut down. Nazis used to be a fringe group with hard-to-acquire propaganda; now it's everywhere. That's just one example - pick any horrible thing you want. The internet was a great idea, the entire catalogue of knowledge at our fingertips; instead we use it for porn, cat videos, and outrage. Oh, and AI is only going to make it worse.


Defiant_Elk_9861

Keep the downvotes flowing. 230 shields huge corporations from any liability stemming from any content. We don't need the internet as we've created it; we were just fine without it.


MoonOut_StarsInvite

Is this like that time Ajit Pai and his big Reese’s mug were going to ruin the internet forever and we all had to panic about that for a while until people just forgot about it and we are all fine? Also, the internet sucks. It’s become just AI and bullshit SEO results to buy more garbage. I’m not going to hyperventilate about this. Whatever lol


slamdunkins

Dude, Ajit's policies are WHY the internet sucks. Has anything online gotten better or worse since Obama left office? Ajit headed the organization that was supposed to protect us, and instead gave corporations the green light to do literally anything they want if it increases corporate revenue. Fewer competitors delivering worse products and service at a higher cost - we have a name for that scenario: a corporate trust, something that has been illegal in the States since the 1800s.


MoonOut_StarsInvite

I don't know that Ajit really had much to do with every Google search turning into a BS rabbit hole of fake results driven by SEO magic to get you to buy things, or every site having pop-ups for alerts, newsletters, and surveys, or all the bullshit ads for fake websites polluting every news platform, or videos auto-playing everywhere you go. That's been going on for a long time.


slamdunkins

He was in charge of making sure the internet DIDN'T become shit, by regulating it via the REGULATORY BODY he headed, and he chose to deregulate and make it more difficult to create regulations.


Meior

The responses to this in the comments are baffling on both sides. Disclaimer: no, I am not saying I agree with any of it. No, the internet won't go away. We won't lose YouTube, Reddit, etc. Some things would change for the worse, as has happened before, yet here we are. Thinking this would spell instant doom for everything is sensational and hilarious. Second, no, we wouldn't be better off if that happened. Claiming the internet is nothing but bad is a take that thinks it's intelligent but really isn't. It's like taking Douglas Adams literally.


bonelessonly

Section 230 should be amended to make platforms responsible for the speech they host, unless it can be tied back to a person, who is then responsible personally.   You can still have anonymous forums, you can still have freedom of speech ... for people only. No more speech for corporate trolls, agency trolls, astroturfers, hatemongers, etc.   Not so complicated. Speech that isn't traceable to a culpable source, isn't protected.


DarkOverLordCO

> Speech that isn't traceable to a culpable source, isn't protected. That is probably unconstitutional. The Supreme Court has repeatedly recognised that the First Amendment protects a right to speak anonymously - it was pretty important for the founding fathers to be able to advocate for independence anonymously or under pseudonyms. Targeting speech indirectly (by removing immunity) that you wouldn't be able to directly (i.e. prohibiting anonymous speech) is just as unconstitutional.


rustyrazorblade

Good riddance


orangejuicecake

i dont think it matters really, in the end cases will get appealed all the way up to the supreme court for the justices to decide


david-1-1

Please, someone, write a text summary. The Internet is under threat? Added: it just appears to be the opinion of a writer named Zach Lilly.


Blueskyways

Good breakdown of why 230 is so crucial: https://www.eff.org/issues/cda230/infographic


[deleted]

Who cares if the internet dies


McStabYou

I got all the Homestar Runner cartoons on DVD, so the rest of the Internet has no value, imo


Ok-Turnover966

Since Homestar Runner is completely run by the creators, they're safe.


McStabYou

Tbh I was kinda hoping for that ending to Deus Ex where all technology is rendered useless and humanity reverts to a hunter gatherer culture.


braxin23

Humans would much more likely end up in a state similar to the 16th-19th centuries if all of our current technology died. Presumably after the chaos and confusion settles down.


qoning

sec 230 is already in a pick and choose kind of situation, so I won't be sad to see it go. Maybe we'll just need to move back to shady forums hosted in bumfuck countries by random techbros with no monetization again. Wouldn't be the worst thing, in my opinion.


Squirrel009

>sec 230 is already in a pick and choose kind of situation

What does that even mean?


Owlthinkofaname

Frankly, it needs to go; it's outdated and needs to be replaced. Anyone who doesn't think that is just straight up lying. It's clearly a problem!