Message Boards » Section 230
afripino
All American
10617 Posts

Why is this such a big conservative gripe? Just trying to understand it.

1/12/2021 9:27:03 PM

UJustWait84
All American
25324 Posts

Republicans LOVE unfettered capitalism when it benefits THEMSELVES, but once the capitalism becomes fettered (or worse, starts benefiting others) they are SO NOT DOWN

1/12/2021 9:29:00 PM

aaronburro
Sup, B
52093 Posts

It ain't just a conservative gripe. Dems hate it too

1/23/2021 11:53:01 AM

A Tanzarian
drip drip boom
10458 Posts

Tell me why I hate Section 230.

1/23/2021 11:56:28 AM

aaronburro
Sup, B
52093 Posts

Progressives and Dems hate it for the same reason Trump hates it: because they can't sue tech platforms over speech they don't like. The only difference between the two sides is which speech they dislike.

It cannot seriously be argued that there is not bipartisan hatred of Section 230.

https://www.bloomberg.com/news/articles/2020-08-11/section-230-is-hated-by-both-democrats-and-republicans-for-different-reasons
https://www.cnet.com/google-amp/news/democrats-and-republicans-agree-that-section-230-is-flawed/
https://reason.com/2020/12/18/the-bipartisan-push-to-gut-section-230-will-suppress-online-communication/?amp

[Edited on January 23, 2021 at 12:24 PM. Reason : ]

1/23/2021 12:02:43 PM

A Tanzarian
drip drip boom
10458 Posts

There's a bit more daylight between those positions than your post implies. But you've always been a both-sides kinda guy, so yeah, no surprise there.

1/24/2021 2:37:10 PM

aaronburro
Sup, B
52093 Posts

For fuck's sake, man. Joe Fucking Biden said it should be revoked. Trump said it should be revoked. How much "daylight" do you see between those two positions?

https://www.npr.org/2020/05/30/865813960/as-trump-targets-twitters-legal-shield-experts-have-a-warning

1/24/2021 8:44:29 PM

daaave
All American
1108 Posts

you mean the right and the far right want it gone. i don't know any progressives who do.

1/24/2021 9:46:39 PM

A Tanzarian
drip drip boom
10458 Posts

Quote :
"hate it for the same reason"


This is not true.

1/24/2021 9:56:04 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

Certainly I hate it. Websites are publishers and should be treated as such. The state of the country and world would be immeasurably better if we'd been doing that for the last 20 years.

It's a little silly to say everyone wants to sue tech firms for speech "they don't like." The First Amendment still exists. But when Alex Jones lunatics post videos telling people to harass shooting victims, posting their home addresses, etc., YouTube needs to be accountable for enabling that.

1/25/2021 10:18:04 AM

wdprice3
BinaryBuffonary
45883 Posts

^

1/25/2021 12:24:41 PM

darkone
(\/) (;,,,;) (\/)
11460 Posts

Why should YouTube bear the legal burden over Jones personally? Or, if Jones is liable, why should YouTube bear the burden at all?

1/25/2021 1:14:31 PM

qntmfred
retired
40020 Posts

RIP TWW

1/25/2021 1:27:31 PM

Cabbage
All American
1555 Posts

Is it even feasible for a website like Youtube to monitor its users at the level this would require? I really don't know; I'm fairly ignorant with respect to this, which is why I'm asking.

I'm thinking Youtube has millions of users, so I don't think it's feasible for them to have humans monitoring all of these interactions. Sure, you can have some level of algorithmic monitoring, looking for hate speech and threats, but I feel sure things can be phrased in ways that slide past most any algorithm like that; in that case, I don't think Youtube should be held responsible, with or without 230.

1/25/2021 1:38:51 PM

afripino
All American
10617 Posts

so should TWW be responsible for the shitty poasts and child pr0n (shit's gross, son) users may upload?

[Edited on January 25, 2021 at 1:49 PM. Reason : not saying it exists, just a more relatable scenario]

1/25/2021 1:49:00 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

With other media, one who publishes material that is, say, defamatory, has the same liability for the material as the individual who created it, on the theory that a publisher has the means and responsibility to check its veracity. Nobody bats an eye at this, and for that matter, very few people successfully sue these publishers for anything. Again, holding a publisher liable doesn't mean it's open season on everything that someone might disagree with. The standards for successfully suing someone over speech are extremely high.

So why should YouTube get different treatment from a newspaper or magazine? If the News & Observer is held responsible for what it publishes, why isn't Facebook? Cabbage points to scale. OK, how much volume does a publisher have to put out before they're immune from liability? And what kind of fucked up logic is it that the bigger and more powerful you get, the fewer rules we're going to put on you?

I get that moderating content would be enormously difficult for YouTube, Facebook, etc. So? They made this enormous, unaccountable thing. These guys are Dr. Frankenstein - they created a monster; the monster started causing damage; they hemmed and hawed about reining in the monster; the monster caused a lot more damage; now they need to set off on the quest to destroy it. That might mean massive changes to how these sites operate. Would that be the worst thing in the world?

Oh, and one more thing before I move on - most states impose penalties on people who bring losing defamation cases against publishers. Anti-SLAPP laws, in a rare case of legal terms being fun. There won't be millions of people frivolously suing - at least, not for long, not after they all get their cases thrown out and have to pay $texas to Mark Zuckerberg.

Quote :
"so should TWW be responsible for the shitty poasts"


Shitty posts aren't legally actionable. Whatever existing material on here might be actionable would be from before the rule change and would presumably be grandfathered in. Anything that comes after...well, as far as TWW goes, let's not kid ourselves. This place is getting, what, a couple hundred posts a day? qntmfred could designate one or two lieutenants to moderate pro bono and easily police everything that goes on here.

1/25/2021 3:00:13 PM

Cabbage
All American
1555 Posts

Quote :
" OK, how much volume does a publisher have to put out before they're immune from liability? "


"Immune" is not the word I had in mind, it's more like simple acknowledgement of the impossibility of vetting the individual posts of millions of users. And yeah, I think the idea that these sites being too big to monitor implies they are too big to exist is worthy of consideration; I'm not trying to argue in defense of social media, merely trying to be realistic about it.


Quote :
"And what kind of fucked up logic is it that the bigger and more powerful you get, the fewer rules we're going to put on you?"


I don't think that's what I'm saying. There's a completely different dynamic between a book/newspaper publisher and social media, so there has to be a different set of rules. It's my understanding that book publishers (outside of vanity publishers and the like) vet the material before it's even accepted for publication. I can't simply submit the manuscript for my biography of Kevin Bacon and expect them to print and sell it; it first has to be vetted by the publisher (is it accurate, will it sell, ...).

On social media, on the other hand, I type what I want, I click submit, and it's there (with the exception of some basic flag words/phrases that might trigger some autocensor); I'm the only human that sees the content before it's posted to the audience.

Now, if there's a social media site with a limited number of users, sure, you could insert a step where all content is vetted by a moderator before being made public, but that's simply not possible at the scale of a Facebook or Youtube (maybe it's "possible," but it would slow interaction down to a prohibitive degree).

If we're going to take the step where we allow such social media giants to exist, and we are going to hold them responsible for content posted, the only way I can see it working (well, maybe not the only way, but the most immediately obvious way to me) would be:

Have some form of an official, open-source algorithm, agreeable to everyone, to filter objectionable content from social media. Require all social media giants to use this filter. Inevitably, some objectionable content will slip past; the platform should remove this content when it's pointed out to them. Failure to do any of this can result in action against the platform, but so long as the platform takes these steps, it will be considered a good-faith effort to be in compliance.
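To make that concrete, here's a toy sketch of the filter-plus-takedown loop I'm imagining. The phrase list and names are all made up for illustration; this isn't any real platform's system:

```python
# Toy sketch of the proposed scheme: a shared filter in front of every
# post, plus removal-on-report for anything that gets through.
# BANNED_PHRASES stands in for the hypothetical shared filter list.

BANNED_PHRASES = {"example threat", "example slur"}

def passes_filter(text):
    """The mandated open filter: reject posts containing a banned phrase."""
    lowered = text.lower()
    return not any(phrase in lowered for phrase in BANNED_PHRASES)

class Platform:
    def __init__(self):
        self.published = {}  # post_id -> text
        self.next_id = 0

    def submit(self, text):
        """Step 1: everything passes through the filter before going live."""
        if not passes_filter(text):
            return None  # rejected up front
        post_id = self.next_id
        self.next_id += 1
        self.published[post_id] = text
        return post_id

    def report(self, post_id):
        """Step 2: content that slipped past the filter is pulled on report.
        Doing both steps is what would count as the good-faith effort."""
        return self.published.pop(post_id, None) is not None
```

The point of the `report` step is the safe harbor: run the shared filter up front, honor takedowns promptly, and that combination counts as compliance even when something slips through.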

I'm sure this is far from perfect, but I'm not at all comfortable with holding Twitter or whatever accountable for, for example, some isolated, random post that led to some unfortunate event.

1/25/2021 3:51:35 PM

Cabbage
All American
1555 Posts

And now that I've thought about it a bit more, one problem with an open-source filtering algorithm is that it would actually make it easier for people to craft posts that bypass the filter, so I'm really not sure that's the best approach after all. I'm just spitballing some ideas here.

1/25/2021 4:06:48 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

So you touch on the only real distinction between traditional publishers (books, newspapers) and online publishers: whether material is reviewed before it is published rather than after. And this is the area in which I'd agree there do need to be separate rules to deal with the practical aspect, but which ultimately arrive at the same end: responsibility for what gets published. A grace period is in order; I don't expect YouTube to take down defamatory material within minutes or even hours. So whereas a newspaper publisher can be expected to do their due diligence before running the presses, we might allow social media a day after the fact before they're liable.

You point out that holding these companies accountable will lead to massive, even catastrophic changes in how they operate. I get that. I'm happy about that. Putting social media companies in a position where they have to get really serious about fake news, bots, troll farms, etc. is a good thing. Surely we can't spring it on them overnight - give them time to reorient their businesses. But if the future of social media is slower, less wild, more moderated, and finally curated with an eye towards truth and civic responsibility rather than algorithmically click-generated profit, I say: so much the better.

(Note to any TWWers who fall into the more libertarian/free market camp: First of all, hi, I didn't think any of you were left. Second of all, I'm not anti-profit or generally in favor of forcing private enterprise to behave in accordance with my understanding of "civic responsibility." In this case, I refer merely to their having to follow the same rules as the rest of us.)

1/25/2021 5:33:43 PM

A Tanzarian
drip drip boom
10458 Posts

Some 64% of users who join an extremist Facebook group do so because Facebook recommends it to them. Facebook actively promotes defamatory and extremist content to drive engagement and make money. Social media companies are not passive participants here and should not be treated as such. The speed and range of social media is far beyond anything possible with physical media. If anything, requirements for online publishing should be more restrictive than those for traditional publishing.

Quote :
"I'm the only human that sees the content before it's posted to the audience."


The problem isn't that your incendiary Cabbage-Anon content exists, it's the algorithmic promotion to excite and engage others. You might be the only human who sees the content before it's posted, but you are not the only human involved in making that content available to others. The algorithms are designed and operated by humans, and humans are aware of the consequences of the algorithms' actions. 'The computer did it' shouldn't be a valid excuse for social media's destructiveness.

1/25/2021 6:54:55 PM

Cabbage
All American
1555 Posts

^^Yeah, I agree with pretty much all of that. In case my position wasn't clear, I'm really not trying to advocate a pro or con social media position here, merely trying to point out some issues that I think make it complicated.

I remember seeing on Bill Maher's show a few years ago they had, I think, a Google employee who had a job title something like "Social Engineer". Another guest mentioned how Orwellian that sounded, and he defended it by basically saying that Google gets such widespread use that they need to be aware that their search results and such (how they are ordered, what results come at the top of the list, for example) contribute to shaping opinions and points of view, and that it is incumbent upon them to put effort into at least pushing that into a socially positive direction.

I'm not presenting that example to promote a pro-Google position; I don't know enough about their sorting algorithms (and their results) to have a pro or con opinion on how good a job Google is doing on that front. I merely bring it up to point out I'm aware of the issues you mention; in fact, I believe I mostly agree with your position. I just think we should be careful not to let the pendulum swing in the opposite direction, introducing draconian measures that hold a social media platform responsible for the conduct of any of its members whenever some post vaguely references that conduct. There should be some clear metric that says THIS is what you, as a platform, need to maintain for the good of society. It won't be perfect, and when something slips through despite that compliance, we shouldn't hold the platform accountable.

1/25/2021 8:21:03 PM

wdprice3
BinaryBuffonary
45883 Posts

All good points. But can we start with the simple? These platforms, since their inception, have had or developed policies to combat some of these issues, but time and time again, they've defended not removing content / users except in the most extreme cases, up until recently. Point being, in addition to ^^, these platform operators know about this content, talk about this content, and outright refuse[d] to act. They are complicit and only recently have seen at least some light. They enable. They invite. They willingly give platforms to known bad actors.

1/26/2021 10:04:11 AM

darkone
(\/) (;,,,;) (\/)
11460 Posts

Anyone want to talk about the consequences of removing Section 230 protection?

The legal standard before Section 230 - and the standard we would return to if repealed - is that a platform could only avoid liability for third-party content if they imposed no moderation whatsoever. In this day and age that means everywhere would turn into a cesspool of porn bots and Nazis because everyone else would be driven off.

If websites did impose moderation to remove what undesirable content they could, they would have liability for what's left. If repealed, that would mean that there would be no new or small social media sites or forums or sites with comments, since even the costs of defending frivolous litigation would break them. You can't effectively and comprehensively moderate at scale. If you try to automate moderation, you're going to miss bad content and have a lot of false positives.

Any remotely controversial content would never see the light of day since banning it is way cheaper than potentially being sued over it. So there would be no negative product reviews. Blogging platforms (remember LiveJournal?) would likely go away completely. Fan fiction sites gone. YouTube would just be corporate content. Want to complain about local government? You're going to have to mail a newsletter. There are platforms for the disadvantaged that only exist because of the low barrier of entry provided by sites otherwise protected by Section 230. So many useful sites would be stripped of their utility if that protection is stripped. This place literally wouldn't exist because Ken isn't going to risk losing this house for when The Soap Box devolves to name calling or someone posts pics of Anita Flick and then a lawyer gets involved.

Section 230 was created for a reason and it wasn't to help mega corporations or enable assholes to harass people. Its purpose is to allow online communities to be able to moderate what they can without getting sued for the content they missed. The goal is to foster good communities. The Internet is bigger than Facebook and YouTube and people need to carefully consider the consequences of what they're asking for.

1/26/2021 11:17:10 AM

HaLo
All American
13255 Posts

That’s one way it could be done, I suppose. There are tons of smart engineers out there who would find ways to moderate content without it being so onerous. Shoot, there are already message board systems that manually moderate content posted for your first X number of posts, to determine that you’re not a piece of shit before allowing general access. Also, verified/trusted users are a thing that can be done.
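A minimal sketch of the gate I mean, with the threshold and structure invented purely for illustration:

```python
# Rough sketch of "moderate the first X posts": accounts below a trust
# threshold go through a mod queue; established users post directly.

TRUST_THRESHOLD = 5  # hypothetical number of approved posts required

class Board:
    def __init__(self):
        self.approved_counts = {}  # user -> approved post count
        self.queue = []            # (user, text) awaiting a moderator
        self.live = []             # visible posts

    def post(self, user, text):
        if self.approved_counts.get(user, 0) >= TRUST_THRESHOLD:
            self.live.append((user, text))   # trusted: publish immediately
        else:
            self.queue.append((user, text))  # new: hold for human review

    def approve_next(self):
        """A moderator signs off on the oldest queued post."""
        user, text = self.queue.pop(0)
        self.live.append((user, text))
        self.approved_counts[user] = self.approved_counts.get(user, 0) + 1
```

Once a user clears the threshold their posts skip the queue entirely, so the moderation load tracks new accounts, not total post volume.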

I firmly believe that there is a way to rein in the Wild West culture while still respecting the “town square” culture.

This isn’t a technical problem; it’s a scale problem.

[Edited on January 26, 2021 at 2:31 PM. Reason : Z]

1/26/2021 2:30:07 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

Quote :
"The legal standard before Section 230 - and the standard we would return to if repealed - is that a platform could only avoid lability for third-party content if they imposed no moderation what so ever."


Ideally you'd be liable regardless of whether you moderated. Possibly this would require something in addition to simple repeal of 230.

Quote :
"If repealed, that would mean that there would be no new or small social media sites or forums or sites with comments since even the costs of defending frivolous litigation would break them."


Quote :
"Any remotely controversial content would never see the light of day since banning it is way cheaper than potentially being sued over it."


Robust anti-SLAPP laws would deal with this easily enough. People aren't going to be trigger happy with the lawsuits if they could end up having to pay through the nose. To what extent these laws would need to be modified or expanded, I'm not qualified to say; any such changes could be part of the legislation repealing 230.

Newspapers, magazines, and other traditional press print negative reviews and controversial material all the time. If these dying industries can manage, why can't juggernauts like YouTube?

Quote :
"This place literally wouldn't exist because Ken isn't going to risk losing this house"


Would repealing 230 also take away the ability to make an LLC? Ken will be fine. And while you can't generally sue someone for name calling, maybe we should be open to the possibility that posting somebody's naked pictures without their consent is...wrong?

1/26/2021 3:30:50 PM

darkone
(\/) (;,,,;) (\/)
11460 Posts

Even defending frivolous suits takes time and $$$. You can be bankrupt and on the street before the process plays out. It's begging the rich to prey on the weak. Also, anti-SLAPP isn't universal. As the adage goes, "the process is the punishment." Just look at the Devin Nunes' Cow defamation cases: they're obviously a farce, but you have folks six figures in debt defending them.

Quote :
"Newspapers, magazines, and other traditional press print negative reviews and controversial material all the time. If these dying industries can manage, why can't juggernauts like YouTube?"


This content is manually vetted. They're also taking on responsibility by hiring the writers and whatnot. They're not producing millions of articles per day. YouTube sure as hell isn't paying anyone for video comments.

You can have third party content on the internet or you can have scale. You can't have both.

At the end of the day, this is about speech and I'm not going to be easily convinced we need less of it.

1/26/2021 3:57:15 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

Speech is great. Free speech is great. For 250 years in this country, we've largely agreed on this point. We've also agreed that free speech is not boundless. You can't stand in the middle of the street at 3:00 AM with a bullhorn and shout that Barack Obama is a lizard person from planet Muslim. We're fine with these limits in every other context; why are they suddenly so horrifying in the context of the internet? Because it will be hard for poor, put-upon tech giants to deal with? The novelty of their industry doesn't grant immunity.

When Ford rolled out the Model T and people started getting hit by cars, we didn't say, "Well, I know it's terrible that these people are being run over, but [traffic laws, safety standards, etc.] are going to put an insurmountable burden on car manufacturers. Nobody will want to buy one because if they hit somebody they might get sued. At the end of the day, this is about freedom of movement and I'm not going to be easily convinced we need less of it." No. We made rules about what you could do with cars and what would happen if something went wrong. In the process, we created a new industry - auto insurance - to deal with the fact that people were going to have to take responsibility for their actions and mistakes. And now there are 273 million automobiles in the United States.

If you get rid of 230 and provide a robust legal mechanism to penalize frivolous or bullying lawsuits - effectively through a beefed-up version of anti-SLAPP rules at the Federal level, given that online publications are necessarily interstate - then I envision some enterprising portion of America's glut of lawyers will move to fill a new gap defending against such cases, free or at very low cost to the client, the lawyers to be paid out of the penalties. Possibly this would take the form of a new arm of personal liability insurance. Whatever the form, well-structured penalties offer an incentive for accessible legal defense and against spamming websites with lawsuits.

1/26/2021 5:14:12 PM

darkone
(\/) (;,,,;) (\/)
11460 Posts

Let's come at this from another angle.

Why should the website hosting third party content be liable if you can just go after the author?

Should I sue Ford because someone driving an F-150 ran over my cat? Should they have not vetted the driver before they passed over the keys to the vehicle they built with their name plastered across the side?

1/26/2021 9:54:11 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

First problem: Assuming that you can even identify the author of something on the internet. Anonymity is baked into so many sites, and even at places that nominally require your real name (say, Facebook), that's hardly enforced. But even if you could require that users verify their identity before signing up for a site, and even if you made those identities publicly available (so that they can be held liable) - how are you going to verify that the owner of the account is the one who posted a given item?

Second problem: Even if you could identify everybody on the internet, most of them live outside of the United States and are beyond the practical reach of the U.S. legal system.

Third problem: Going after the author doesn't get rid of the material causing the problem. Your hypothetical F-150 doesn't keep driving around flattening cats after you bring the first driver to trial, but a defamatory video keeps getting views unless the website takes it down. OK, you say, we'll make the websites take down videos that are the subject of lawsuits. But what are we going to do if they don't? Sue them?

None of which is to say that authors should be immune, obviously.

1/27/2021 7:39:49 AM

daaave
All American
1108 Posts

Quote :
"Section 230 was created for a reason and it wasn't to help mega corporations or enable assholes to harass people. It's purpose is to allow online communities to be able to moderate what they can without getting sued for the content they missed. The goal is to foster good communities. The Internet is bigger than Facebook and YouTube and people need to carefully consider the consequences of what they're asking for."


Look on the bright side. If they repeal section 230, the entire internet will become Digg 2.0 and alternatives will become more appealing. See y'all on the dark net.

1/27/2021 10:56:39 AM

A Tanzarian
drip drip boom
10458 Posts

^^^ A better analogy is Tesla's Autopilot. Technology advances from a '48 F-1 to an Autopilot-equipped Tesla mean that manufacturers are taking on responsibility for operating the car. They're no longer providing a simple machine; they're actively involved in the operation of that vehicle. Should Tesla be liable when a Model 3 runs over your cat while on Autopilot? Maybe. Certainly no one would argue Tesla should enjoy blanket immunity for incidents involving their vehicles. (See also 737 Max)

Likewise, user-based websites have evolved from basic forums to massive social media platforms. I believe forums (e.g. TWW, craigslist, and even cesspools like Stormfront, 4Chan, etc.) should generally be free of liability for user-generated content. A key feature for me is that these sites perform little promotion or curation and have relatively limited reach.

Social media companies should be open to liability for content posted on their sites, a distinguishing feature being they select specific content and promote it to other users (potentially millions). Users don't just see what others post; they see user-content the platform has selected for them (promoted, you might like, autoplay, etc.). These sites take an active role in what content their users see and should share responsibility for that content.

Obviously there's a continuum between small, basic forums like TWW and a massive social media company like Facebook that actively curates its content. I'm not exactly sure where the lines should be drawn between promotion of selected content and responsibility for that content. Where would a Reddit-style voting system fall?
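For reference, the old open-sourced version of Reddit's "hot" rank looked roughly like this (reconstructed from memory, so treat the constants as approximate). Even with user voting, the formula the platform chose still decides what surfaces:

```python
# Approximate reconstruction of Reddit's old open-source "hot" ranking:
# log-damped net votes plus a heavy recency term.
from math import log10

EPOCH = 1134028003  # Reddit's arbitrary reference timestamp (seconds)

def hot(ups, downs, posted_at):
    """Score a post from its votes and its submission time."""
    s = ups - downs
    order = log10(max(abs(s), 1))
    sign = 1 if s > 0 else -1 if s < 0 else 0
    # every 45000 seconds (12.5 hours) of age costs one order of
    # magnitude in net votes
    return round(sign * order + (posted_at - EPOCH) / 45000, 7)
```

A post 12.5 hours newer ties an older one with ten times its votes, so recency, not raw popularity, mostly decides what gets seen. Whether that counts as the platform "selecting" content is exactly the line I'm unsure about.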

I also share your concern for chilling effects. Craigslist was browbeaten into removing its adult forums. I'm sure it wouldn't take much for Ken and other small operators to pull the plug, and it's not hard to imagine larger companies deciding it's not worth the trouble and pulling the plug on support forums, comment sections, etc.

1/27/2021 1:22:12 PM

aaronburro
Sup, B
52093 Posts

Quote :
"Ideally you'd be liable regardless of whether you moderated. Possibly this would require something in addition to simple repeal of 230."

Except the liability that spawned 230 was over the moderation itself, too. As in, folks were suing content providers because their crap was moderated off. Your pie-in-the-sky notions of anti-SLAPP don't help here, never mind that most states have wildly ineffective anti-SLAPP laws anyway.

The reality is this: most places with user-generated web content will shut those portals down without Section 230 protection. Period. TWW will not exist. Comment sections on news articles will not exist (which might be a good thing). Reviews will no longer exist. Facebook might continue to exist, but as a shadow of its former self, with all content pre-moderated and anything short of cat pictures rejected.

No, AI is not the solution. Unless you like racially disparate outcomes like those that have occurred with facial recognition technologies and the few algorithmic attempts to determine sentences and bail amounts, that is. Or the algorithms will be so stringent that, again, all you get is cat pictures.

Quote :
"Newspapers, magazines, and other traditional press print negative reviews and controversial material all the time. If these dying industries can manage, why can't juggernauts like YouTube?"

Because those outlets don't let any old slapnut publish whatever the fuck he feels like at a moment's notice. It's the literal difference between the two mediums: one is a publisher, the other is a bulletin board in a giant digital hallway. You wouldn't sue NC State because some slapnut scrawled "Ian Booth likes Nickelback" on a bathroom stall in Daniels Hall. A publisher exercises so much more control over what it produces that it makes sense to hold it liable for what it publishes. What you are suggesting would cause State to take down the bathroom stalls and whitewash the campus every other week.

A repeal of Section 230 would have a chilling effect on speech across the internet, and it's not going to stop at "defamation" (though I can't see why you are so focused on that), "hate speech," or even misinformation. Police will sue to shut down BLM groups. Fuck, you JUST FUCKING SAW cops arrest a dude for a shitty photoshop of an album cover, for fuck's sake. Cheeto would be suing everyone who says he's a liar a billion times over. We've seen groups on craigslist and reddit, where sex workers would go to get help escaping their situations, get shut down over the SESTA bill. SLAPP laws just will not help with these situations.

This is not to say that we should do nothing. A chief problem is that the platforms have been allowed to get too big and are stifling competitors or outright buying them to kill them. (Yes, you heard a libertarian say that some companies are too fucking big.) It is a problem that FB is hyping the most controversial shit and steering people toward it, and maybe there is a line to draw, as ^ says, at the promotion of content: use an AI to promote it, and you take on more liability. But the solution can't be to just pretend that people aren't going to use lawfare against opinions they don't like, which is exactly what was happening when 230 was passed in the first place.

Quote :
"maybe we should be open to the possibility that posting somebody's naked pictures without their consent is...wrong?"

I think most folks think that's wrong, but the devil is in crafting a statute which captures that, and only that, with no chilling effects on any other protected speech. It's not as simple as it seems.

Quote :
"Would repealing 230 also take away the ability to make an LLC? Ken will be fine."

How do you propose Ken's LLC, for a website he openly admits loses money, is going to pay the legal bill for the first suit? Greg would have had this place shut down.

Quote :
"But if the future of social media is slower, less wild, more moderated, and finally curated with an eye towards truth and civic responsibility rather than algorithmically click-generated profit, I say: so much the better."

The future will be that none of it exists, because no one will take the liability.

1/27/2021 10:25:57 PM

aaronburro
Sup, B
52093 Posts

Quote :
"If you get rid of 230 and provide a robust legal mechanism to penalize frivolous or bullying lawsuits - effectively through a beefed-up version of anti-SLAPP rules at the Federal level, given that online publications are necessarily interstate - then I envision some enterprising portion of America's glut of lawyers will move to fill a new gap defending against such cases, free or at very low cost to the client, the lawyers to be paid out of the penalties. Possibly this would take the form of a new arm of personal liability insurance. Whatever the form, well-structured penalties offer an incentive for accessible legal defense and against spamming websites with lawsuits."

That's a cute idea, but the opposite is far more likely. An industry of grievance lawyers will probably pop up, similar to ambulance and asbestos chasers. "Did somebody hurt your feelings on the Internet? Tell 'em you mean business." It's trivially simple for a competent lawyer to craft a suit that evades even the strongest anti-SLAPP laws. The damage of these suits is in the discovery phase, and getting there is not hard. You can probably win after that and recover, but you'll be paying that cost upfront. And get enough of these suits at once, and the provider will just say "fuck it, back to cat pictures." And, no, lawyers aren't gonna work for free upfront for Amazon and Facebook at those scales. Instead, Lewis & Fuckface will just serve their robosuits and get settlements every time.

You're also grossly misjudging how much the current crop of lawyers being cranked out of law schools values free speech. Most of them are just as speech-intolerant as their college cohorts, frequently going so far as to demand that law professors not use the word "negro" when quoting Supreme Court opinions, or demanding that classes covering rape litigation be struck for fear of traumatizing victims who may be in the classes. These are not the people who will be running pro-bono speech-defending firms. They're gonna be at Lewis & Fuckface suing others.

And this only covers defamation. Truth and misinformation won't be helped. Being a fuck stick like hooksaw or a liar like salisburyboy doesn't make you civilly liable, so those douches will continue to spread their Santorum ad nauseam. And if you do silence them, they'll sue over that. Imagine Cheeto being able to sue Twitter because they suspended him. Yeah.

I'm not sure how you plan to crack down on misinformation, extremism, or lies via this mechanism. Defamation isn't even the fucking problem, and it's honestly already handled fairly well by our courts as is. Rudy and Sidney are both fucked by Dominion at this point, as are Newsmax and OANN. "There was massive fraud" isn't covered by defamation suits unless you name a specific person or company. The only way you seriously tackle those is a regulatory framework, which would plainly be unConstitutional at this current juncture. I'd also dare say it's a damned good thing that Cheeto didn't get a chance to influence regulatory agencies over what people said about him on the internet.

1/27/2021 11:10:09 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

Regardless of why 230 was created to begin with, I'm concerned with the problems it enables now.

The ineffectiveness of state SLAPP laws is immaterial when I'm suggesting that any repeal of 230 come with a Federal equivalent - which seems logical enough, given the necessarily interstate nature of the internet - and that equivalent would need to be considerably more effective than the state laws for me to favor the repeal.

The doom-and-gloom predictions that all these sites are going to close or become pale imitations of their former selves don't seem to be standing on much. Many of these businesses are enormously profitable and are not going to throw up their hands and walk away because of a regulatory change. Plenty of rules have made plenty of industries more expensive or complicated, but few industries have imploded under the weight.

Comment sections on news articles? Good riddance. It isn't as though people had a right to mass-mail, for free, their opinions to everyone who reads the New York Times.

Reviews will go away? How do you figure? Reviews exist in plenty of traditional media outlets. Reviews are, at their core, opinion, and thus protected speech.

Quote :
"It's the literal difference between the two mediums: one is a publisher, the other is a bulletin board in a giant digital hallway. You wouldn't sue NC state because some slapnut scrawled "Ian Booth likes Nickelback" on the stall of a bathroom in Daniels Hall."


It's a difference that is completely arbitrary. The News & Observer and YouTube are both companies whose primary function is delivering media to a wide audience as a means to sell advertising.

And this shit about the NCSU toilets is preposterous. Vandalizing State property is a crime of which the State is a victim, not an enabler. I'm not suggesting that we sue websites who are hacked to display criminal material.

Quote :
"A repeal of Section 230 would have a chilling effect on speech across the internet, and it's not going to stop at "defamation," (though I can't see why you are so focused on that) "hate speech," or even misinformation."


Defamation is something I have personal experience with and at least a little knowledge of, it being the very first thing for which someone tried to sue me. I don't believe you can be sued for hate speech, misinformation, or otherwise "hurting people's feelings" (which is what you people seem hung up on, for some reason).

The bar for defamation is high. Harassment is a potentially serious offense and should be treated as such. I'm happy with chilling these.

You reference police overreach, which is a separate problem. The police department that arrested the photoshop guy should be hammered, hard.

Yeah, Trump and other public figures with more money than sense might try a lot of lawsuits. They'd lose virtually all of them, and under a well-designed penalty system it would bankrupt them. It will be a rough couple of years at the start, but once folks see the outcome of lawsuit-spamming, they'll chill the fuck out.

Quote :
"And, no, lawyers aren't gonna work for free upfront for Amazon and Facebook at those scales. Instead, Lewis & Fuckface will just serve their robosuits and get settlements every time."


Amazon and Facebook don't need lawyers to work for free. They can, under a well-designed system, front the initial cost and get paid back out of penalties. Small sites (TWW) don't have to work at anything like such a scale, and I'm absolutely confident that enterprising lawyers will work at little or no up-front cost to defend against doomed cases brought against them.

As to the lawyers...yeah, some are gonna be in Lewis & Fuckface. I'm almost happy that there are predatory lawyers to take money from people stupid enough to try an obvious failure of a lawsuit. But don't talk to me about how the latest generation of lawyers being cranked out is going to land in all this. Lawyers gotta eat like everybody else, and there's a lot more lawyers than there are good lawyer jobs.

Quote :
"I'm not sure how you plan to crack down on misinformation, extremism, or lies via this mechanism."


The direct impact would be getting these websites to deal more quickly with the handful of categories that are criminally or civilly liable - defamation, harassment, incitement, etc. The knock-on effect would hopefully be that, by forcing a greater degree of moderation, we could also make these sites more responsive to speech that's merely atrocious rather than tortious.

Quote :
"And if you do silence them, they'll sue then."


How do you figure? What right does anybody have to getting their bullshit put on a given website?

1/28/2021 11:03:59 AM

Cherokee
All American
8256 Posts

Quote :
"Certainly I hate it. Websites are publishers and should be treated as such. The state of the country and world would be immeasurably better if we'd been doing that for the last 20 years.

It's a little silly to say everyone wants to sue tech firms for speech "they don't like." The First Amendment still exists. But when Alex Jones lunatics post videos telling people to harass shooting victims, posting their home addresses, etc., YouTube needs to be accountable for enabling that."


I need to think about this but GrumpGOP usually has well reasoned arguments so I'll probably end up agreeing haha.

1/29/2021 5:53:29 PM

A Tanzarian
drip drip boom
10458 Posts

Quote :
"Of the top 100 most-active US civic groups, 70 percent "are considered non-recommendable for issues such as hate, misinfo, bullying, and harassment," [internal Facebook researchers] said."


https://arstechnica.com/tech-policy/2021/02/70-of-top-civic-facebook-groups-are-toxic-or-violent-report-finds/

2/2/2021 1:50:53 PM

aaronburro
Sup, B
52093 Posts

Quote :
"The doom-and-gloom predictions that all these sites are going to close or become pale imitations of their former selves doesn't seem to be standing on much. Many of these businesses are enormously profitable and are not going to throw up their hands and walk away because of a regulatory change. Plenty of rules have made plenty of industries more expensive or complicated, but few have imploded under the weight."

I think it's fair to say that they are enormously profitable because of Section 230 and how it freed them from having to worry about a massive volume of bumptious, harassing lawsuits. Those were lawsuits the early tech companies were very clearly facing, and which threatened their very existence. You may not care why it was created, but that's not a good argument for ripping it out. I'm not blind to the problems we now have (which I think have far more to do with the monopoly positions of the tech companies than anything Section 230 did), but I'm also not blind to what was there before. I'm also not oblivious to the fact that people sue over frivolous shit all the fucking time, using arguments that pass just enough muster to sustain themselves in a courtroom, just to harass someone else, and in ways that are clever enough not to be sanctioned as frivolous. That Twitter now has a shit ton of money isn't a reason to shackle the next Twitter from forming because some dipshit got butthurt over being called an asshole and decided to pursue a scorched-earth policy against every link in the technology chain. What you don't seem to understand is that the discovery phase of a lawsuit is where the damage from these kinds of suits happens; getting to that phase is not particularly difficult, and it's fucking cheap for the person bringing the suit. Even states with well-formed anti-SLAPP statutes still have this problem.

Big companies might be able to weather it for a while, with some modifications, but TWW will no longer exist; ken has explicitly said so in this very thread. Small-time forums like this will disappear. Most are run on shoestring budgets by enthusiasts. That's just how it is.

I'm not going to particularly shed a tear over comment sections going away on some news sites, but it will certainly happen. Reviews absolutely will go away. Bumptious defamation suits almost always target opinion, and the fact that it is protected speech doesn't change that. Online reviews are a literal gold mine of material for litigators to go after, because bad reviews can harm the company being reviewed; it becomes economically viable to sue over bad reviews. Section 230 is currently a massive protection for online reviews, because tech companies can so easily throw these suits out. The clear and obvious legal strategy for a litigant would be to claim the review was fake. To claim that ALL reviews are fake, that the defendant knew they were fake, did nothing to stop it, and is trying to harm mein poor little pillow company. It has to be fake, cause my product is perfect, and those pictures look to be counterfeit items or aren't actually my product. And I want discovery to prove it. Lots of discovery. Reviews don't bring in much money, and few if any companies are going to keep that invitation to a lawsuit open for long. The issue is how much pain a company is willing to go through to deflect lawsuits over generally non-revenue-driving parts of its business, in the hope of recovering attorneys' fees at the end, versus just ending the lawsuits permanently by shutting down the feature. (Obviously Facebook and Twitter are different, because the core part of their platform is what they'll be sued over.)

I think you are also greatly underestimating the reach of what Section 230 is protecting. It's not just Facebook and Twitter who are protected. Amazon isn't liable for hosting the blog platform where my ex-gf writes about how smelly my feet are and how I only lasted 30 seconds in bed and scream "Boom goes the dynamite" when I'm done. Wordpress isn't liable for making the blogging software she uses. GoDaddy isn't liable for providing the domain name. Quad9 isn't liable for resolving the DNS requests. Google isn't liable for someone sending me a nasty email, nor is my favorite open-source IMAP email client for displaying it to me. AT&T isn't liable for someone sending me lots of anonymous texts through a spamming program the spammer wrote (though I would like to see meaningful changes on spam and scams). Comcast isn't liable for sending the evil bytes through the intertubes into my computer. Starbucks isn't liable for the hotspot on the end of those evil bytes as my ex-gf writes her next post. Verizon isn't liable for passing data from Starbucks' hotspot onto the intertubes. OpenVPN isn't liable for the tunnel she makes to avoid Starbucks tracking her next failed book project. Google isn't liable for search results pointing to the blog. RSS feeds aren't liable for automated delivery of updates from my favorite websites which somehow scraped the blog. Xbox Live isn't liable for someone calling me a n00b and griefing me constantly. Epic Games isn't liable for hosting the Fortnite servers where I get griefed. evan isn't liable for writing the code that allows that griefer to insult me. The list goes on and on. All of these services are relying on the promises made by Section 230 that they can safely process and transmit data agnostically if they so choose, or host and remove content at their whim.

Quote :
"It's a difference that is completely arbitrary. The News & Observer and YouTube are both companies whose primary function is delivering media to a wide audience as a means to sell advertising."

That's a hasty generalization so massive that normally I would be the one accused of making it, not you. You know better than that. A publisher employs literally dozens of people to scrutinize every word before it is published. They explicitly hire the writer for the piece (excluding letters to the editor). They have multiple lines of editing and review. They pay people to arrange it and lay it out alongside other pieces. They make explicit decisions about which pieces to publish and which to axe. The massive effort behind putting the words out is why we hold them to a higher standard. Contrast that with this place, where ken has no idea whatsoever what I'm going to type next. The two situations aren't even remotely comparable. The point raised about vandalism wasn't about the criminality of the method but about how much control the entity had over the person expressing the opinion. I also note you specifically did not deny liking Nickelback :p

Quote :
"The bar for defamation is high. Harassment is a potentially serious offense and should be treated as such. I'm happy with chilling these.
You reference police overreach, which is a separate problem. The police department that arrested the photoshop guy should be hammered, hard."

You're smarter than this. Chilling effects have nothing to do with speech we don't like and everything to do with speech we don't want to discourage. I'd also point out that most states already have criminal statutes against harassment. And of course the cops mentioned should be hammered, but you are ignoring the part where I said police and their associated organizations would sue others for defamation wantonly. My reference to the arrest was to show how much police want to attack speech (to the point of doing plainly and obviously unConstitutional things), not to suggest that this particular incident was protected by Section 230.

2/4/2021 1:00:37 AM

aaronburro
Sup, B
52093 Posts

Quote :
"Yeah, Trump and other public figures with more money that sense might try a lot of lawsuits. They'd lose virtually all of them, and under a well-designed penalty system it would bankrupt them. It will be a rough couple of years at the start, but once folks see the outcome of lawsuit-spamming, they'll chill the fuck out."

Even under the best of systems, recovery (if it even happens) against even a modestly crafted lawsuit would come at the end of a years-long suit. It sounds great to say "yeah, you'll get recovery," except you'll be on the hook upfront for years of bumptious discovery requests, which cost the plaintiff almost nothing to file and amend and amend. Large tech firms could probably sustain a few of these. They can't sustain thousands upon thousands at a time. And that's if they even get recovery, which even under the strictest anti-SLAPP laws today is not particularly common unless your lawsuit is Vic Mignogna levels of stupid. Meanwhile, smaller tech firms would just collapse under a couple of these, leading to further consolidation of tech and media companies which are already too massive.

Contrast this with Section 230 today, where a firm pays a couple grand to a lawyer to show up, say the words "Section 230," and walk out of the courtroom. You keep speaking of a "well-designed system" but have proffered nothing so far. I'd love to hear an idea which would be this effective at dismissing bumptious suits without requiring a firm to spend hundreds of thousands of dollars upfront. Right now the best you've given is an invitation for an as-yet-undefined federal anti-SLAPP statute, more powerful than existing state statutes, while suggesting it'll give tech platforms an incentive to also go after misinformation, which will make them liable for additional lawsuits over moderation that would otherwise have been barred by the very law you are proposing we nuke. It's mind-boggling. I'm not in disagreement that I'd like to see a federal anti-SLAPP statute passed, but let's be clear that both Washington (state) and Minnesota had anti-SLAPP statutes struck down recently, and Washington's was largely modeled on California's law, which is seen as best-in-class among most 1A advocates. What's more, if you make the anti-SLAPP law too powerful, it could discourage otherwise legitimate lawsuits in instances where the complainant actually needs discovery to prove their case, a fact which weighed heavily in striking down the aforementioned statutes.

Beyond all that, I'm having a hard time making a connection between the problem and the proposed solution. If our problem is extremism, misinformation, and hate speech (with a spackling of cyberbullying, which I don't think will be addressed by any of these proposals), I don't understand how removing a shield which largely blocks bumptious defamation suits is going to fix any of that. There aren't complaints of defamation running rampant on social media. You can't sue folks for hate speech, extremism, or misinformation not directed at a specific person. So, what the shield protects isn't the issue, and what the problem actually is can't be addressed even without the shield. Add to that the collateral loss of protection for actually moderating offensive content, and it just doesn't make sense. The whole plan doesn't seem particularly well thought out. You're uninstalling your anti-virus because your RAM is acting up.

Quote :
"I need to think about this but GrumpGOP usually has well reasoned arguments so I'll probably end up agreeing haha."

He usually does, which is what makes this one so baffling. The only legit argument I've heard so far came from A Tanzarian who pointed out that some of the platforms are curating and recommending information, which is a meaningful distinction.

2/4/2021 1:01:09 AM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

Quote :
"I think it's fair to say that they are enormously profitable because of Section 230...I'm not blind to the problems we now have (which I think are far more related to the monopoly positions of the tech companies than anything Section 230 did)"


And now you've given me a reason that hadn't previously occurred to me: that Section 230 specifically nurtured these platforms into becoming enormous monopolies. With no incentive whatsoever to moderate, it encouraged them to grow by any means necessary.

Quote :
"I'm also not oblivious to the fact that people sue over frivolous shit all the fucking time, and they do it using arguments that pass just enough muster to sustain themselves in a courtroom, just to harass someone else, and in ways that are clever enough not to be sanctioned as frivolous."


And yet this doesn't seem to happen in traditional media. Of course, traditional media is failing, just not because of constant lawsuits. Why are they failing? Oh, right - they're being outcompeted by online platforms who have legal protections that they don't.

Speaking of things that don't seem to happen to traditional media: getting sued out of existence over product reviews.

Quote :
"I think you are also greatly underestimating the reach of what Section 230 is protecting."


And I think you're vastly overestimating, based on the laundry list of entities you think will be targeted. If a newspaper prints libel, people don't get to sue the paperboy, the ink manufacturer, the paper mill, and the estate of Johann Gutenberg.

If there's some legal quirk that would leave GoDaddy, Quad9, AT&T, etc. exposed to lawsuits where their traditional media analogues aren't, then by all means, let's address those.

Quote :
"A publisher is employing literally dozens of people to scrutinize every word before it is published... The massive effort to put the words out is why we hold them to a higher standard."


It's also why so many are dying in the face of competition from protected entities that don't have to put in any effort whatsoever. Surely there's some acceptable middle here?

Quote :
"Chilling effects have nothing to do with speech we don't like and everything to do with speech we don't want to discourage."


I am essentially OK with a degree of chilling on all online speech. You have an inalienable right to free speech. You do not have an inalienable right to a bullhorn, a printing press, or twitter. Nor do you have a right to impunity from harassment laws by way of anonymity.

Quote :
" You keep speaking of a "well-designed system" but have proffered nothing so far. I'd love to hear an idea which would be this effective at dismissing bumptious suits and not require a firm to expend hundreds of thousands of dollars upfront."


I'm not a lawyer nor a legislator, so my powers at designing such a system are limited. What I'll say is that so far the examples people have presented of the lawsuits they anticipate are prima facie bullshit targeting opinion and other protected speech, should not have a chance of making it even so far as discovery, and should be met with penalties sufficient in scope and promptness to offset damages. I'll allow that this might require reform of tort law more generally, which is something I would very dearly like to see regardless of 230 (both as a citizen and as an individual who has been targeted by not one, not two, not three, but four separate and frivolous lawsuits). It is my strong preference that Section 230 removal come with a package of these necessary reforms; however, I'm not sold on the doomsday predictions of what would happen in their absence.

Quote :
"If our problem is extremism, misinformation, and hate speech, (with a spackling of cyberbullying which I don't think will be addressed by any of these proposals) I don't understand how removing a shield which largely blocks bumptious defamation suits is going to do any of that. "


Sites that specialize in criminal or tortious speech can be obliterated. Sites that don't can be prodded into active moderation to suppress that speech.

---

I'm not, like, a crusader on 230. It's not in my top ten. But we've got a problem and I don't see a lot of other proposed solutions. This is a country in which a President cannot be held accountable in court for anything they do, and social media cannot be held accountable for aiding and abetting them. That's a proven recipe for disaster. Since 1/6, these companies have been more cooperative with reality, and that has been a good thing for the country. In the future, I'd rather they have an incentive to cooperate other than a sudden change in who chairs the Senate committees.

Then there's the grassroots misinformation, a separate problem and one not explicitly addressed by removing 230 protections, though I have hope that the existence of some degree of active moderation will help.

Point is: Present me an alternative, man. I don't need 230 to be my hill to die on. But I also don't want to die lynched by Proud Boys or wheezing in a COVID ward from some variant that developed in the lungs of anti-masker, anti-vaxxer knuckle-draggers.

2/4/2021 9:52:03 AM

darkone
(\/) (;,,,;) (\/)
11460 Posts

Quote :
"...Section 230 specifically nurtured these platforms into becoming enormous monopolies. With no incentive whatsoever to moderate..."


Section 230 was designed to allow and encourage sites to moderate. You're claiming the opposite, which is untrue on its face.


Quote :
"Speaking of things that don't seem to happen to traditional media: getting sued out of existence over product reviews."


You might have wanted to do a 5-second search before you wrote this. People get sued over reviews (usually frivolously, but nonetheless at great personal cost) all the time.

Quote :
"And I think you're vastly overestimating, based on the laundry list of entities you think will be targeted. ...

If there's some legal quirk that would leave GoDaddy, Quad9, AT&T, etc. exposed to lawsuits where their traditional media analogues aren't..."


Pretty much all the early lawsuits that led to Section 230 involved ISPs and web hosts getting sued: https://www.eff.org/issues/cda230/legislative-history

Quote :
"You do not have an inalienable right to a bullhorn, a printing press, or twitter. Nor do you have a right to impunity from harassment laws by way of anonymity."


The First Amendment disagrees.

Quote :
"Then there's the grassroots misinformation, a separate problem and one not explicitly addressed by removing 230 protections, though I have hope that the existence of some degree of active moderation will help."


And who gets to decide what's misinformation?

---

The problems you're trying to address are cultural. You're not going to legislate your way into a world with fewer assholes and more educated, more civically engaged voters.

We already know that removing Section 230's protections harms speech and useful commerce; the law isn't that old, and we saw what things looked like without it. If you want to tweak the law, start from a do-no-harm baseline.


Also, let's put the actual text for 230c in this thread:
https://www.law.cornell.edu/uscode/text/47/230


...
(c) Protection for “Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
...

2/4/2021 1:24:48 PM

GrumpyGOP
yovo yovo bonsoir
17968 Posts

I'm backing off of this one. I still think 230 needs changes but in the course of arguing for that I've backed myself into a corner of advocating, or seeming to advocate, its obliteration without any replacement, which is not a position I can defend.

Section 230 has done a number of important things that I do not want to destroy. It has also inadvertently created an environment in which the services it protects now incubate an existential threat to the liberal democratic system. I find it difficult to believe that there is no way to reduce the damage while preserving the benefit.

[Edited on February 4, 2021 at 3:19 PM. Reason : ]

2/4/2021 3:18:49 PM

