Are social media's recent bannings a freedom of expression issue?

Then I don’t know what you are advocating for. You said that you want the TOS to be clear and to be consistently followed. Is that not it?

You are looking for some sort of legal remedy if you feel as though a social media platform has violated its TOS, are you not?

Rather than talking around it, and then getting upset that I misunderstood what you are advocating for, why not just come out and say exactly what it is that you are advocating for? Last time, you said:

But now you say that that is not really what you are advocating.

Give me an example of what it is that you want, something, anything, rather than making me play the game of guessing what it is that you want, and then taking me to task because that wasn’t it. It’s a very tiresome game.

I’d agree that it does. But I’d also say that twitter and facebook do as well. As well as they can with tens of millions of users, in dozens of languages across as many countries, anyway. You may disagree that they make such a good faith effort, but then, people disagree that the mods here do as well.

Those ex-posters, as well as some current posters, disagree.

Right now, they have no standing. If they filed a case, it would be dismissed. I don’t know exactly how the SDMB would react, but considering that right now, if one of them were to file a suit, they would have to pay any fees that the SDMB had to pay, it may not cause TPTB to shut it down.

If it was a two-way contract, then I would have standing to sue if I felt that they violated that contract. They may still prevail, but they would have to put up a defense.

Exactly this. Right now any suit against the SDMB for moderation would be dismissed. However, removal of Section 230, or allowing standing to sue, would subject the SDMB to lawsuits over moderation, which would tax the budget, I’m sure.

Yeah, well, people who worked there disagree – with you.

Right, but perhaps there’s an alternative to regulating or putting some limits in place besides just removing Section 230, which generally serves not just tech companies but consumers and the general public well. Social media emerged about 10 years after the Communications Decency Act was put into place, so perhaps we need legislation or regulation that takes the unique nature of modern social media into account.

Sure, the moderation here is flawless at all times – but if there’s a legal avenue to suing social media companies for not following their own TOS, then that’s not going to save the SDMB. Even a successful legal defense requires money, and the board does not have a lot of that lying around. Any legitimate threat of legal action against the boards, and the owners (whoever the fuck that is right now) are going to pull the plug on the SDMB rather than waste the money necessary to fight it.

Some of the people who worked there disagree. And that’s fine. Like I said, there are posters and ex-posters who would disagree with your assertion that the SDMB makes a good faith effort at fair moderation.

As it says in the article, there are 5 million things that could be considered hate speech, out of 5 billion. The article does talk about ways that they are trying to address this. They are improving their AI, they are increasing their staff, they are, IMHO, making a good faith effort to address this problem.

But, I ask you again, what is it that you are advocating for? What remedies do you seek if you do not believe that they are living up to their good faith efforts?

I mean, from what you were saying, facebook could fall in line with what you want by simply deciding to allow hate speech and disinformation, leaving actual illegal speech to the authorities to track down and prosecute, and then they wouldn’t have to spend all this time and money trying to squash it.

It seems to me, even from that buzzfeed article, that they are trying to do the right thing, but it is complicated and difficult to do. It will take some time, and they will probably make some mistakes, but I’d far rather a private company, even one as large as facebook, make those mistakes than for the govt to make mistakes in trying to “fix” the problem.

Twitter and Facebook may have, but you had message boards even before the CDA. You can carve out portions of Section 230 regarding people violating federal law (FOSTA, the proposed SAFE TECH Act), and that’s fairly straightforward. But carving out Section 230 protections for vague things is simply going to lead to a lot of chaos, and likely to smaller social media sites (like message boards) closing due to the uncertainty of the application of the protection.

Then take the necessary legal precautions in advance. Users could agree in advance not to sue the SDMB – I don’t know. I know some websites instaban/permaban users who even hint at legal action against a website.

Look, I would acknowledge that there’s always a risk that regulation can create new problems, but I don’t think that’s an excuse for doing nothing. Poorly regulated social media is a problem.

Then that would be part of the sign up agreement for facebook as well, leaving us right back where we started.

You haven’t really said what it is that you want. Pointing at problems and demanding a vague solution is how you end up with bad regulations. The reason for doing nothing is that it has not been shown how doing something would improve the situation that you say is untenable.

Well, no, poorly regulated speech is a problem. But it is a problem that I am willing to tolerate, as any remedy seems far worse.

And then Twitter and Facebook put the same clause in their TOS, and we’re back to exactly where we started.

FTR, this is also SDMB policy.

I’ve already indicated the direction I’d like to go in, but without a legal background I can’t say with specificity what would work and what wouldn’t (i.e. what can be dealt with at the regulatory level, the legislative level, or public pressure level). I’ve made that clear over and over again; sorry if you missed it while you were having your “whatabout” and “both sides” and “ZOMG! Tyranny!” parade.

Ideally, we wouldn’t need any regulation or legal impetus, and if the threat of legal action is enough to get Twitter, Facebook, and others to get off their asses and do more now, then I’m fine with that. Having worked in regulatory matters before, I’ve often observed that the fear of regulation can be a highly effective motivator, and often forces companies and industries to do the right thing. But I don’t want to just rely on that alone.

The concerns about over-regulation or counter-productive regulation are warranted – not saying otherwise.

I think it’s possible to develop systematic regulation that basically promotes free speech among users but forces social media companies to take action to curb the most egregious misuses of their platforms (e.g. spreading misinformation, hate speech, etc.).

In fact, I think it’s actually better to regulate social media sooner rather than later, lest we get to the point where the backlash causes an even more extreme degree of hostility toward big tech and their services.

Within the coming years, I expect most major nations, including democratic ones, to set up a more robust regulatory framework, so I don’t think there’s any way to avoid the issue. Germany requires companies to take down manifestly illegal content or face fines. Australia requires companies to take down manifestly abhorrent content or face fines.

I don’t know if we should or legally can necessarily copy what other countries do in all cases, but I think that we may need to set up an FTC/FCC for social media or internet content. I’d mostly envision approaching regulation as a task force framework, with the threat of legal action (fines) if companies don’t take action.

So, just vague assertions that something needs to be done. I really don’t understand what direction it is that you want to go in. You say that certain types of speech are incompatible with a liberal democracy, but nothing in the direction you have said you want to go in would have any effect on that speech.

You don’t need to be a lawyer or have a legal background to answer these two simple questions:

Do you want to regulate the sort of speech that is allowed on social media, yes or no?

Do you want social media sites to be liable for inconsistent or uneven moderation, yes or no?

If you answer those, then I will understand what direction it is that you want to go in.

How about an example: how do you feel about how India is handling it?

I assume that this is not the direction that you want to go in?

BTW, while you are free to say it, hyperbolic strawmen like this do not actually add any substance to your argument, and they make it hard to take you seriously. It is a perfectly acceptable question to ask, “What about…?” when you are proposing changes, especially the vague “do something” changes that you are demanding. Nothing that I have said has anything to do with both-sidesism, and I am not the one going on and on about how our society cannot survive unless these vague changes are made. I hope this is my last comment on your unwarranted tone, but it truly is annoying to deal with the baseless accusations you continue to level at me.

Fear of losing users, advertisers, and, as you brought up, employees is also a good motivator, and it doesn’t require the govt to step in. I don’t see that much of a difference between the govt regulating and the govt threatening to regulate; either one is the govt putting pressure on companies to conform to what the govt wants.

Yes, India is a democracy, and they have done so.

As does the US; we just have different standards as to what is illegal or abhorrent. If someone posts child porn to twitter, they will have to take it down or face fines. I agree with that entirely. The question is how far we take what counts as illegal or abhorrent.

If you want to know the direction I’d like to consider going in, you can read my post to Miller.

Yes, I acknowledge my posts are vague in that respect, and I see no problem with it, as I’d prefer a slow and deliberate approach, rather than bullshitting myself and everyone else into believing that I, or anyone, has a silver bullet solution ready to go – we don’t.

We obviously don’t want to do anything that’s going to be overly intrusive, but we don’t want to just sit on our hands and let a known problem fester without action. It’s possible to address the worst of social media with both regulatory and non-regulatory frameworks.

Fear of losing advertisers is not going to stop social media. Advertiser pressure and user boycotts have had absolutely ZERO impact on social media behavior. Any change in behavior is almost certainly driven by concerns about their image, and particularly by concerns about regulation. Social media companies know that, due to their newness, they’re somewhat unique in that they’ve largely operated in an unregulated marketplace. They know that this is going to change eventually.

Any reaction is strictly to make sure that regulation doesn’t impact their business model. But they’d probably laugh their asses off at your comments about advertising pressure – it’s non-existent. It’s the advertisers who need social media access, not the other way around.

Relevant to this thread:

I think this law is really going to backfire in unintended ways on Texas and conservatives. It may essentially turn into a troll-protection law.

I’m not a big fan of picking on legislators for attending to minor things when major problems are unresolved, but really, Texas lawmakers: don’t you think it’d be nice to do something meaningful to fix the state’s screwed-up power supply problems before worrying about dingbats being booted off Twitter?

There’s a lot to be said for disbanding the Texas state legislature entirely, but then we’d lose out on the amusement value.

Bumping, since the Cheeto™-faced, ferret-wearing shitgibbon finally decided to take my advice.