AOC as James Mason in The Verdict - Not a Good Look

Disinformation appears to work. Anonymously post an outright lie (so you can’t get blamed for saying it). Post it often enough that people who are elderly or poorly educated see it repeatedly and internalize it. Those people then vote for your team. Even if that’s just 10% of the population, it will easily swing elections.

Do liberal causes do this? Clear and outright lies? I mean, if a strategy works…

I don’t think so, but there have been a couple of news stories that have made me raise a critical eyebrow. The “Mission Accomplished” banner caused a lot of grief for George W. Bush because Iraq was still all effed up at the time and the mission was clearly not accomplished. However, when the military said Mission Accomplished they did not mean the status of Iraq overall. They meant a specific plan for dropping bombs, suppressing air defenses, et cetera. I get the greater point about it being tacky to hang up a banner saying “Mission Accomplished” when you could easily argue no such thing had happened. However, the **military** mission *was* accomplished. What happened afterward was not part of that mission, but the criticism was applied as if it were.

I also remember a news story about how stupid Trump’s people were because they couldn’t figure out how to turn on the lights in the White House conference rooms. Look, these dudes are so goddamn dumb they can’t figure out a literal light switch! And if that is indeed the case then by all means have these people be the first ones up against the wall. However, I don’t think that was the case. I feel like the lighting controls in one of the most secure buildings on the planet might be a little more complicated than switches on the wall. If I was the Secret Service I would not want any random asshole to have the power to plunge a room into darkness by flicking a switch. I don’t know how it actually works but there’s plenty of footage from inside the White House. Anyone see light switches in there?

In any case, those are debatable examples. Going forward, I think Republicans should recognize and understand that Trump has shown just how flexible the rules can be and just how much immunity one can enjoy courtesy of flimsy arguments and non-cooperation. The Republican empathy deficit becomes obvious. They do not think Democrats will turn around and pull the same shit on them. But they certainly could.

Political debate can now be dominated by sources like Facebook. This would be very bad if a variety of loonies all had equal access to sow disinformation, but the reality is even worse. For example, it is now known that the Kremlin commandeered social media and had a huge effect on the 2016 election.

Of course there have been lies and propaganda for centuries. But the speed with which lies can propagate today and the inability of truth-tellers to counter the lies are unprecedented.

It’s hard to argue with much of what mhendo writes here. Except for the part I’ve emphasized. Disinformation spread by social media is one of the biggest problems facing today’s America. It’s a far FAR bigger problem than gun violence, for example.

I don’t have any easy solutions to offer. But putting our heads in a hole and pretending that lies spread by Facebook aren’t drastically degrading the quality of American politics? That ain’t the solution.

I’m not interested in trying to evaluate which thing is worse than the other, although I do agree that political disinformation is a problem. But political disinformation has always been an issue in a representative republic like the United States. Newspapers were printing lies about candidates for office during the ferocious Federalist/Republican conflicts of the 1790s. Some newspapers used the strategy of reporting that a candidate had died, in order to eliminate support for the candidate. In a period where news traveled slowly, such reports were often hard to refute in time for the election.

Sure, the internet and social media have changed the way that information moves, but the solution to that is NOT to blame the technology. Commentators have been complaining about the detrimental effects of new technology for as long as there has been new technology. This is particularly true with forms of media. Look back at the 1940s and 1950s and you’ll see plenty of jeremiads about how television was going to ruin social life and public discourse in America.

The new technology might change the speed and scale at which disinformation can be disseminated, but it doesn’t change the fact that politics has always been rife with disinformation. There was no golden age of political discourse. The era of the Lincoln-Douglas debates was also an era of partisan and dishonest news reports, and political disinformation and misrepresentation.

Never pretended any such thing. I think that lies degrade political discourse, no matter how they are spread. When Facebook facilitates the spreading of lies, though, it’s the lies that are the problem, not Facebook. And if people are willing to believe the first thing they see on Facebook, without making any effort to determine the reliability of the source or the information, then that’s a problem with America’s level of civic education and engagement, and one that won’t be solved by forcing Facebook to become an arbiter of what constitutes factual information, and what constitutes dishonesty or lies.

The problem there is that it’s really a kind of prisoner’s dilemma. If the Left retaliates by themselves systematically corroding institutions and decorum, turning partisan blind eyes to corruption and moral decay, pulling anti-democratic bullshit to achieve their goals (like hurriedly passing laws in the dead of night while Republicans are not in the room), constant lies and gaslighting, etc… then America fully becomes a banana republic on a short road to some form of violently autocratic authoritarianism.
And yet if they don’t find some way to punish or otherwise discourage these flagrant abuses and dysfunctions, they’ll keep getting it up the dry arse with a handful of gravel.

And if Zuckerberg had any guts he would state, when asked directly: “Hell yeah I’ll publish bullshit, as long as I get paid!”

The increased speed and capacity for degradation of the political discourse might be more important than you’re allowing for. If we posit that a mechanism exists by which the conversation is “self-repairing,” so to speak, and further posit that this mechanism can’t itself be accelerated beyond a certain limit, then the technological advances may be causing this degradation to proceed at a rate so high that even a lower level of equilibrium cannot be achieved.

Granted, those are pretty big things to posit. And I DID just pull it out of my ass. Might be worthwhile for some actual social scientists to consider exploring, though.

Yeah, if he were honest, that’s pretty much exactly what he should have said. That’s probably what he’d like to say, but he recognizes that he has to try to look like he’s “doing something” while also appearing not to cave in to pressure.

In his position, if I were being honest rather than engaging in self-preservation, I might have said something like this:

Congresswoman, Facebook’s revenue model relies on advertising, and our general policy is to accept advertising from anyone who wants to pay for it, as long as the advertisements don’t violate the law in some way or another. I understand and share your concern that false and misleading advertisements might detract from the nation’s political discourse, and might even disadvantage certain individuals or political parties at the polls, but I’m also very wary of making myself or my employees the arbiter of what constitutes truth and honesty in political advertising. Many political assertions, even those claiming to make factual statements, are somewhat subjective, and are amenable to multiple interpretations. Where would you draw the line, in terms of falsehood? Would it apply only to simple and easily verifiable factual statements, or would it also apply to questions of emphasis and interpretation? What about competing factual claims based on different sources, or different datasets? And, perhaps most importantly, do you really want social media companies to become the gatekeepers for what is and is not acceptable political speech in this country?

Then, if I really didn’t care about pissing off my users, I might say:

Also, why is it Facebook’s responsibility to make up for the shocking absence of rational discourse and critical thinking among the American population? Is it our fault that a substantial portion of our customer base is as dumb as a bag of hammers, and couldn’t distinguish between a good and bad political argument if their life depended on it? If you really want to fix the problems with American politics, start with the people who actually make up the American electorate.
:slight_smile:

You don’t get to weasel out on a very narrow definition of “mission accomplished” on a wholly voluntary intervention like Iraq II.

Frankly, James Mason himself never managed to make the James Mason Look of a surprised toad work.

Before I read this thread, I really was struggling to *not* put a particular Doper on my Ignore list.

*sigh* Well, I can cross that off.

AOC gets more impressive every time I see her speak. She obviously did her homework prepping for this inquiry. Zuckerberg was like that kid in eighth grade who spent the weekend dirt-biking, then Monday morning tried to bluff through his answer like Otter at the fraternity disciplinary committee.

That would have been a very good answer. Even the second half, cleaned up, could work. Lay the blame not on the electorate (where it 95% belongs), but on the failure to teach life skills like critical thinking and fact-checking (which is hard, as teachers far too often also lack these skills).

…Mr Zuckerberg, the Federal Trade Commission has very clear advice when it comes to advertising: “under the law, claims in advertisements must be truthful, cannot be deceptive or unfair, and must be evidence-based.” If the FTC were to start enforcing those laws with regard to political advertising, would Facebook comply with those laws?

I think your use of the word “discourse” is disingenuous here. Discourse implies a conversation. But what you are enabling here is the dissemination of propaganda. Facebook doesn’t promote discourse. That’s embedded in your DNA. Advertisers can pay so that their advertisements are only ever seen by the people they want to see them. Millions of Americans will never know what certain politicians and their surrogates are saying to people who hold different ideologies. Those Americans cannot push back on information that they will literally never see.

What makes political advertising so different from everything else that your employees or your algorithms are the arbiters for? You remove pro-vaccine advertisements. You regularly take down the pages of legal sex workers. You already actively police what content you do and do not allow.

But we aren’t talking about subjective statements here. We are talking about objectively false statements.

We can start with the question that the Honourable Member asked you earlier in the hearings.

“Would I be able to run advertisements on Facebook targeting Republicans in primary saying they voted for the Green New Deal?”

It’s a yes-or-no question. You didn’t answer it before. Would you care to answer it now?

Once again: it isn’t about “where **we** draw the line.” We asked you to come here today to tell us where **you** draw the line.

That’s what this discussion is about, is it not? Asking rhetorical questions is an easy dodge. But one of the reasons we brought you into these hearings today isn’t for you to ask questions of us, but for us to ask questions of you.

Yes or no question: would you run a political advertisement that contained simple and easily verifiable falsehoods?

What about them? We’ve already pointed out to you that you censor both pro- and anti-vaccine advertisements. You make these judgement calls all the time. Why is political advertising objectively different?

Mr Zuckerberg, I think it’s already too late for that. White Nationalism is a valid political ideology. Yet Facebook bans the explicit praise, support, or representation of White Nationalism. This isn’t about what is or isn’t acceptable political speech in this country. This is about what is or isn’t acceptable political speech on your platform. You’ve shown that you believe there can be a line, and you’ve demonstrated that you clearly know how to draw that line.

Facebook is the ultimate gatekeeper. And it astonishes me that you can stand here before this committee and pretend that it isn’t. That is entirely the purpose of your various algorithms: they decide what people do and don’t see in their news feeds. So do we want social media companies to be gatekeepers? That really is the wrong question. You already are the gatekeepers. You silo information. You direct propaganda straight to the eyeballs of the most vulnerable. It’s a responsibility that we think you should take seriously. But you can’t even acknowledge that this is the role you play.

The question I would put to you instead would be: are you going to take your responsibilities as gatekeepers seriously, and can you explain to the committee how you intend to do so?

Congresswoman, if you look a little more closely at the rules governing advertising, you will see that they apply to, and have been interpreted by the courts to apply to, commercial advertising. There is no federal requirement for truth in political advertising. This is an important distinction, and federal courts have consistently drawn a distinction between commercial speech and political speech. We might debate whether or not that is a reasonable distinction to make, but we need to be careful to distinguish commercial and political speech when we are talking about what types of claims are and are not allowable under the law.

I will concede, for the purpose of this discussion, that propaganda is at least as appropriate a term to use as discourse. But the fact is that notions of free speech don’t respect these types of distinctions. Something can be propaganda and still be protected, and discourse is still entirely possible even in the presence of propaganda, if the people involved are willing to take the time to understand what it is that propagandists are trying to say, and work out for themselves whether or not the propaganda is worth listening to.

As to your argument about who sees what, my response is, basically, so what? The fact that I might not see a particular set of propaganda in my Facebook feed doesn’t necessarily mean that I can’t learn about it, or that I can’t learn what the truth is, or that I can’t learn what the politicians who represent me really think about a particular issue. Should we censor or regulate political advertising in the Wall Street Journal because it will only ever be seen by people who buy the Wall Street Journal, and not by people who buy, say, the New York Post? Should we censor or regulate political advertising on HBO because it will never be seen by people who don’t have cable?

You are absolutely correct. We have made decisions, in the past, to remove certain types of posts. We make our decisions based on what we believe to be in the best financial interests of the company. Believe me, if we found out tomorrow that we were going to lose 100 million American subscribers due to our refusal to censor political advertising, we would censor political advertising in a heartbeat. My argument here is not that we will never make decisions about content; my argument is simply that we won’t make decisions about content based on the desires of censorious politicians. And I apply that principle both to left-wing censors like you and right-wing censors like the president, who consistently argues that he is treated unfairly on our platform.

Okay, sure. Yes, under our current policy, you could run that advertisement. You could tell a bunch of constituents that a Republican candidate had voted for the Green New Deal. It would then be up to you to decide whether you wanted to bear the political cost of being exposed as a liar in the midst of an election campaign, with all the consequences that come with that, both for the election itself, and for your long-term political future. Quite frankly, an easily-refutable “fact” like that should not scare anybody who is genuinely interested in the principle of democracy; it’s precisely the sort of lie that’s easily caught by anyone who wants to know the truth.

That’s where I draw the line. If you want to make me draw it somewhere else, then pass a law. We are committed to following the law, and if you pass a law requiring some type of monitoring system that filters out factually incorrect political content, then we will do our best to obey that law as long as it remains on the books or is not struck down by the courts. I would suggest, however, that if you pass such a law, you should think very carefully about who is going to determine what is true and what is not. Sure, some things can easily be identified as objectively false statements, but there are plenty of other statements that exist in a gray area, where some people might believe that they are factually true, while others argue equally forcefully that they are factually incorrect. If we put in place a monitoring system, will you call me back up here to castigate me if that system eliminates one of your own political advertisements that you believe is accurate, but that the system determines is factually incorrect?

For example, in a tweet in December last year, “$21 TRILLION of Pentagon financial transactions ‘could not be traced, documented, or explained.’ $21T in Pentagon accounting errors. Medicare for All costs ~$32T. That means 66% of Medicare for All could have been funded already by the Pentagon. And that’s before our premiums.” The fact-checking website politifact.com rated this statement as False, noting that it was based on a misreading of the source material and a misunderstanding of the data. In an interview, you said “Unemployment is low because everyone has two jobs. Unemployment is low because people are working 60, 70, 80 hours a week and can barely feed their family.” Politifact.com rated this statement as Pants on Fire. Do you think that it would be reasonable for Facebook to censor this comment as factually incorrect, or do you believe that the point you were trying to make, about the difficulty that many people face in getting by on low wages, is sufficiently clear that the statement should be allowed to stand?

As I pointed out above, one thing that makes political advertising objectively different is that it is treated as such by the very laws that you claim regulate falsehood and advertising. Courts have drawn a distinction between commercial speech and political speech for decades.

But calling us the gatekeeper ignores a fundamental truth: we are only the gatekeeper for people who choose to let us be. It is incredibly easy to avoid all of the propaganda, all of the lies, all of the falsehoods, and all of the offensive material that might appear in political advertising on our site by simply closing your account. If you don’t like what we publish, I recommend that you do just that.
Speaking as myself now:

Look, I understand why people want Facebook to regulate political advertising. If they started to do it, particularly with respect to easily verifiable factual claims, I probably wouldn’t have much of a problem with that. I have also been toying, myself, with closing my Facebook account for precisely the reason that I laid out in my last paragraph. I don’t need Facebook, and there are ways for me to do basically everything I do on Facebook without being a member of the site. I’m still not sure that dragging Facebook in front of House inquiries and hectoring them about what they do and don’t allow is the way to improve political discourse in this country. And I say that as someone who thinks Zuckerberg is a total douche, and who takes great pleasure in seeing him squirm like that.

Facebook is not the advertiser. Facebook is the medium. It is incumbent on the advertiser to make truthful statements. While the medium may make some effort to police advertising, and may editorially choose which advertisers to accept and reject, in the end, the claims in the advertisements are made by the advertisers, and the advertisers, not Facebook, are responsible for those claims.

Regarding the question at hand, it shouldn’t be Facebook’s advertising department’s job to determine whether candidate so-and-so endorsed or voted for a bill. That’s way fucking different from deciding we don’t allow sex worker advertising. Hell, even the article you link discusses the uncertain legal framework around it. It should not be Facebook’s, or any other advertising medium’s, job to scrutinize each and every advertisement for truth. Quite frankly, too many people think content moderation is easy and cheap (it’s not), and then those same people piss and moan when FB et al. get it wrong in either direction. For proof of that, just read your own article on the blocking of pro-vaccine ads. This shit ain’t easy.

I loved how Rep. Ocasio-Cortez stood up to Mr. Zuckerberg and made him squirm. She did well; she came prepared (as she usually does); he (and just about all the other congress critters) didn’t. But until it becomes law that the medium is responsible for policing ads for truthfulness, and truthfulness is clearly defined, then Mr. Zuckerberg should have said, “Yes, we will run the ad. Facebook should not be asked to screen every political ad to ensure that the creators of the ad are being honest.”

After that, maybe Mr. Zuckerberg could add, “To the extent possible, the ad could be followed with relevant links to the candidate’s web page, facebook page, twitter feed, voting record, and/or other fact-checking sources.”

If Congress wants to really get involved (or if Facebook wants to be proactive), make it so that advertisements masquerading as content are more clearly labelled, and put some teeth into the FTC (not likely, and triply so with the current administration) to hold those propagating lying advertisements accountable.

That there is some first class debatin’. Mad props to both Banquet Bear and mhendo.

Here’s your problem (among many as far as I can see): You don’t know the difference between a fact and an opinion!

I can understand a little confusion since sometimes people base their opinions on facts while in other cases they do so in the face of facts. But that’s what makes them different.

Reasonable people can come to different opinions even if they agree on the same facts. Look at climate change: The facts are that it’s real, the facts are that human activity is causing it, the facts are that our global climate is changing and most of those changes will make life more challenging for a lot of people.

The opinion comes in as far as what to do about it.

The problem is that so many people (like you, for example) feel… Well, let’s let the brilliant Isaac Asimov handle this:

“There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’”

Start being able to back up your opinions with facts and then your opinions will have more weight. The fact that you conflate facts with opinions is telling.

There is no shortage of liberals who spread pseudoscience and bullshit regarding GMOs, and I have found that anti-vaxxers come from both sides of the aisle.

How much of this bullshit is intentional obfuscation of truth (i.e. lies as opposed to an honest misunderstanding of the facts or other ignorance) is hard to say, but it’s fair to say that a lot of the self-identified libs who spread misinformation do so knowing it’s bullshit.

Also, there are still liberal people spouting bullshit about Bernie being robbed in 2016. I am willing at this point to just call them liars because way too much time has elapsed since then to give them the benefit of the doubt.

:rolleyes:

I know damn well what the difference is. The problem is that when talking about politicians, it’s their job to lie, spin, and talk complete shit. Talking about politics and politicians is, in its purest form, a lesson in opinions.

Talking about science, or math, or things that are definitive, those are facts.

Climate change? Sure, it’s real. It’s been real since the beginning of time, and sure, mankind is having an effect on it, but it is NOT settled science as to our role, and as soon as politicians get involved and start using it as a campaign tool, opinions abound, all of them beneficial to the politician.

Nicely done. I wish Zuckerberg would have said exactly that. And I like AOC.