Who is responsible for controlling the dissemination of misinformation?

Who decides what information is true or false?
Exactly. It should be left up to an individual to study and learn all the facts they can and decide from there what is credible. Of course, that would be too difficult for a lot of people these days. Much easier to turn on the boob tube and listen to a talking head spew.
 

Irwin

Senior Member
As people mentioned above, there is little agreement these days on what constitutes reality — on what is true and what is false information. So expecting social media sites to police people's posts and delete those which are misinformation is an unreasonable expectation and will be met by all sorts of outrage no matter what they do. A great number of people in our country choose to be misinformed, and that is their right, even though it's bad for our country for them to do so.
 
Yes, they should be, if something important contains misinformation that people might believe and that could cause harm.
Exactly. The operative words are "it could cause harm". If something posted on their platform can cause harm, then it is their duty not to allow it to be disseminated. Hate speech is not "free speech". Nor are posts that call for violence against others, or misinformation designed to create upheaval, riots, and crimes against others.
 

JonDouglas

Senior Member
Location
New England
Exactly. The operative words are "it could cause harm". If something posted on their platform can cause harm, then it is their duty not to allow it to be disseminated. Hate speech is not "free speech". Nor are posts that call for violence against others, or misinformation designed to create upheaval, riots, and crimes against others.
Given that some people today tend to label anything they don't like hearing as "hate speech", the problem returns to who decides. An example of this was when some retards (or trolls) started calling "Wuhan or China Flu" hate speech. Unless you are for unrestrained totalitarianism, you certainly don't want the government telling you what is or isn't truth or hate speech. Sooner or later, the Supreme Court is likely to weigh in on this. As for Facebook and Twitter, I don't use either, as they are not much good for anything I do or enjoy.
 
Given that some people today tend to label anything they don't like hearing as "hate speech", the problem returns to who decides. An example of this was when some retards (or trolls) started calling "Wuhan or China Flu" hate speech. Unless you are for unrestrained totalitarianism, you certainly don't want the government telling you what is or isn't truth or hate speech. Sooner or later, the Supreme Court is likely to weigh in on this. As for Facebook and Twitter, I don't use either, as they are not much good for anything I do or enjoy.
I agree. But is calling Covid "the China flu" going to harm others? I don't think so. I believe what Sassycakes and I are stressing is misinformation that will cause actual harm to others.
 

Irwin

Senior Member
"Causing harm" is in the eye of the beholder. What needs to be done instead is to remove Section 230 protections from Facebook, Twitter, etc., so that they can be sued for harmful content posted on their websites. As it currently stands, they have immunity; that immunity needs to be removed.
I don't agree.

That's like being able to sue gun manufacturers if one of their guns is used in a crime, which some people would like to be allowed, but it doesn't make sense to be able to sue someone for doing something that's perfectly legal. If what they're doing is wrong, legislators should pass laws making it illegal. Most people who buy guns never use them in the commission of crimes; it's the people who use them to commit crimes who should be punished. Weapons like the AR-15, which are so deadly that anyone who can point one and pull the trigger can kill dozens of people in a few minutes, should be illegal. But as long as they're legal, manufacturers shouldn't be held responsible if their weapons are used to commit mass murder. The exception is if the gun malfunctions and causes death or injury.

Or it's like being able to sue road builders if somebody drives the wrong way and kills somebody in a head-on crash. The driver is responsible, as long as the road isn't poorly designed or flawed in some way that makes it confusing which way to go.

If a politician goes on Twitter and claims covid-19 is no more dangerous than the flu, it shouldn't be the responsibility of Twitter to fact-check him or her. The argument could be made that we should have the right to sue said politician for making dangerous statements if we believe them and, as a result, we suffer in some way. The problem is, it's perfectly legal for politicians to lie.

There are some media outlets with pundits who lie about pretty much everything. If we're going to hold Twitter and Facebook responsible for lies made by people who use those social media outlets, then the cable news channels that lie also need to be held responsible because the liars work directly for those outlets. In fact, we should go after them first, but that's not going to happen.
 

Who is responsible for controlling the dissemination of misinformation?


My sense on this is to err on the side of free speech. These are privately owned websites; I hate to see the government getting heavily involved. As to liability, I think that needs to fall on the person or organization doing the posting. That is who should be held responsible. To expect the websites to police and censor is asking a lot.

I disseminate plenty of misinformation.
 

AnnieA

Senior Member
Location
Down South
An example of this was when some retards (or trolls) started calling "Wuhan or China Flu" hate speech.

I don't think it should be prosecutable 'hate speech', but I do think it painted a target on Asians for those inclined to racism, and I thought people calling it that were fools with a capital 'F' from the get-go. The people of Wuhan and China as a whole are as much victims of the pandemic as we are. It was the Chinese Communist Party's cover-up in late 2019 and early 2020 that caused the world and its own citizens great harm. Some of their people who were trying to tell the rest of us how bad things were went missing and still haven't been heard from.

CCP Virus (Chinese Communist Party virus) is the truly accurate term if you want to go beyond the nomenclature of SARS-CoV-2 or Covid-19 into a blame name. Just think of how different the world would be had Chinese doctors and scientists been free to reach out to colleagues around the world before cases got out of control, and how much precious research time was lost to the world thanks to the Communist cover-up.
 
I don't agree.

Social platforms like Facebook and Twitter cannot have it both ways, which they do now. For example:

  • Facebook or Twitter can favor certain posts, commentary by celebrity members, or political viewpoints and censor or outright ban those comments or political viewpoints they don't like in order to slant public opinion one way rather than another.
  • Yet, if a Facebook or Twitter post these companies favor leads to some harmful reaction that can be traced back to that specific post or tweet, then Section 230 kicks in to protect Facebook or Twitter from actionable retribution since "they didn't make that post or tweet - a member did".
As others have pointed out, however, this is an attempt by social media companies like Facebook and Twitter to control the opinions and viewpoints of their members while basking under the legal protection of Section 230, which gives them immunity from lawsuits over content posted on their platforms. In short, they claim they are "not publishers" while exercising the "editorial control" of a publisher. They want the best of both worlds: protection from lawsuits and total control over their members' opinions and viewpoints. Repealing Section 230 would still allow social media like Facebook and Twitter to control their online narrative, but they would no longer be protected from libel lawsuits for something their members might say.

On the other hand, if social media platforms want to keep the legal safety shield of Section 230, then they need to stop all censorship of their sites, save for anything that is outright illegal. Other than that, their members would be free to say whatever they pleased against whomever they pleased. If they committed libel, for example, only they would be responsible, not Facebook or Twitter.

 

Gary O'

Well-known Member
Location
Oregon

Who is responsible for controlling the dissemination of misinformation?


Pretty sure the communists have cornered the market

aaaaand......they've been pretty good on the semination end, too
 

Nathan

Senior Member
Should it be the responsibility of Facebook and Twitter to stop the dissemination of false information? Why or why not?
Facebook and Twitter do try to stop people from using their web properties as a platform for dissemination of false information, as is their right.
Example: I have the absolute right to prevent people from planting posters and signs in my yard. It's that simple.
 

Irwin

Senior Member
Facebook and Twitter do try to stop people from using their web properties as a platform for dissemination of false information, as is their right.
Example: I have the absolute right to prevent people from planting posters and signs in my yard. It's that simple.
That's exactly right. Facebook and Twitter are private companies and they can ban whatever and whomever they want. If they allow illegal content, then maybe they should be held liable, but otherwise, like you said, it's like a homeowner deciding what can be put in their yard.
 
For most of our 200+ year history, we have done pretty well allowing free speech and letting people say what they want. We seem to be pretty good at ignoring hate speech and the like; in fact, I think giving these folks a platform helps us understand who they are and why we should not pay attention.

When I was a student at LSU, long ago, we had a "free speech alley": anyone could speak from the steps of the Union Building. A young David Duke was a frequent speaker, and I listened to him several times. It was good for me; I went away thinking a lot less of him and the Klan than I had before, and I know I am not the only one. Hearing the real, raw thinking of those folks was a good wake-up. To this day, when I hear someone defend him as "not so bad," I have personal experience with which to challenge what is being said. Today I suppose someone would try to censor him for hate speech. It certainly was hate speech, and easy to recognize as such, but letting him vent was probably better than trying to stop him. For me it was better to have the chance to listen than not.

Growing up in the segregated South, I can think of lots of other examples. Like I said, hearing and remembering all that gives me a better perspective today, or I think it does anyway.
 

Judycat

Senior Member
Location
Pennsylvania
Reading or hearing someone's claptrap on social media shouldn't automatically make one a believer. Maybe these days it happens more than it should. People feel alienated and want to belong. Just learn the rhetoric and you're one of us. Blah. No you're not. You're just another dope they add to the roster called Followers. Followers don't get to make any rules, they don't get to do much of anything except show up and keep the leaders happy. Yippee!
 
The people who intentionally manufacture misinformation are the ones responsible for it. If they post that deliberate misinformation on privately held media platforms, those platforms can take whatever steps they deem necessary. And if that misinformation leads to violence or harm to anyone, then those perpetrators should be held liable and suffer the consequences. Not every utterance is "free speech", and if there is a question of deliberate misinformation, we have the courts to settle the matter.
 
Facebook and Twitter do try to stop people from using their web properties as a platform for dissemination of false information, as is their right.
Example: I have the absolute right to prevent people from planting posters and signs in my yard. It's that simple.
That's exactly right. Facebook and Twitter are private companies and they can ban whatever and whomever they want. If they allow illegal content, then maybe they should be held liable, but otherwise, like you said, it's like a homeowner deciding what can be put in their yard.
Within living memory, Americans once had the right to work hard and enjoy the fruits of their labor - whether it was owning a modest house in the suburbs or a 100-unit apartment building. Back then, property owners could rent or sell to whomever they pleased, without explanation to anyone, including any government bureaucrat. For example, if an apartment building owner didn't want to rent to you for any reason whatsoever, it was his unquestioned right to refuse. In short, he did not have to justify or explain his reasons to anyone. And why? Because he owned the property.

Fast forward to today: since there is a clear precedent for stripping property rights from past business owners "for the public good" then the business owners of today have no moral grounds to bleat about "property rights" being stripped from them. Now, we're senior citizens here, right? Certainly some of you were alive when we Americans enjoyed "the right of free association"? We could start a club, buy a clubhouse, and only grant membership to whom we pleased. Remember that? That was the right of free association. Well, that right too was stripped away. Now you can be legally forced to allow people into your club whether you like it or not. The reason? "The public good". So whether we're speaking of a private home, an apartment building, a men's club, a cake bakery, or a 21st century Internet social media platform, the principle is the same: If your "right of ownership" interferes with what the government deems to be "the public good" then your "right of ownership" can be legally modified or simply done away with so that you cannot exclude others from what your business offers.

Some will disagree, and that's fine. Many will. But American citizens have been steadily stripped of their property rights for decades now, as well as their right to freely associate with only those people they wish to. As I see it, if businesses like Facebook and Twitter want to kick people off their platforms (whose opinions they don't like) then we need to restore the rights of house and apartment owners to be able to rent or sell to only those people they wish to. For either our laws are rooted in consistency and logic, or they are simply arbitrary constructs to be changed at whim.
 

Irwin

Senior Member
Reading or hearing someone's claptrap on social media shouldn't automatically make one a believer. Maybe these days it happens more than it should. People feel alienated and want to belong. Just learn the rhetoric and you're one of us. Blah. No you're not. You're just another dope they add to the roster called Followers. Followers don't get to make any rules, they don't get to do much of anything except show up and keep the leaders happy. Yippee!
You nailed it right there! People want to feel like they belong and fit in somewhere, which is what organized religion and cults provide. It is also the main recruiting tool of terrorists. They prey on young men from broken homes who desperately want some kind of family structure in their lives.

The need to fit in is inherent in all of us, as it is in most animals. It's something that helped humans survive as a species since we were better able to survive in groups than if we were alone on the Serengeti millions of years ago.
 

Irwin

Senior Member
Within living memory, Americans once had the right to work hard and enjoy the fruits of their labor - whether it was owning a modest house in the suburbs or a 100-unit apartment building. Back then, property owners could rent or sell to whomever they pleased, without explanation to anyone, including any government bureaucrat. For example, if an apartment building owner didn't want to rent to you for any reason whatsoever, it was his unquestioned right to refuse. In short, he did not have to justify or explain his reasons to anyone. And why? Because he owned the property.

Fast forward to today: since there is a clear precedent for stripping property rights from past business owners "for the public good" then the business owners of today have no moral grounds to bleat about "property rights" being stripped from them. Now, we're senior citizens here, right? Certainly some of you were alive when we Americans enjoyed "the right of free association"? We could start a club, buy a clubhouse, and only grant membership to whom we pleased. Remember that? That was the right of free association. Well, that right too was stripped away. Now you can be legally forced to allow people into your club whether you like it or not. The reason? "The public good". So whether we're speaking of a private home, an apartment building, a men's club, a cake bakery, or a 21st century Internet social media platform, the principle is the same: If your "right of ownership" interferes with what the government deems to be "the public good" then your "right of ownership" can be legally modified or simply done away with so that you cannot exclude others from what your business offers.

Some will disagree, and that's fine. Many will. But American citizens have been steadily stripped of their property rights for decades now, as well as their right to freely associate with only those people they wish to. As I see it, if businesses like Facebook and Twitter want to kick people off their platforms (whose opinions they don't like) then we need to restore the rights of house and apartment owners to be able to rent or sell to only those people they wish to. For either our laws are rooted in consistency and logic, or they are simply arbitrary constructs to be changed at whim.
You can still discriminate against someone if you don't like their political beliefs or some organization that person belongs to. You just can't discriminate because of a person's sex, race, ethnicity, or religion. I'm not sure the purpose of that is the "public good." I think it's more to protect certain groups of people who have characteristics they have no control over.

The Constitution protects a person from religious discrimination, I believe. Gay people are protected by the Civil Rights Act of 1964, so you can't discriminate against them, either.
 

Pepper

Well-known Member
Location
NYC
The Constitution protects a person from religious discrimination, I believe. Gay people are protected by the Civil Rights Act of 1964, so you can't discriminate against them, either.
Time Waits 4 No Man meant more diverse discrimination than just religion & gay folks. He forgot about Interstate Commerce and how that applies.
 