Meta and YouTube found guilty of offering products that people like

RambleTamble

Member
Location: U.S.
Meta and YouTube have been found liable for negligence for deliberately designing addictive products that hooked a young user and led to her being harmed, a jury ruled on Wednesday. The tech companies have also been found liable for failure to warn. The jury awarded the plaintiffs in the case compensatory damages of $3m.

It took nearly nine days of deliberations for the Los Angeles jury to reach its verdict. Jurors also awarded punitive damages, which will be decided during the next phase of the trial.
Meta and YouTube designed addictive products that harmed young people, jury finds

WTF are they supposed to do? Make products that people don't like?

Some are comparing this case to the lawsuits against cigarette manufacturers back in the '90s. It's nothing like that. Cigarette companies lied about the health risks of their products, which have been proven to cause cancer and other health problems -- even if you're just around smokers.

Facebook and YouTube create products for entertainment, mainly. What if somebody gets addicted to Seinfeld reruns? Could they sue Jerry and Larry David for making an entertaining product?

What if somebody watches the series Landman and believes the propaganda it disseminates about renewable energy? Could the producers of the show be sued if somebody goes out and attacks a worker at a wind turbine company?

It's up to parents to monitor what their children do on social media. If the kid harms themselves or someone else, maybe the parent should be held liable.
 

I'm kidless, so I'm just dibbing. But if your kid is under ten, don't you have some say in what they watch online? #headsmack
 
I asked Gemini to explain how Facebook et al. were culpable. This is part of what it gave me.

In the K.G.M. v. Meta & YouTube case, internal documents revealed that the companies were aware of the addictive nature of their platforms, including evidence that Instagram targeted "tweens" and had high retention rates among 11-year-olds. Documents highlighted that features such as the "infinite scroll" and notifications were designed to foster compulsive use, contributing to a $6 million verdict against the companies.

Evidence presented in court suggested that Meta actually halted internal research (specifically "Project Mercury" in 2020) after it found that deactivating Facebook led to lower levels of depression and anxiety. Plaintiffs argued that rather than warning users, the companies buried this evidence to "win big with teens" and keep engagement high for advertisers.

It sounds like all they had to do was notify users every once in a while that there could be dangers associated with scrolling for long periods of time, and if they had done that, they would have been fine. But doing so might have affected their profits, so they didn't. Instead, they argued that it was "free speech," so they weren't obligated to protect young users.
 
Screen addiction. And like other addictions, until the addict wants to change, they won't.

That being said, young people who have yet to live real life in the adult world are at a disadvantage: not only are they more easily fooled and pulled into the addiction, they don't have -- or don't think they have -- a real life to return to as a sober person.

Enabling parents are part of the issue. The parents need to treat screen time like dessert. Just as they can tell children no, they can't have 10 ice cream cones for dinner, they need to tell them no, they cannot spend every waking hour on their phones or computers.
 
There is a difference between providing content that viewers like and providing algorithms that steer them into topics that are harmful. Let's say a teenager likes a video of a skinny girl dancing and is then led into a tutorial on anorexia. That is very different from providing simple videos that viewers like. It is the algorithms these providers use that make them culpable, not their overall content.

Or what about depressed teens who are guided into videos about suicide? And don't tell me a parent can control a 16-year-old's viewing habits.

Did TikTok videos inspire a teen’s suicide? His mom says she found graphic evidence
 
I'm confused. This seems to be in conflict with your earlier statement about providing content that viewers want. If you've done more research and have changed your position I'm good with it! Don't mean to come down so hard. You are one of my faves here.
 
Yep, that's exactly what it means. I wasn't aware of some of the details of the case. It turns out that they were intentionally getting kids addicted to their content and hiding the dangers, all just to increase their profits. And all they had to do was put up a little warning, but that was too much for them because it's "free speech." That was their defense.

We shall see how this case evolves in their appeals and if others follow suit.

And thanks. :)
 