
Should Facebook, Google be liable for user posts? asks U.S. Attorney General Barr

Feb 21, 2020 12:01 AM
#1

Offline
Jul 2015
5421
https://www.reuters.com/article/us-internet-regulation-justice-idUSKBN20D26S



WASHINGTON (Reuters) - U.S. Attorney General William Barr on Wednesday questioned whether Facebook, Google and other major online platforms still need the immunity from legal liability that has prevented them from being sued over material their users post.

“No longer are tech companies the underdog upstarts. They have become titans,” Barr said at a public meeting held by the Justice Department to examine the future of Section 230 of the Communications Decency Act.

“Given this changing technological landscape, valid questions have been raised about whether Section 230’s broad immunity is necessary at least in its current form,” he said.

Section 230 says online companies such as Facebook Inc, Alphabet Inc’s Google and Twitter Inc cannot be treated as the publisher or speaker of information they provide. This largely exempts them from liability involving content posted by users, although they can be held liable for content that violates criminal or intellectual property law.

Barr’s comments offered insight into how regulators in Washington are reconsidering the need for incentives that once helped online companies grow but are increasingly viewed as impediments to curbing online crime, hate speech and extremism.

The increased size and power of online platforms has also left consumers with fewer options, and the lack of feasible alternatives is a relevant discussion, Barr said, adding that the Section 230 review came out of the Justice Department’s broader look at potential anticompetitive practices at tech companies.

Lawmakers from both major political parties have called for Congress to change Section 230 in ways that could expose tech companies to more lawsuits or significantly increase their costs.

Some Republicans have expressed concern that Section 230 prevents them from taking action against internet services that remove conservative political content, while a few Democratic leaders have said the law allows the services to escape punishment for harboring misinformation and extremist content.

Barr said the department would not advocate a position at the meeting. But he hinted at the idea of allowing the U.S. government to take action against recalcitrant platforms, saying it was “questionable” whether Section 230 should prevent the American government from suing platforms when it is “acting to protect American citizens.”

Others at the meeting floated different ideas.

The attorney general of Nebraska, Doug Peterson, noted that the law does not shield platforms from federal criminal prosecution; the immunity helps protect against civil claims or a state-level prosecution. Peterson said the exception should be widened to allow state-level action as well. Addressing the tech industry, he called it a “pretty simple solution” that would allow local officials “to clean up your industry instead of waiting for your industry to clean up itself.”

Matt Schruers, president of the Computer and Communications Industry Association, which counts Google and Facebook among its members, said such a solution would result in tech giants having to obey 50 separate sets of laws governing user content.

He suggested law enforcement’s energies might be better spent pursuing the millions of tips that the tech industry sent over every year, only a small fraction of which, he noted, resulted in investigations.

“There appears to be some asymmetry there,” he said.

Others argued that different rules should apply to different platforms, with larger websites enjoying fewer protections than internet upstarts.

“With great scale comes great responsibility,” said David Chavern, of the News Media Alliance, whose members have bristled as Google and Facebook have gutted journalism’s business model.

But other panelists argued that distinguishing one site from another might be tricky. For example, would platforms like Reddit or Wikipedia, which have large reach but shoestring staffs, be counted as big sites or small ones?

The panelists also briefly debated encryption, another area over which Barr has pressed the tech industry to change its modus operandi. Facebook, in particular, has drawn the ire of U.S. officials over its plans to secure its popular messaging platform.

Kate Klonick, a law professor at St. John’s University in New York, urged caution.

“This is a massive norm-setting period,” she said, with any alterations to one of the internet’s key legal frameworks likely to draw unexpected consequences. “It’s hard to know exactly what the ramifications might be.”


---------------------------------------------------

The answer is simple tbh.
If an online platform doesn't treat itself like a public square, i.e. it picks and chooses which legal user-generated content stays up and censors everything else for whatever reason,
then yes, it should be held accountable.
Feb 21, 2020 12:12 AM
#2

Offline
Feb 2010
11935
Private companies like Google and Facebook shouldn't be treated like public spaces, period. That's giving them too much power in the first place. At least according to people on this board.

Got to love people's logic on this site.

First they're like:

ISPs shouldn't be made public utilities, "that's communism."


Now they're like:
Facebook and Google ban people because they're a private company, so they should be made into a public utility.
GrimAtrament - Feb 21, 2020 12:18 AM
"among monsters and humans, there are only two types.
Those who undergo suffering and spread it to others. And those who undergo suffering and avoid giving it to others." -Alice
“Beauty is no quality in things themselves: It exists merely in the mind which contemplates them; and each mind perceives a different beauty.” David Hume
“Evil is created when someone gives up on someone else. It appears when everyone gives up on someone as a lost cause and removes their path to salvation. Once they are cut off from everyone else, they become evil.” -Othinus

Feb 21, 2020 1:17 AM
#3

Offline
Dec 2012
16083
hazarddex said:
Private companies like Google and Facebook shouldn't be treated like public spaces, period. That's giving them too much power in the first place. At least according to people on this board.

Got to love people's logic on this site.

First they're like:

ISPs shouldn't be made public utilities, "that's communism."


Now they're like:
Facebook and Google ban people because they're a private company, so they should be made into a public utility.
"Private companies."

Yeah okay. They didn't receive multi-billion-dollar deals & exclusive contracts/privileges from the most powerful governments in the world or anything. And it's not like they control the mainstream flow of information on the net either. If Facebook banned all anti-Bloomberg sentiment starting today, that wouldn't have any impact on Bernie's performance in the primaries whatsoever. Because information is just a meme and everyone can just look it up on Dogpile or DuckDuckGo. Better yet, just start your own search engine. And if you get DDoS'd by trolls paid for by Google, tough shit. You should have had millions of $$$ ready for Cloudflare.
Feb 21, 2020 8:02 AM
#4

Offline
Jan 2009
93010
news said:
Some Republicans have expressed concern that Section 230 prevents them from taking action against internet services that remove conservative political content, while a few Democratic leaders have said the law allows the services to escape punishment for harboring misinformation and extremist content.


lol I agree, and it's ironic that conservative content is usually what's seen as extremist and misinformation content anyway
Feb 21, 2020 5:46 PM
#5

Online
Mar 2008
47123
Well, they should be held accountable at most for negligence if that's the case, not the crime itself, unless they were knowingly complicit and actively looking the other way; then they should face full charges. If it's not a crime, they should not be liable to a lawsuit unless they failed to enforce their own guideline policy for user conduct, because if a user is bound by law (to a degree) to the site's agreement, the agreement should work both ways: agreeing to it means the site agrees to enforce its guidelines.
traed - Feb 22, 2020 1:16 AM
Feb 21, 2020 6:10 PM
#6
Dragon Idol

Offline
May 2017
7120
Social media is already plenty restrictive with what you can and cannot say.
It will only turn them into even worse echo chambers where only one opinion is allowed.
This will have the very nasty side effect of pushing potentially dangerous thoughts back into the shadows where they used to be, making it harder to track down radicalized individuals and rescue any potential victims they may claim.
Feb 21, 2020 8:17 PM
#7

Offline
Nov 2008
27790
Removing Section 230 is a bad idea; however, there should be a law banning the removal of non-extremist content for political reasons, along with media reform.


Feb 21, 2020 8:26 PM
#8

Offline
Oct 2012
15984
It is ridiculous to hold content platforms responsible for their users. It would end social media as we know it. It would drastically alter how the internet is used. It underlines the fact that politicians are unable to adapt to the new age and are ignorant of how the world works. This isn't the 1980s, when newspaper submissions were individually reviewed. The internet works on an automatic basis. Imagine if an employee had to carefully vet everything you post everywhere. Or is it that, after years of dismissing AI, they're going to force internet companies to research AI that automatically blocks posts with any semblance of offensiveness?

Is it that Americans were just jealous of the Chinese police state all along?
My subjective reviews: katsureview.wordpress.com
THE CHAT CLUB.
Feb 21, 2020 9:08 PM
#9

Offline
May 2013
13109
I definitely think Facebook should be accountable for the state of their site.

It's not exactly easy to explain that a website is completely cancerous, hahah, but that doesn't make it any less of a serious problem.

How can the state encourage or force websites to clean up their act? You can't sue them for something a user said; I don't think that makes any sense. And no, I'm not sure Barr's approach is right at all.

But I am all for being tough on internet giants.
I CELEBRATE myself,
And what I assume you shall assume,
For every atom belonging to me as good belongs to you.
Feb 21, 2020 10:15 PM

Offline
Jul 2015
5421
Hoppy said:
Removing Section 230 is a bad idea; however, there should be a law banning the removal of non-extremist content for political reasons, along with media reform.

they're working on it
https://www.congress.gov/bill/116th-congress/senate-bill/1914
Summary:
This bill prohibits a large social media company from moderating information on its platform from a politically biased standpoint.
Under current law, a social media company is generally immune from liability with respect to content posted on its platform by users and other content providers. However, the bill removes this statutory immunity unless the social media company obtains certification from the Federal Trade Commission that it does not moderate information on its platform in a manner that is biased against a political party, candidate, or viewpoint.

Apparently members of both parties, like Graham, Ted Cruz, Hawley (author of the above bill), Beto, Buttigieg, Bernie, and Pelosi, have expressed interest in stricter regulation of big tech

“230 is a gift to them, and I don’t think they are treating it with the respect that they should,”
“And so I think that that could be a question mark and in jeopardy. ... For the privilege of 230, there has to be a bigger sense of responsibility on it, and it is not out of the question that that could be removed.”
- Pelosi

Feb 23, 2020 3:11 PM

Offline
Aug 2009
5519
Making Google, Facebook, and others liable for users' posts would amount to making the makers of semi-truck trailers liable when criminals use those trailers to transport drugs and other illegal goods. It's an idiotic idea to make Google and Facebook liable for users' posts.
Feb 23, 2020 3:30 PM

Online
Mar 2008
47123
ezikialrage said:
Making Google, Facebook, and others liable for users' posts would amount to making the makers of semi-truck trailers liable when criminals use those trailers to transport drugs and other illegal goods. It's an idiotic idea to make Google and Facebook liable for users' posts.

What a terrible analogy. It's more comparable to the legal liability a landlord has.
Feb 23, 2020 4:15 PM

Offline
Oct 2012
15984
traed said:
ezikialrage said:
Making Google, Facebook, and others liable for users' posts would amount to making the makers of semi-truck trailers liable when criminals use those trailers to transport drugs and other illegal goods. It's an idiotic idea to make Google and Facebook liable for users' posts.

What a terrible analogy. It's more comparable to the legal liability a landlord has.
That's a terrible analogy. A landlord can reasonably be expected to vet his tenants in today's world, although I would say it's still ridiculous to hold landlords responsible for their tenants. A website cannot possibly vet every user because posting is automatic. This is especially true as we talk about privacy concerns and not collecting personal information from users.

I prefer ezikialrage's analogy. An online platform is not an organizational structure in which users are employees of a business, such that the business is responsible for its employees. No one would draw that equivalence. Users are more like customers of the platform, who use the platform on a license basis. Now imagine suing Verizon because someone used the internet to watch child porn, or suing a car dealership because someone went on a car chase in a leased car. Those are more comparable business models, because no one reasonably expects Verizon or car dealerships to run criminal checks and investigate their customers for security clearance. In fact, if they did that, there would be lawsuits over privacy. If Mark Zuckerberg hired someone to stalk your Facebook account and message you warnings about things you post, there would be privacy lawsuits.

It's a blatant disconnect from the world we live in to expect massive open technologies to function the same way a parent in a nuclear family is responsible for their children.
My subjective reviews: katsureview.wordpress.com
THE CHAT CLUB.
Feb 23, 2020 4:30 PM

Online
Mar 2008
47123
katsucats said:
traed said:

What a terrible analogy. It's more comparable to the legal liability a landlord has.
That's a terrible analogy. A landlord can reasonably be expected to vet his tenants in today's world, although I would say it's still ridiculous to hold landlords responsible for their tenants. A website cannot possibly vet every user because posting is automatic. This is especially true as we talk about privacy concerns and not collecting personal information from users.

I prefer ezikialrage's analogy. An online platform is not an organizational structure in which users are employees of a business, such that the business is responsible for its employees. No one would draw that equivalence. Users are more like customers of the platform, who use the platform on a license basis. Now imagine suing Verizon because someone used the internet to watch child porn, or suing a car dealership because someone went on a car chase in a leased car. Those are more comparable business models, because no one reasonably expects Verizon or car dealerships to run criminal checks and investigate their customers for security clearance. In fact, if they did that, there would be lawsuits over privacy. If Mark Zuckerberg hired someone to stalk your Facebook account and message you warnings about things you post, there would be privacy lawsuits.

It's a blatant disconnect from the world we live in to expect massive open technologies to function the same way a parent in a nuclear family is responsible for their children.


Look at my original post before that one to get a better idea of what my view is. It's not about vetting; it's about contractual agreement and direct involvement, and perhaps negligence, but I wouldn't go overboard on that.

That a landlord is responsible for upholding their side of the contractual agreement was my main focus, but yes, they have other legal obligations, though it varies. For one thing, they are required to take reasonable precautions such as providing locks for doors (though I've never heard of any place that didn't have locks) and adequate lighting outside.

His analogy is bad because once a truck is sold it's no longer under any control of the manufacturer. I used landlords because websites can be thought of as communities under private ownership and management.
Feb 23, 2020 5:29 PM

Offline
Jul 2015
5421
ezikialrage said:
Making Google, Facebook, and others liable for users' posts would amount to making the makers of semi-truck trailers liable when criminals use those trailers to transport drugs and other illegal goods. It's an idiotic idea to make Google and Facebook liable for users' posts.

A better analogy is that you wouldn't sue common carriers like telecom companies or public utilities just because mass shooters/terrorists used their services.
However, those businesses don't discriminate against users or censor/promote specific people, content, or agendas. If these big online platforms want to act like publishers so much, they ought to be treated as such.
Feb 23, 2020 5:48 PM

Offline
Jul 2015
5421
It's such a shame that there were so many people on MAL calling for the shutdown of 8chan, and celebrating when it got temporarily deplatformed, when it itself was only a platform, a truer 'platform' than any of these big social media sites.
Yet a million users lost access to it because of literally just a handful of violent assholes on a few specific boards looking for notoriety. Meanwhile, Facebook can stream live mass shootings and Instagram can host images of beheaded women with no outrage against the platforms themselves.
The double standard is ridiculous.
Feb 23, 2020 6:27 PM

Offline
Oct 2013
5174
Of course, one would need to ask who would be in charge of categorizing content as harmful. Of course it would be the state, but how would we know they are being unbiased? It was a rhetorical question: we can't.

On the other hand, it is obvious social media has become a parallel space for public discourse; and, as we've always done, public discourse needs to be regulated.
These companies should not be the ones made liable, in my opinion. It should be the people who share unacceptable content who are made liable, for it was they, as posters, who behaved unacceptably in the first place.

Is it thought policing? It's not; your thoughts are reserved only for yourself, and as we've always been told, you should keep certain thoughts to yourself.
Feb 24, 2020 2:54 PM

Offline
Oct 2012
15984
traed said:
That a landlord is responsible for upholding their side of the contractual agreement was my main focus, but yes, they have other legal obligations, though it varies. For one thing, they are required to take reasonable precautions such as providing locks for doors (though I've never heard of any place that didn't have locks) and adequate lighting outside.
What would be the analogy of locks to Facebook? User login security? I would agree if that's the case, but I'm not sure that's relevant to the thread topic.
My subjective reviews: katsureview.wordpress.com
THE CHAT CLUB.
Feb 24, 2020 3:27 PM

Offline
Sep 2016
380

Hoppy said:
Removing Section 230 is a bad idea; however, there should be a law banning the removal of non-extremist content for political reasons, along with media reform.



For websites like Google, Bing, and other engines that are more or less required for the free flow of information, sure.


However, websites like Twitter, Facebook, Instagram, etc. really have no reason for a law like that to cover them,

mainly because they're communication platforms in a sea of dozens.

And you have to remember that the majority of the time someone is banned from a website like Twitter, it's because they did something wrong or have a history of being inflammatory.
I've surpassed your limit!
This topic has been locked and is no longer available for discussion.
