
So the question of legal liability for what users do is a controversial one. One reason for America's dominance in the tech world is that its Section 230 legislation gives websites legal protection from being held liable for what their users do. Many do not like this, and as a result many countries do not operate this way. Telegram's founder was recently arrested in France because France argues that Telegram was used to break French law.

 

I personally do not think platforms should be held liable for what their users do, but I will not state my reasoning, for the sake of discussion. 

 

I do not deny that a country has the right to do this, in the same way the United States of America has the sovereign right to ban TikTok, for example. But I do not think they should, and having the right to do something does not place the act above critique. 


Unless the platform was created with the explicit intent to break the law or bring harm to someone, it stands to reason it should not be liable. That said, if someone posts on the platform to try to rally others to harm someone, and the platform takes no action (like contacting authorities, removing the post, etc.), then it could be held liable.


35 minutes ago, SS451 said:

I personally do not think platforms should be held liable for what their users do but I will not state my reasoning why for the sake of discussion. 

Same. Such a precedent would mean that logging companies should be liable for any piece of paper one uses to commit illegal activities. Alas, the real problem here is that a subset of humans will always find ways to do or be evil. We should focus on ways to reduce the desire to do or be evil, and on ways for everyone to live an affordable, equitable, happy, fulfilling lifestyle. 🙂

Desktop: KiRaShi-Intel-2022 (i5-12600K, RTX2060) Mobile: OnePlus 5T | [REDACTED] - 50GB US + CAN Data $34/month
Laptop: Lenovo Yoga 7i (16") 82UF0015US (i7-12700H, 16GB/2TB RAM/SSD, A370M GPU) Tablet: Lenovo IdeaPad Duet 10.1
Camera: Canon M6 Mark II | Canon Rebel T1i (500D) | Canon SX280 | Panasonic TS20D Music: Spotify Premium (CIRCA '08)


49 minutes ago, SS451 said:

section 230 legislation gives websites legal protection from being held liable for what their users do.

Didn’t work for Cox Communications; they were successfully sued over their users committing acts of piracy.

 

I just want to sit back and watch the world burn. 


"Should platforms be held responsible for what their users do?"

No.


I am a Moderator, but I am fallible. Discuss or debate with me as you will but please do not argue with me as that will get us nowhere.

 


Character is like a Tree and Reputation like its Shadow. The Shadow is what we think of it; The Tree is the Real thing.  ~ Abraham Lincoln

You have enemies? Good. That means you've stood up for something, sometime in your life.  ~ Winston Churchill

Reputation is a Lifetime to create but takes only seconds to destroy.

Docendo discimus - "to teach is to learn"

 

  



Even if platforms aren't held legally responsible, they can and will still make changes based on advertiser and media pressure. 
 

Just look at how many subreddits from the last 15 years no longer exist. They have changed the /r/all algorithm many times as well.


yes but no.

 

to a degree, a platform is responsible for keeping a clean house, for example not allowing distribution of pirated content.

(the forum CoC would be the example here)

 

but to say the owner of a platform would be on the hook for a lawsuit if someone shared a shady link and moderation didn't catch it... that's a bridge too far imo.

 

there's a saying in Dutch that roughly translates to "manage like a good housefather" that comes to mind here.

you can't blame the head of the household for a teen doing petty crimes, but if they refuse to make any effort to keep their spawn on the rails, that is problematic just the same.


5 hours ago, SS451 said:

 I will not state my reasoning why for the sake of discussion.

That is a shitty cop-out. If you post a topic like this, at least have the balls to participate in the discussion. You have a bad reputation for this exact behavior here. Own up and contribute, or don't bother posting these topics.


I honestly don't understand what Section 230 has to do with why American companies are so big in the tech space, as most companies don't have anything to do with websites but rather with hardware, software, or services. In Sweden we have a law that regulates some of what can be posted on websites, but that hasn't done anything in regards to companies like Spotify or Klarna.

 

I think that, to a certain extent, anyone who offers a service should be somewhat responsible for material that is posted. The basis for this regulation should be objective, it should be set out specifically, and the list of "forbidden" material (for lack of better words) should be as small as possible.

Don't mistake industrial/warehouse automation for AI automation. You could automate manufacturing in the 1960s.

 

English is not my native language, so sometimes I might not make total sense.

 

 

 


Responsible for users? No.

 

I generally buy into the whole "are you a publisher or not" idea. 

 

If you have employees (full-time or freelance), and especially if you edit the content, then yes. (So newspapers, etc.)

 

If you're social media (or something else similar to that), with 'users' then no.

 

There are probably some interesting grey areas in there that would have to be looked at case by case, but I like that general philosophy. 

 

 

5800X3D | 32GB RAM | RTX 4070 | 1TB NVME (boot) | 2TB NVME (storage) | B550M DS3H | Samsung NU8000 65" 1440p 120hz | 5.1 Surround Sound

 


9 hours ago, manikyath said:

a degree, a platform is responsible for keeping a clean house, for example not allowing distribution of pirated content.

It’s not an ISP or service provider's job to be law enforcement. It’s a court's job to decide if someone is innocent or guilty. 



1 hour ago, Donut417 said:

It’s not an ISP or service provider's job to be law enforcement. It’s a court's job to decide if someone is innocent or guilty. 

 

Agreed. How do you even verify if someone is the rightful owner of something? Half the time you can't. 


 


14 hours ago, SS451 said:

One reason for America's dominance in the tech world is that its Section 230 legislation gives websites legal protection from being held liable for what their users do. Many do not like this, and as a result many countries do not operate this way. Telegram's founder was recently arrested in France because France argues that Telegram was used to break French law.

Many companies do operate like that, and the France example is a bad one. Had the US decided to, they could have gone after him for a similar thing. Overall, platforms are still protected from what their users do; in Telegram's case, France's effective claim was that Telegram should have the ability to monitor and moderate its communities when the authorities request it.

 

It should be noted as well that Section 230 doesn't give places a get-out-of-jail-free card... it's why YouTube's whole ContentID system was created: it was getting too costly to constantly defend against claims.

 

Section 230 only protects the providers as long as they are not aware of the situation.

 

Anyways, to answer your question: should platforms be responsible? Yes and no. The circumstances greatly control that answer.

 

Take The Pirate Bay, for example: I would 100% say they should be held responsible for copyright issues, because it's quite clear what content they intend their users to post.

 

Should YouTube be held responsible for someone uploading copyrighted material? No... although if someone reports it and YouTube ignores the request, yes.

 

So I would say it all depends: if the core business model is built around the concept that users will be using it for illegal purposes, then they should be liable.

3735928559 - Beware of the dead beef


17 minutes ago, wanderingfool2 said:

 

Should YouTube be held responsible for someone uploading copyrighted material, no...although if someone reports it and YouTube ignores the request yes.

 

 

But anyone can report anything. How are you supposed to confirm any of it? I understand the flip side... leave it all to go to court and copyright infringement will increase dramatically. But I don't know that the whole YouTube "assume every claim is valid" approach is very good either. It's all very messy.


 


38 minutes ago, Holmes108 said:

 

But anyone can report anything. How are you supposed to confirm any of it? I understand the flip side... leave it all to go to court and copyright infringement will increase dramatically. But I don't know that the whole YouTube "assume every claim is valid" approach is very good either. It's all very messy.

Through DMCA requests specifically... although copyright might not have been the best analogy on my part. If, let's say, there is some clearly illegal thing going on in a video, YouTube should be responsible for removing it once it's reported and they know about it.
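The "liable once you know" standard being argued here can be sketched as a toy notice queue: the platform is shielded until a credible report arrives, and only ignoring a verified report creates exposure. This is a hypothetical illustration only; `Notice`, `Platform`, and the `verified` flag are invented names, not the DMCA's or any real platform's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Notice:
    """A takedown report, e.g. a DMCA notice (hypothetical shape)."""
    content_id: str
    claim: str
    verified: bool = False  # did the claim pass a basic credibility check?

@dataclass
class Platform:
    # content_id -> uploader; merely hosting doesn't create liability
    content: dict = field(default_factory=dict)
    received: list = field(default_factory=list)

    def receive_notice(self, notice: Notice) -> str:
        """Once a credible notice arrives, the platform 'knows' and must act,
        or (per the argument above) it would lose its liability shield."""
        self.received.append(notice)
        if notice.verified and notice.content_id in self.content:
            del self.content[notice.content_id]  # takedown on actual knowledge
            return "removed"
        return "under review"  # unverified claims get review, not auto-removal
```

The point of the sketch is the asymmetry the posters are debating: before `receive_notice` is called the platform is protected for user content; after a verified notice, inaction rather than hosting is what would create liability.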

 

The prime example of how I think a company should be responsible (but really wasn't held responsible) is the old Twitter situation where an exploited kid requested that a video of him be taken down. They got all the relevant information from him, including that he was a minor at the time of the video, and decided to do nothing. Cops contacted them and they did nothing, and only after someone from Homeland Security contacted them did they remove it. Once they knew a video was of a minor, removal of the content should have been pretty much instant [as it was in clear violation of their rules, and even the law].



1 hour ago, wanderingfool2 said:

Through DMCA requests specifically... although copyright might not have been the best analogy on my part. If, let's say, there is some clearly illegal thing going on in a video, YouTube should be responsible for removing it once it's reported and they know about it.

 

The prime example of how I think a company should be responsible (but really wasn't held responsible) is the old Twitter situation where an exploited kid requested that a video of him be taken down. They got all the relevant information from him, including that he was a minor at the time of the video, and decided to do nothing. Cops contacted them and they did nothing, and only after someone from Homeland Security contacted them did they remove it. Once they knew a video was of a minor, removal of the content should have been pretty much instant [as it was in clear violation of their rules, and even the law].

 

 

Yeah, some examples are certainly more cut-and-dried, and I support a company taking action in those cases. If I'm 'reacting' to a full-length Disney movie on my channel, it seems reasonable to assume I don't own any Disney rights.

 

But then there's the situation where I try to copyright-strike you. Two nobodies. While we can see the clear difference on the surface, I don't know how to enshrine that into law. I guess that's where you get into using words like "reasonable", "believe", "expectations" and phrases like that when creating laws/regulations. I just wish it were somehow cleaner.


 


5 hours ago, Donut417 said:

It’s not an ISP or service provider's job to be law enforcement. It’s a court's job to decide if someone is innocent or guilty. 

the world isn't black and white.

 

also note i said platform, not ISP.


As with anything, there is nuance. Look at, say, a nuisance streamer on a service like Twitch or Kick: a streamer who is actively breaking laws while the platform provides positive monetary encouragement for them to profit off of bad behavior. If those platforms are not removing or actively banning these types of users, then they should be held liable for platforming them. Allowing chaos because you're making money off of chaotic people is not acceptable.

 

 

AMD Ryzen 5900X

T-Force Vulcan Z 3200mhz 2x32GB

EVGA RTX 3060 Ti XC

MSI B450 Gaming Plus

Cooler Master Hyper 212 Evo

Samsung 970 EVO Plus 2TB

WD 5400RPM 2TB

EVGA G3 750W

Corsair Carbide 300R

Arctic Fans 140mm x4 120mm x 1

 


On 12/4/2024 at 7:23 PM, SS451 said:

So the question of legal liability for what users do is a controversial one. One reason for America's dominance in the tech world is that its Section 230 legislation gives websites legal protection from being held liable for what their users do. Many do not like this, and as a result many countries do not operate this way. Telegram's founder was recently arrested in France because France argues that Telegram was used to break French law.

Hence the consequences of operating a website or service that is illegal... somewhere.

 

On 12/4/2024 at 7:23 PM, SS451 said:

I personally do not think platforms should be held liable for what their users do but I will not state my reasoning why for the sake of discussion. 

 

I think platforms should be held liable for what their users do when "acting with malicious intent" is the stated goal of the site, such as k*wif*rms and 8ch*n (censored so as not to bring the trolls here), where they know the stuff they post is going to get someone killed. These sites should not exist, and Google should not index them. There are also literally thousands of copyright-infringing sites out there that Google should not be indexing, but it does, and the result is creators seeing counterfeit or stolen versions of their content or products bury their legitimate product.

 

On 12/4/2024 at 7:23 PM, SS451 said:

 

I do not deny that a country has the right to do this. In the same way the United States of America has the sovereign right to ban TikTok for example, but I do not think they should do that and this doesn't make the act above critique. 

 

*dictator hat on*

 

Well, I do declare: if I were the Czar of the Internet, and the Internet were a sovereign country, the first action taken would be disconnecting all IPv4 addresses. Following that, all IPv6 addresses would be considered public property. All domains not ending in .com would henceforth be banned from selling any product or service of any kind, or from running advertisements for any product or service not accessed on the first-party site. All sites ending in .com would be required to have a public, physical address in the real world, and the laws of the country where that address exists would apply to that site; no other country could claim legal sovereignty over the internet property.

 

*hat off*

 

I'm not speaking of how I wish things worked, but rather that until the internet itself declares sovereignty from the real world, you're going to have a complex web of legal jurisdictions claiming their laws apply to everyone who accesses a site, regardless of where the site is.

 

The goal above is basically to silo every site you buy a product or service from to a .com (commercial) domain, with a physical address for the company to be liable at. If someone from Canada wants to buy a product from a US .com, ONLY US laws apply to everything until that product physically crosses the border. The user is ultimately liable for customs/taxes, or if they buy something that is illegal in their country. If the user buys a product that physically or financially harms them, US laws apply to the US company, and the user needs to make a claim in the US courts where the site is headquartered.

 

All other TLDs would not be permitted to operate a service requiring payment to access, or to run advertisements for products or services that take users off the site; only the .coms with a physical address in a legal jurisdiction would. 

 

Point being, this is unlikely to ever happen, so all it takes for a French legal jurisdiction to arrest a foreign national is to wait for them to step onto French soil. No assumption of innocence, even if the company is not French.

 

There are also some fun precedents with this. Canada held Huawei's Meng Wanzhou (a Chinese national) under house arrest at the US's behest, and Canada ultimately paid somewhat of a price for this, since China engages in hostage diplomacy. All France needs to do to "capture" someone on their target list is request that their target's flight be diverted to France when it goes through Europe. It would inconvenience everyone on board, but if your target is exceptionally dangerous or valuable, it's doable. The airline/airport/ATC would pay the price for it. 

 

People who are known targets for kidnapping or arrest because of their connections to crime can't travel, because Interpol will grab them the second they get off a plane or train that crosses an international border.

 


On 12/5/2024 at 4:23 PM, manikyath said:

also note i said platform,

The platforms that host copyrighted content are generally not located in countries that give a fuck, which is why the media companies are now going after ISPs. 



5 minutes ago, Donut417 said:

The platforms that host copyrighted content are generally not located in countries that give a fuck, which is why the media companies are now going after ISPs. 

which is in no way related to my point, nor to what the OP was talking about?


2 minutes ago, manikyath said:

which is in no way related to my point, nor what OP was talking about?

My point is, it's not the platform's job to be a court of law. If a copyright holder or some other person has an issue, they can take it up with a judge and jury. 



16 minutes ago, Donut417 said:

My point is, it's not the platform's job to be a court of law. If a copyright holder or some other person has an issue, they can take it up with a judge and jury. 

and my point is that there is more than "the court of law" and black-or-white thinking. when you provide a platform, be that online or in the real world, you have a duty to keep that platform within social limits.

 

if you provide an open mic stage, it's your duty to make clear to the performers what can or can't happen on said stage. in theory you're not "responsible" if someone breaks the law on your stage, but you have a responsibility to prevent that from happening, so if it does you'll be under fire for it too... and then the court of law will decide if you were complicit or not.

likewise, a digital platform has a duty to keep users in line. yes, copyright holders have a duty to protect their copyright, and are excessively lazy about enforcing it.. but let's say you post the entire collection of star wars movies on this platform: they can't go after you without going to LMG to collect what personal information they have on you, to hopefully find you.

 

in that line, platforms providing their users anonymity have an even bigger duty to keep a clean house, because essentially they have nothing to give the court system when illegal activity is found. and if that combines with a complete disregard for moderating your userbase, that's pretty much equal to being party to said crime, because you're protecting them.

 

recently a colleague of mine who's a bit more "in the know" showed me just how easy encrypted chat applications like telegram (to bring it back to the topic at hand) make it to find "things".. i don't recall exactly which chat app it was, but it was essentially a matter of searching local groups, and there was access to every dealer in town. yes, it's technically the duty of police to clean those up, but it would be a much lower burden on the taxpayer if they didn't have to do an undercover operation every week to clear out the next dealer that showed up on a platform that refuses to work with police.


6 hours ago, manikyath said:

you have a duty to keep that platform within social limits

There are no universal social limits. This is the problem. What is legally protected speech (or action) in one country is a criminal offense in another. Since the internet is global, it's really hard to restrict it to a geographic area (and causes severe functional flaws). 

 

Geographic restrictions would pretty much ruin the internet, in my opinion.

 

There is really no way to create and implement a universal moral code for the internet. 

 

6 hours ago, manikyath said:

platform that refuses to work with police

Section 230 doesn't stop lawful police orders and requests in the US (as I understand it). That's something entirely different.

 

As I understand it, Section 230 was designed to protect people and businesses that make an honest effort to keep content reasonably honest and legal, and to put the legal burden where it belongs. For instance: if someone posted a video with untrue things about me on YouTube, I could technically sue them (in the US), but I probably wouldn't. Section 230 says I can't get greedy and sue YouTube instead, because they have more money. YouTube has content moderation in place, but they do make mistakes.

 

As I understand it, pre-Section 230 in the US you had two choices: you could try to moderate and be responsible for everything, or not moderate at all and be responsible for nothing. By moderating the content in any way, a business took responsibility for the content; by doing nothing, it was legally protected. (Remember that's in the US; it doesn't apply to other countries.)

 

Do people misuse section 230? Of course. 

