Before I start, let me say that more than anything else .. anyone who holds an opinion on this (on EITHER SIDE) has no business having one unless they've informed themselves of sufficient facts.
I've replied to a lot of posts below based on some of the quotes I've seen. I'm taking the quotes to be true, but to be clear, I have not personally seen the entirety of the content on Gab. My comments and replies below (particularly further down, as I got more information) are based on both of the statements below being true. If anyone has proof that either one of these statements is not based on fact, please do let me know, as it is an absolutely crucial element in the debate over what should happen to the site:
1- "GoDaddy investigated and discovered numerous instances of content on the site that both promotes and encourages violence against people." - GoDaddy
*AND*
2- "We want everyone to feel safe on Gab, but we're not going to police what is hate speech and what isn't." - Gab.com
OK .. now buckle your seatbelts .. going through this thread in order .. haven't actually gotten to the end ... yet ...
In another thread about freedom of speech someone recommended this book, and I actually bought it and I recommend it too. Freedom of speech is challenging.
That was me!
.. While it is indeed a short book, it is very dense and a good foundational primer into the subject. I still recommend it to all who aren't familiar with the basic concepts of free speech and where the line is drawn both legally and ethically. The book in question is:
https://global.oup.com/academic/pro...ort-introduction-9780199232352?cc=ca&lang=en&
... The issue is one of precedent. If they can take down Gab because one nut publishes nutty content, then we have a problem.
No, we don't have a problem .. lol .. or admittedly maybe we do? It depends 100% on the specifics of the "nutty content" and the particular wording used. If the hate speech is indeed illegal (a call to action for violence, for example), and the owners/operators of the site were aware of it (and in some cases even if they weren't, as with deliberate ignorance/negligence), then the 1st amendment and/or whatever "freedom of speech" laws/principles are in effect in most jurisdictions do not in any way protect them, nor do they necessarily free them of responsibility for the dissemination of hate speech.
Again .. it depends 100% on the actual words and wording used. More often than not, people overreact one way or the other and unfortunately aren't focused enough on the specific words, which are crucial in deciding either way.
... no unlawful material is allowed, but virtually any speech is.
... the site has 800,000 users and has experienced modest growth recently so it really isnt all bad hate speech. regardless, those disgusting messages on the site by some users are also lawful no matter how distasteful they are.
By your very words you are implying that there is at least some bad hate speech. In the context of your post I'm taking "bad" to mean illegal. I won't pass final, absolute judgement on the site either way without seeing the actual words and wording of any potentially offending texts. But even if only 1% of it is legally hate speech, and they then don't actively take it down and cooperate with the authorities .. then yes .. what they would be doing at that point is both immoral and, much more significantly, illegal!
I understand that Godaddy is a private business and its clauses may allow it to do this, but this seems extreme overreaction. "24 hours to transfer or else" is a very menacing way of doing business.
I've never hidden the fact that GoDaddy can often be a frustrating company to deal with. Their platform is buggy and I've both lost domains and paid extra for domains because of unclear wording, incorrect information, bugs, etc etc ...
HOWEVER .. at the same time I believe most of that is due to the fact that they are a huge company, and more often than not it's a case of the left hand not having a clue what the right hand is doing (a symptom found in most large companies). What I can give them credit for is that, at the end of the day, they usually at least want to do the right thing. Beyond that, I also know they usually involve their legal department in cases like this, where their lawyers very diligently go over the facts, and it would very much surprise me if they made a decision without looking at the particular comments and wording in question. Maybe @Paul Nicks and @Joe Styler can get us more specific clarification on this?
I should have been more clear. It's ironic that GAB broke Godaddy's terms, but not Twitter's.
Not surprising at all, really, if you think about it. The content Gab puts out on Twitter in 140-character strings is likely completely different from the actual content found on their website. They are likely very careful with what they post on their Twitter account, specifically because it isn't their platform and it's valuable to them as a tool to generate users and members.
Correct me if wrong but its only criminal if they make no effort to remove "violence instigation" type of material. From what I've seen, the rules on gab are clear- no violence or illegal material allowed. If gabs made aware they delete it
I'm pretty sure that actually is wrong. Illegal hate speech is illegal hate speech, regardless of how long it is up (5 seconds or 5 years). However, I think the accepted standard for platforms is that they are given a free pass if they are indeed prompt and effective at removing such speech. At the end of the day, though, that is very likely 100% at the discretion of the authorities, who likely base their judgement on what they think a judge or jury would consider "reasonable time/effort". So the law is likely very black and white, but enforcement and judgement are often more subjective, and won't just put everyone in jail and throw away the key unless there is actual intent and/or deliberate negligence.
Also, based on a quote from them (that I posted at the top), they actually don't usually police or delete.
gab ceo also said during an interview he forwarded all of the shooters gab profile data to the fbi before the fbi even asked which led me to start thinking- is that proper procedure? I was under the impression warrants are required then the website can hand the data. Or maybe its at their discretion.
This is how it works in most free countries... When they get a warrant they MUST comply (or at least fight it in court). Aside from that, it is 100% at their discretion. In fact, an argument can be made that if anyone has information on illegal activities, it is their civic obligation to report it. Depending on your jurisdiction and the specific circumstances, you can definitely be found guilty of failing to report information on your own if it could be helpful to authorities in a serious crime.
In America, Nobody has a right to NOT be offended, but private monopolies can rally their own political views however they want. They don’t need to follow the first amendment because it’s their territory. It isn’t really public. This is why free speech is under attack due to platforms and large corporations. If the web returned back to independent websites, and open minded small webhosts then it would be much easier. This social media world is polluted with all ranges of opinion, and the wrong place for free speech.
Yeah .. ironically, people getting offended is a natural part of debate .. which in turn is a crucial component of democratic freedom. The moment people stop having the right to offend is the moment you've lost your democracy. (Although obviously, as noted throughout this thread, there are limits.)
Anyhow .. I'd tend to agree with the overall feeling here, although it has more to do with political clout/influence/lobbying than the simple fact that it's their "territory". But the one huge pivot point in all this is the basic question of whether a "bridge platform" is responsible for the content on its platform, or whether it is simply a tool, with only the people using the tool legally responsible.
1- What I mean by this is clearest in the case of Uber. If someone ends up being attacked by one of their drivers, does Uber bear responsibility or not? At the end of the day it's an interesting question on each of the ethical, technical and legal levels, even more so because the answer on each level might not be the same.
2- Then a little less clear to some (although theoretically the same to some degree) is whether social media platforms are responsible for the content shared on their networks. They constantly try to argue the answer is no. But obviously that's because they don't want the responsibility, both because of the liability and, more importantly, because REAL policing can be expensive.
3- A similar question can be asked of ISPs and hosting companies regarding what is shared on their servers. Are they responsible or not?
While each of those 3 cases might appear different, theoretically they are very similar. The world wants centralised social systems because they are vastly more efficient. But at least part of that efficiency is specifically because they (for the most part) ignore what could be seen as their social responsibility (in this case to ensure that the limits of free speech are respected and enforced, but there are other important aspects as well that fall into privacy, security and other rights).
Allowing big media companies to skirt legal responsibility certainly has both very strong positives and very strong negatives. I for one think the big companies have been given too much leeway and tolerance for what is ultimately a blind eye turned to responsible supervision of the actual content on their platforms. They have grown stronger and faster because they haven't had to be truly responsible for the content shared by their members.
I personally think the world would be a better place if they were ultimately held responsible .. but I also see how that could have negatives as well. It's a very interesting dilemma.
Somehow starting riots and Yelling to kill police, Black Lives Matter a domestic terror group fits in the terms of service, but they should be banned too if Godaddy was morally correct, but of course that would be all over the news, be called racist, etc. Politically correct world.
On this point I don't really agree with you, although being in Canada, I might not be aware of some of the specifics surrounding BLM. In general, like all groups, there are likely a handful of individual members of BLM who have been violent OUTSIDE the scope of BLM, but as far as I know, BLM does not in any way support violence or hate crimes of any kind (I'm also of the understanding that the vast majority of their members are specifically against such things as well). Nor does their actual website have a public forum or social aspect where such notions could be shared. So unless you're seeing actual calls to action for violence or racism on their own web pages, it's a non-issue right from the start. If they have used their platform or organisation to specifically call for violence, then please post a quote or link; if that were actually the case, I'd agree with you.
To be continued ...