Framework for Moderation
It’s been exciting to see many new tooters on social.coop, and though each of our timelines may look a little different, we are looking for ways to ensure that the social.coop corner of the fediverse continues to reflect our values and ethos as a cooperative and collaborative space. The CWG Ops team has been hard at work on our moderation duties through this period of rapid growth, and we have been grappling with some moderation questions.
We would like to clarify the encouraged and discouraged behaviours outlined in our CoC and Reporting Guidelines.
In an effort to encourage and model positive, constructive discourse that runs counter to the norms of other platforms, we’d like to introduce nuance to our moderation, drawing on the guidelines outlined on this page: https://wiki.xxiivv.com/site/discourse.html
In this thread we will discuss different moderation challenges and, based on those discussions, will propose amendments to the CoC for approval by the membership.
Aaron Wolf Fri 13 Jan 2023 6:02PM
I've been working for some years on an updated synthesis of related ideas, combining good resources like the one you linked. It's still too noisy to be worth sharing. It fits what I was pushing for when the guidelines were first drafted.
I think we need to find a third path between censorship and inaction. Mastodon still doesn't offer any moderation tool that would hide a toot without deleting it, does it? My ideal suggestion is for controversial toots to be temporarily hidden so that any further tension is limited while the situation gets resolved. It just feels less heavy-handed to say that a toot is temporarily hidden than to tell people it's been deleted. And the resolution needs to be a restorative process that surfaces the issues and feelings and allows room for edits that achieve win-for-all outcomes.
Is the moderation team conscious of this sort of angle? Or is there a trend toward the moderation team just making top-down judgments and imposing them?
Is there a process or norm for reporters, moderators, and critiqued posters to express their feelings in terms of the constrictions we're feeling? The ideal approach IMO involves taking the minimum action that allows immediate tensions to be resolved and then dealing with further issues in stages. So, in some cases, a reporter just needs to feel heard and respected in order to begin relaxing from their initial reaction. Then, from a more relaxed state, they can discuss what steps to take next. In many cases, I imagine the original poster being notified that someone felt constricted about the post and invited to consider whether an edit could improve things and reduce tension.
I really think we need to keep heavy-handed censorship, as a final conclusive action, limited to more obvious violations. I wish, however, for everyone joining the community to understand that some form of temporary censorship (given what tooling allows) is normal because the community prioritizes avoiding drama and related problems. It's really hard to establish a norm that doesn't exist in the online world in general, but we want people to see temporary hiding, a pause on a topic, as normal: not a rebuke, a warning, or a bad mark, but an expected part of discourse.
We also need everyone to expect that we might see posts that cause us some constriction, and that when we do, there are ways to address it besides getting into public drama: concerns will be heard and supported, but we seek solutions other than blunt censorship except in cases of obvious ill will.
Ana Ulin Fri 13 Jan 2023 8:06PM
I would prefer it if we would avoid using words like "censorship", which evoke oppressive governmental or controlling action, to refer to moderation actions that remove content deemed to be harmful. Having your content removed from social.coop, or even your account suspended, does not come even close to qualifying as "censorship", as people whose content is removed from social.coop are free to continue spreading such content in many other venues, including other Mastodon instances. Further, using the heavy-handed label "censorship" tends to derail moderation discussions with references to frozen peaches and other nonsense that is not germane to the governance of social.coop.
Similarly, I would hope that we can use more precise words than "drama" to describe whatever it is that moderation actions might be trying to prevent (conflict? discomfort? harm?).
I'm not trying to be pedantic here, but just hoping that we can bring some rigor to this tricky discussion. 🤞🏼
Michael Potter Fri 13 Jan 2023 8:50PM
There's some food for thought here, but I don't see actual policy in it, besides "settle things yourself." Certain specific rules are included in our CoC, which I do agree with, but if we're an anti-fascist organization, I think it makes sense for us to recognize that fascism is spreading globally through propaganda and disinformation. If we're going to expand the CoC, maybe we can include something to address denials of scientific consensus and certain debunked theories that have led to violence in our recent past. As it stands, I feel like our only options are to argue endlessly with someone who's in bad faith, or mute them and allow the poison to spread through our systems.
Matthew Slater Fri 13 Jan 2023 9:04PM
@Michael Potter I suspect you and I would disagree about what exactly 'fascist' means and also about what constitutes a debunking or fact-checking in good faith. So it wouldn't be enough to define 'fascism' in more detail, because then every ideology would need defining and our definitions would be arbitrary. Similarly with debunking: we've seen with vaccines and Hunter's laptop how politicised issues are distorted and how the truth changes over time.
I would venture that the only thing that needs moderating is hate speech. I think it would be easy to reach a good consensus about what is hateful and what is not.
Aaron Wolf Fri 13 Jan 2023 9:27PM
Thank you Ana! I appreciate and share your interest in using words carefully. I don't think it's just pedantry. Wording matters.
I do mean some implication of "controlling action" and oppression by "censorship" but I certainly don't want to align with or seem associated with the free-speech-absolutist radicals out there. What I'm aiming to describe is the difference between someone choosing to remove or edit their toot because the problems with it have been brought to their attention versus moderators imposing the removal. I'm not suggesting there's no room for the latter. I'm concerned there's not adequate process to facilitate a norm of the former. What wording would you suggest for describing this point?
Also, "drama" is intentionally vague because I don't have a good word to describe all the various noisy situations where people are having constrictive feelings. I don't think "harm" is right. Harm is something that might or might not go with drama, and it depends on how things are or aren't resolved. "Conflict" might work. I'll think about that term; it might just be better. Happy to get other suggestions.
I think this sort of care with words is essential. How else can we bring big, complex ideas into the CoC and policies that people can effectively use?
Ana Ulin Fri 13 Jan 2023 9:27PM
Can you expand on why you think agreeing on what is "hate speech" would be easier than agreeing on what is "fascism"?
Ana Ulin Fri 13 Jan 2023 9:32PM
Do we have any stated "goals" or purpose for our moderation? It seems that our intentions behind having moderation are crucial to guiding what this moderation should look like.
Aaron Wolf Fri 13 Jan 2023 9:43PM
I'm hesitant to use the label "anti-fascist" here, mostly because I know there's a trend of us-vs-them reactivity among activists that tries to draw hard lines where things are fuzzy, and trigger terms don't seem, IMO, to facilitate coalition-building. That said, I do generally share your story that propaganda and disinformation are spreading extremely dangerous ideologies that cooperative movements should work to counter. I hesitate to take labels, but at the risk of adding confusion: people who know me have said I'm some sort of socialist, anarcho-syndicalist, anti-capitalist… my favorite author is David Graeber. I mean, just understand that if you ever interpret me as defending some right-wing idea you find threatening, it's not because I'm coming from that side of our divided world.
I find myself feeling a lot of constriction around suggestions that social.coop block posts that deny scientific consensus. That's not because I disagree with the consensus about vaccinations or climate etc., but because I feel wary of enforcing groupthink. The issue brings to mind the case of socialist activist accounts on Facebook being blocked by the same mainstream-is-correct moderation and algorithm approaches that were focused on blocking right-wing conspiracy theorists.
If we experience a pattern of particular conspiracies and denials showing up that are too rampant to easily address in one-off conversations, I would support adding certain things to the CoC as needed. I don't support the idea of simply blocking ideas we don't like when they only show up rarely here and there.
The policy I'm suggesting is not a mere do-nothing "settle things yourself". I want to see a specific culture of consciousness-raising and engagement that is facilitated with good systems. If we were to make a volunteer group of facilitators open to working with people in cases of conflict, I would join it. And I support figuring out ways to actively prompt people to remove or edit contentious toots, and specifically to onboard newcomers to understand that this is how we do things and that they can expect to receive such messages. I want to see such messages sent readily rather than reserved only for when something really goes over a line. Being asked to remove or edit a toot should be a norm. I want to get constructive feedback and improvement from others continually in that form, and I want to be able to make conscious decisions and build practice improving my communication and understanding of others' perspectives.
Matthew Slater Fri 13 Jan 2023 11:08PM
I don't want to write an essay on fascism, but the term has come a long way from Mussolini's 'merger of corporations and the state', through Nazism, where it is remembered for being antisemitic. It has a more philosophical connotation as the polar opposite of individualism, and often serves as a harsh synonym for any authoritarianism. There's a strong undercurrent of German fascism in the USA from the '30s, through Prescott Bush et al. financing Hitler, through Operation Paperclip, through the attempted coup blown open by Smedley Butler. Those fascists were never defeated. Currently 'fascist' is used as a synonym for the 'far right', another ill-defined term. It is also regularly applied to the state of Israel, the USA's cannon fodder the Azov Battalion, who borrow their iconography from Hitler, as well as a rising tide of populist politics, especially in Europe. I hope that answers your first question.
To consider hate speech, let's look at a few examples. Which of the following would you consider hateful?
1. Biden is (was) a sexual predator
2. Too bad Epstein wasn't hung by his balls
3. Trump is intellectually lazy, dyslexic, and narcissistic, and unfit to hold office
4. The government is controlled by pedophiles
5. I carry a gun in case I meet a politician
6. Punch a Nazi
I would rate 2, 5 & 6 as hateful. Maybe you can think of more borderline statements?
Matthew Slater Fri 13 Jan 2023 11:31AM
I'd like to think that by the end of this thread it would be clear what went wrong when Jem Bendell's posts citing mainstream media and/or science about vaccine efficacy were deleted.
- The moderation was biased towards the complainant, the offended, or the pro-vaccine POV.
- Anti-woke narratives in general are not welcome on social.coop.
I'm open to other explanations.
And any new rules should make it clear what would happen to similar posts in the future.