CoC DOs are enforceable too?
I like that there are DOs and DON'Ts in the CoC. But I think there's not been enough consideration of enforcement.
If we want something to be just general guidance that isn't enforceable, it should either be separated out or at least use language like the "strive to" phrasing that is present in some areas.
A good CoC should not be vague about which items are hard enough rules that violations could be reported.
The reporting guidelines themselves could clarify how to handle the "strive to" items (how to handle an image that lacks alt text, for example) versus those violations that could result in censorship or other actions.
emi do Wed 8 Aug 2018 12:55PM
@wolftune I am a little confused and uncertain about 'enforcement'. Who would do this 'enforcement' and do you think the CWG or members need (or would want) a list of explicit enforceable items in order to confront or discuss issues that are creating discomfort?
I think some of these are discussed in the conflict resolution guidelines (the soft steps prior to reporting), which allow individuals, on a case-by-case basis, to determine what their needs/wants are in order for a conflicting situation to be resolved.
To go back to your example: let's say I wasn't assuming good faith and interpreted a comment that someone made as being derogatory toward women. In our current model with conflict resolution: the person who made the original comment would (if they felt comfortable doing so) message me privately to start a conversation, ask questions, understand where I was coming from, and perhaps make a plan about how we both can avoid incendiary comments in the future. If they didn't feel comfortable messaging me directly, they could solicit support from a mutually trusted friend or a 'social.coop steward' (to be determined by the CWG) to mediate a conversation to help resolve the issue. All of this before having to resort to a report, which would lead to formal action. If we were instead using an enforcement model, I feel it would encourage members to hide behind a CoC rather than really listening to where the other was coming from and seeking to understand the situation.
I am aware that there are gaps in this type of approach, but am also trying to be pragmatic about where volunteers will want to allocate their time (social.coop stewards vs. social.coop enforcers) and how we can facilitate better communication and foster/empower positive interactions.
Aaron Wolf Wed 8 Aug 2018 11:24PM
I like all this. What I see missing is that sometimes the best thing for such a situation is to get the problem post hidden ASAP while the resolution is being worked out.
So, in that case, enforcement means that there is a CoC item to point to when requesting that a post be hidden at least temporarily. Since there's no technical way with our tools (I think) to give the reporter the ability to hide another member's posts (which, if it existed, would be a privilege for those with a certain level of trust), they probably need to request that a moderator do the hiding.
But getting a post hidden can feel really bad if taken the wrong way. So, we want a situation where it's as clear as possible that there's no personal attack and no further penalties: it's just a temporary measure while the concern is resolved.
Point is: a derogatory-against-women post shouldn't sit public and get public replies etc. while a private discussion is happening to address the issue.
The whole goal needs to be to block the tensions that arise from such a post before they cause further issues in the community. Most times, it will be a minor slip, misunderstanding, or carelessness. So, it should be treated as casually as possible, aside from the post getting hidden.
Right now, most everything online is either left public, causing all sorts of problems, or treated as a serious violation meriting rebuke, finger-wagging, and punishment. There's no middle ground where it's like "this post is likely to cause problems, I already feel tense about it, we need to address it, maybe with an edited repost that is better, so we're just hiding it for now; as soon as we can resolve this we can continue as though nothing went wrong. There's no personal offense here per se…"
But even that situation needs to happen only if there's something in the CoC to point to about why the hiding should happen. Otherwise, we need to leave the hiding to be purely voluntary on the poster's part if they get a private request. So, that's the distinction for enforcement: is this issue something where a mod might hide a post before the poster gets a chance to voluntarily hide it (or decide they don't want to)?
emi do Thu 9 Aug 2018 12:59AM
Again, more food for thought.
At the moment, given the rate at which toots/re-toots take place and the nature of Mastodon's design, where you can't quote toots (see anti-abuse rules), I don't feel that this type of explosive spread of a harmful thought will take place.
Going back to your example of a derogatory post about women: I think these happen on a gradient. Some feel like they threaten safety ("I hate women and I am going to kill them all"), while others are less obvious ("women aren't as smart as men, which is why we shouldn't be expecting equal representation in workplaces"). I think the former needs to be taken down immediately, which, according to our flow chart, would probably result in a report and action within 24 hours. The latter is one that I hope the community would address in direct dialogue, and I feel that if we were to step in and hide it without engaging the member, it would actually escalate the situation.
How about adding "If the conflict is in regard to a specific toot or written comment in a social.coop space, consider asking for it to be taken down temporarily while you work on resolving the conflict" under the "invite conversation" section of the Conflict Resolution Guide?
Personally, I am a little uncomfortable with the idea of having to create a task force that goes around hiding toots/comments on behalf of someone else. I fully support the need to support those who are feeling triggered, and I would hope we can come up with strategies to empower those members (such as muting/blocking etc.) and provide avenues where that member can be heard (social.coop stewards) and have their feelings validated. I think sometimes, in an effort to help, we want to 'rescue' someone by putting our superhero costume on and seeking justice and executing punishment, when there might be other ways we can facilitate the resilience of the community.
Aaron Wolf Thu 9 Aug 2018 2:06AM
if we were to step in and hide it without engaging the member
Misunderstanding. I'm never saying that toots should be hidden without notice. Every time a toot is hidden, it should be in conjunction with saying "this violated the CoC, please just post again in a manner that is aligned with the CoC". It needs to be an invitation to simply repost and continue.
In the case of "women are less smart", we probably don't want a public back-and-forth debating that. In the case of "you're less smart because you're a woman" we definitely don't want that to be discussed in a public back-and-forth debate.
How about adding "If the conflict is in regard to a specific toot or written comment in a social.coop space, consider asking for it to be taken down temporarily while you work on resolving the conflict" under the "invite conversation" section of the Conflict Resolution Guide?
That sounds pretty good!
As to the worry of excessive hiding, that's where (A) we need some clear guidance (but not a strict program; we still want human judgment) about when something deserves to be hidden, and it should usually start with a private request to the poster to choose to delete and repost (editing would be far better, but that's not available yet in Mastodon); and (B) clarity from the CoC is part of that: the CoC should not say that it's a violation to trigger someone; it should say that only a specific list of things are violations, so anything that's not a violation will not be hidden by any task force…
emi do Thu 9 Aug 2018 2:37PM
I've been looking at our CoC and trying to find ways to address some of the things that you have brought up, and I just can't figure out an appropriate place for them.
I then come back to what the objective of the CoC is. I think you're raising really great points; I know I personally don't really feel like engaging with/educating/debating misogynistic comments, and at the same time, I am still concerned about our capacity and what we are actually able to implement.
I think encouraging the person who notices a post that might need amending to directly ask the poster to edit or take down the post is probably the most realistic. I find that templates or examples are sometimes nice and a way to lower the barrier to taking that step. Perhaps we can put a template into the conflict resolution guidelines?
Something along the lines of: "I noticed your post on social.coop that mentioned __________. I took it to mean ______, which makes me feel ____________. Would it be possible for you to edit your post / take down your post / put a CW on your post to be more reflective of the social.coop CoC?"
Aaron Wolf Thu 9 Aug 2018 7:15PM
I feel you're thinking about this well.
I think the template should reference the CoC, as in: "I noticed your [post] on social.coop and it seems to me to be problematic per [specified item(s) from CoC]. Would you please [delete and repost an improved version]?"
An optional final part of the template could say "I think if you [suggested improvement], that would address the problem."
If editing is a possibility (as it is on Loomio), then the request can be for an edit. On Mastodon today, it seems to be a delete-and-repost out of necessity.
In my view, it's a responsibility of all members to help maintain our standards. So, it's not okay to leave it to an offended woman to object to a sexist post. The clearer it is on signing up that we have certain standards and that we all work to maintain them, the less personal any particular situation is likely to feel.
The final point is that this process should suggest that people can contact the poster directly or request that someone else do it on their behalf. And I don't think even the latter case needs to be a big, drawn-out reporting/enforcement process.
Ideally, it would be the case (as it is in Discourse, which we don't use) that an edited or reposted item fixing a problem notifies the reporter so they can check that the fix is okay.
emi do Fri 10 Aug 2018 2:50AM
Ok, so I think we're on the same page. Just some differences in wording. Here is my suggested edit to the template:
"I noticed your [post]. I took it to mean ______ which I feel is not reflective of [specific portion of CoC]. Would you please (to edit your post/ take down your post/ put a CW on your post so that it is more reflective of the CoC of social.coop?)"
Currently our conflict resolution and reporting guidelines don't specify that it has to be someone who identifies as the gender/race/sexuality etc. addressed in the offending post. In fact, I would argue that our CoC specifies, in these two points, that people SHOULD be speaking out if they see a problematic post regardless of whether they identify as x, y, z...:
* Help maintain a culture of inclusivity and open participation
* Show care, consideration, and respect for others
Perhaps your concern can be addressed in the annotated version as an example of how a member can maintain a culture of inclusivity?
Aaron Wolf Fri 10 Aug 2018 3:54AM
I like your updated template wording overall.
Minor detail: I don't like the word "mean" because we don't want to suggest what the poster intended. We want to emphasize the problem with the post, whether it was intended or not. Maybe something more open-ended like "I noticed your [post]. I feel it may not adequately reflect [specific portion(s) of CoC] because ___"?
Note that "portion(s)" allows for plurality: posts can violate multiple parts of the CoC.
Other wordings instead of "took it to mean" could be "it could be interpreted as" or something similar.
Example: I assume good faith and give the benefit of the doubt that someone did not mean to be sexist; taking it that way would be assuming bad faith. But I still want to call it out because it is easy to interpret as sexist. So, I call out the problem even though the poster didn't intend it.
Anyway, there's some misunderstanding about identity. I wasn't suggesting, and don't want, that the identity of any party have any relevance. Women don't get any extra pass on spreading misogyny, for example, and anyone who recognizes a problem should do something regardless of their personal identity.
There can be a wide range of reasons people don't speak up or take action. I wasn't assuming how they identify would be part of that necessarily.
The core issue is that I'm not thinking about the worst-case scenarios. I don't want a CoC and process that only deals with horrible bad actors. I want to see troublesome minor slips addressed before they lead to division and tension. I think a good portion of online conflicts come from poorly dealing with misunderstandings or carelessness, and we should address that as well as addressing serious bigotry etc.
Manuela Bosch Fri 10 Aug 2018 6:20AM
Thank you all. I have to admit, I am overwhelmed by the amount of information that is up in the air. I have no idea how to solve this other than to ask now that the parts that are important to the person who brought them up be included in the respective document. I suggest we do so while considering that every change we make should help make the text/wording more:
+ concise/focused
+ inclusive
+ inspirational
+ memorable
Let’s try not to discuss anything further. Just change the things, or leave them for now as a next project / for the living document we thought of. Does that make sense? I am speaking out of my role of helping get this done. :-)
Aaron Wolf Fri 10 Aug 2018 5:09PM
@emido could you propose your now-preferred wording of a template for self-help call-out of problem posts? I'm not sure where it goes. Maybe in the reporting docs?
As for the rest of the stuff here, I'm willing to accept that there's no conclusion yet, and we can solve those issues over time.
emi do Fri 10 Aug 2018 9:37PM
I have included the template here: https://pad.disroot.org/p/Social.Coop_Conflict_Resolution_Guidelines_v2
Aaron Wolf Fri 10 Aug 2018 10:00PM
Thanks, but there's too much assumed in that doc still.
There are certainly times when no previous conflict exists. I don't know that seeing one bad toot that I only read and am not involved in could even be called a "conflict". I just see the toot, send the message to fix it, and it's fixed…
So all the "while you work on resolving the conflict" stuff is overbearing in such a case.
Simon Grant Fri 10 Aug 2018 11:01PM
It's so easy to write with a particular scenario in mind, isn't it? Come to think of it, that's pretty much the situation we're trying to address, when one person has one view of a conversation, and visualises an incorrect scenario about what's going on in another person's head.
Perhaps we are called to be poets here. To try to express an essence which fits different scenarios, without laboriously spelling out all the different steps, all the different possibilities, all the different assumptions.
There's something to be said for minimalism here, isn't there? Actually, in drafting policies (for our cohousing community) I have experienced in practice the virtue of taking out any material that does not positively need to be there.
I'm guessing that relates to what you are saying, @manuelabosch? Also, we need not take out matters that seem really vital to e.g. @wolftune. It's a worthy challenge. And, I think, a measure of the cultural maturity of a group of people, if they can get their collective act together in this way. Let's keep up the good work!
Simon Grant Fri 10 Aug 2018 8:25PM
Thank you both @emido and @wolftune for continuing a most thoughtful discussion on these points. I can't think of any points to raise that you haven't already addressed! I do, though, have a short reflection on where we are. I get the sense that we are talking mostly about something to do with a living online culture. Sure, in extremis we need some "hard" rules to protect people from substantial immediate harm. But we will only be communicating well together if these hard rules don't need to be invoked.
When I think more about this, what I am imagining is more like some trusted person ("elder", if you like) just taking the person aside, so to speak -- that would happen in a face-to-face context, but some kind of private messaging should suffice -- and asking: what is going on? If it's a new person, have they simply not internalised the culture? If it's a well-established member of the community, is it a temporary lapse due to some bad thing happening to them that is not visible to the rest of the online community? To me it would seem relatively straightforward to deal with either of these cases, if we provide a mechanism and a way of choosing a trusted person to engage the wayward one. (We want it neither to slip past everyone, nor for everyone to pile in because they can't see that someone else has stepped in already.)
But the scenario that taxes my imagination more is where we could have someone who persistently contravenes the CoC. What runs well in my imagination is a conversation between the forum "elders" (one interpretation of "jury", I suppose) about what might be going on for / in that person, and deciding amongst themselves whether they think they collectively have the resource to address the issue constructively; or if not, then whether a "hard" rule should be invoked. I'm not suggesting that we act as a mental health agency for the severely disturbed! But for lesser cases, a bit of skilled listening might really help the wayward person themselves recognise the issue and adopt a less disruptive manner.
I don't think I'm saying anything different from what you have written above, just reflecting back my own understanding of the situation.
So, I wonder if we can have the CoC suitably worded to let people know that extreme disruption will be dealt with (in extremis), but that softer, restorative processes (which will take more time and care to work out) will indeed be carried out, within the capacity and willingness of the team of -- whatever we call them -- moderators, jury, panel (that discussion is elsewhere, isn't it ;) )
Aaron Wolf Fri 10 Aug 2018 8:54PM
Although I agree with all the views you're stating, I have a quite different scenario in mind.
Someone who persistently contravenes the CoC is one thing (and seems to be what most people focus on, understandably).
What I'm getting at is more like the value of good facilitation with as much self-help as possible.
In the in-person analogy, people may be talking in good faith about a contentious subject and simply may in-the-moment say something that threatens to be toxic (it's condescending, dismissive, insulting, bigoted and/or similar…). It's subtle, not really aggressive. But it threatens tension and derailment, bad for the community. Again, the people are not persistent bad actors, just normal people discussing a subject with disagreement.
Sometimes, others can simply help facilitate. Ask "what do you mean?" or "I think she was meaning something more like X, is that right?" etc. But sometimes a good facilitator would cut off a statement and say, "hold on, that's not a productive, respectful way to communicate! Please try again". Then, the person could pause a moment and say, "okay, sorry — I meant…" and say their point better. And this goes way better for everyone than if there's an extended debate about the badly-formed statement.
So, taking someone aside and asking "what's going on?" sounds like an intrusive (but well-meaning) inquiry into the person, as though their problem statement is a reflection of deeper personal context. I think that sort of about-the-person approach can sometimes be quite wrong. If a person expresses some emotion, that they are angry etc., maybe taking them aside is appropriate. But in other cases, "hey! that sort of phrasing is NOT appropriate. Try again" is all that's needed. Making a public spectacle of that, though, can be a problem.
In summary:
I want everyone to agree to some explicit standards to which we hold one another. I want a clear, reliable norm for how we handle that. We then all expect to be held to that standard by everyone else, knowing that insisting on respectful appropriate communication does not involve personal judgments or assumptions of bad faith.
Is that clearer?
Simon Grant Fri 10 Aug 2018 9:30PM
yes ... I hear you as talking (in my terms) about the communications culture. And I'm explicitly agreeing that the best way of handling deviations from cultural norms is informally, and with forbearance (or even humour!) On top of this, if anything is needed for these minor points, maybe some fallback to ensure that someone is mentioning it. Properly, however, that shouldn't be needed, as the spontaneous processes in a positive culture work well. As you've described. I'm alive, though, to the possibility that a well-meaning "hold on..." needs a reasonable degree of sensitivity to deliver well, and sometimes the person who delivers it can cause more hurt than needed in the other. It's a delicate point. Again, it's the culture -- not only the culture of following the CoC, but the culture of correction.
The point I'm trying to make is that we would benefit from some people taking on the role of watchers and guardians of the culture as a whole. Process facilitators, if you like. Raising awareness of what good practice is, in its different forms. And, of course, modelling it.
Your summary is clear enough -- that you would like clear and reliable norms, not just for the standards of behaviour themselves, but also for how we hold each other to them. I feel a little uneasy when I think about that, and I'm not exactly sure why. It may be that it is really hard to spell out all that stuff explicitly. How best to call out someone for failure to abide by an explicit standard is not something, in my view, that is standard and rule-governed -- rather it depends on empathy with the individual. I can't help feeling a little awkwardness when rules such as these are spelled out too mechanically. (That's one of my reservations about NVC, by the way. Good learning tool, but...)
It would be great if we could, as part of our collective culture, value gracefulness in our interactions. I am painfully aware of how much I have lacked grace on occasions. So I'm wondering how the insisting can be done gracefully as well as sensitively? While (to repeat myself to be clear) recognising that, rarely, ungraceful restraint is unfortunately needed.
Aaron Wolf Fri 10 Aug 2018 9:45PM
I'm not wanting programmatic rules that remove practical wisdom and turn people into computers. I agree 100% with feeling concerned about that direction.
What I'm saying is that a few things should be explicit:
- Here's the list of things we identify as problematic to the extent that any violating post should be removed (not just subject to a graceful request that could be turned down)
- The first step for any problem post is merely getting it removed/fixed; we hope to almost never have to go any further with other actions/penalties
- Because our tools don't include a clear way to (A) edit posts and (B) request edits / point out problem-posts, here's a tooling guide so people understand the options
- If you (the person who sees a problem post) don't want to be the messenger, here's how to contact someone else who can handle it
We need grace both in posting initially and in calling out problems. We need a guide that isn't too rigid and that helps people succeed.
One specific additional concern that I think should be spelled out:
We want to avoid back-and-forth discussion of a post on the thread of the post itself. We don't want an ungraceful post followed by a dozen discussions of it to derail an actual topic. That's why problem posts should be removed ASAP, with separate discussion of the post taking place elsewhere (privately or publicly, per good human judgment). That way the original discussion can stay on topic and remain valuable and clean for readers…
Final point for now: some people who feel marginalized may be less likely to speak up in absence of a clear policy that explicitly states our dedication to maintaining these norms.
Aaron Wolf Sat 11 Aug 2018 5:53AM
This is getting tangential to the initial topic here (which is about the CoC terms themselves), but I made a draft of how I see a better self-help approach to addressing violations, to go along with the reporting guide:
https://pad.disroot.org/p/combined-reporting-docs
I think the ideas there capture the points well enough and bring up issues for the CoC too. I think this is a much better foundation than I've seen in the other docs so far.
Aaron Wolf Wed 8 Aug 2018 11:14PM
I'd like to see a separate (or extension) flow-chart for helping people decide the best resolution strategies when they are staying in the self-help part of the process.
I also don't see in the reporting guidelines any outcomes that are more qualified. I want this to be common: