Loomio
Sun 8 Sep 2024 12:02PM

What moderation policies should Social.Coop adopt toward Fediverse instances run by for-profit corporations?

Dynamic · Public · Seen by 182

In brief

Social.Coop has adopted a policy that Threads.net content should be moderated at the "Limit" level, which means that content posted by accounts on Threads.net is hidden from Social.Coop users unless they follow the account in question. All of the content is still there, and it can still be reached via search, mentions, and following, but it is not visible publicly.
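
For context on the mechanics: as I understand it, a "Limit" in the Mastodon moderation interface corresponds to a domain block with severity "silence" in the admin API. The sketch below shows roughly how such a block might be applied programmatically; the instance URL, access token, and comment text are placeholders, and it assumes a token with admin-level write access to domain blocks.

```python
# Minimal sketch: apply a "Limit" (severity "silence") domain block via the
# Mastodon admin API. The instance URL, token, and comment are placeholders;
# an admin-scoped access token (admin:write:domain_blocks) is assumed.
import requests

INSTANCE = "https://social.coop"          # placeholder; use your own instance
ADMIN_TOKEN = "REPLACE_WITH_ADMIN_TOKEN"  # placeholder

def limit_domain(domain: str, public_comment: str) -> dict:
    """Create a domain block at the "Limit" (silence) severity level."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        data={
            "domain": domain,
            "severity": "silence",  # the web UI labels this severity "Limit"
            "public_comment": public_comment,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    block = limit_domain("threads.net", "Limited per Social.Coop policy")
    print(block)
```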

We do not, however, have any policy on what our relationship should be with instances run by for-profit corporations more generally. Having an explicit policy on how for-profit instances should be handled by the Community Working Group (CWG) would streamline future decision-making and clarify, for users of differing opinions, whether Social.Coop is the right choice for their long-term home on Mastodon.

It would therefore be valuable to discuss where our membership comes down on this question, and to work toward a written policy.

Longer context

Prior to Facebook/Meta's creation of Threads.net in 2023, I believe that federated social media consisted almost exclusively of open software platforms run in a distributed fashion, and generally not for profit. Individuals would sometimes release cross-posting tools/robots that used the API of centralized corporate platforms to relay content from those platforms onto "Fediverse" platforms, but this was not core functionality.

As such, up until 2023, the question of how Mastodon and other federated platforms would relate to corporate platforms was largely moot. Different instances did have different policies on bridges and cross-posting robots, but my sense is that it was generally understood that Mastodon and corporate social media were separate things. To a large extent, I believe this is still the case: Threads.net has not, in fact, implemented full federation using ActivityPub. However, Meta's 2022/2023 announcement of plans to adopt ActivityPub prompted immediate and strong responses from the Fediverse community, with some seeing it as an opportunity to connect with the much larger number of people who use corporate platforms, while others expressed alarm about a long history of problematic behavior by Facebook/Meta.

Brian Vaughan introduced the question of whether Social.Coop should join Vanta Black's Anti-Meta Fedi Pact in June 2023 (https://www.loomio.com/d/AZcJK6y2/discussion-support-the-anti-meta-fedi-pact). After extensive discussion, we decided not to join the pact, but instead to impose a minimum moderation level of Limit on Threads.net. One of the many themes running through that discussion was the question of how we see the role of for-profit platforms in the Fediverse.

There was also a related discussion of whether we could formalize a policy that would capture the objections to Threads.net in a generalizable form, including laying out explicit norms for respect and policies about privacy and user control of their own data. Several of our members began writing this up in the collaborative document that Eduardo Mercovich links to here: https://www.loomio.com/d/W6tL5cvp/the-bluesky-bridge/34

Most recently, Sam Whited and Flancian have raised questions about how to moderate bridge instances that permit interoperability between Social.Coop and Bluesky (a very new platform that so far has much better moderation policies and a better social track record than the social network giants Facebook/Meta and Twitter/X, but which is also run by a for-profit public benefit corporation). The discussion about the bridges with Bluesky is here: https://www.loomio.com/d/W6tL5cvp/the-bluesky-bridge

Inevitably there will be future questions about moderation of corporate instances, as well as bridges with such instances, and I think it would be valuable to not need to discuss each case individually.


Dynamic Sun 8 Sep 2024 4:30PM

@Zane Selvans

Do you have thoughts on what would be a good discriminator?

Also, see my reply to Justin du Coeur above.


Nat Sun 8 Sep 2024 5:23PM

I think there's a pretty strong argument to be made that, even if we assume the best of intentions, "for-profit" instances are inherently less prepared to make principled moderation decisions (unless the principle is profit maximization). We see that in the way being visibly queer online used to be advertiser-friendly, and how much that changed this past year. Personally, I hold the politics of the people who run big tech companies in very low regard, but I think it's reasonable to assume that, at the very least, they're highly pliable to the status quo, and the status quo is pretty bad for a lot of people.

I'd be in favor of a firm stance against "for-profit" instances, but I agree with Justin and Zane that we could use a better definition than literally whether or not the organization running an instance is legally a for-profit entity.


Dynamic Sun 8 Sep 2024 9:53PM

@Nat

Thanks.

And I'd love to hear thoughts on what does constitute a for-profit instance if you have them!


Sieva Mon 9 Sep 2024 12:28PM

As others have said, for-profit and non-profit are not the categories we should operate with here.

To keep things simple, I would limit potentially dangerous instances first and open individual discussions about each of them to determine if we want to ban/unlimit.


Dynamic Mon 9 Sep 2024 1:30PM

@Sieva

Thanks for your thoughts. Can you elaborate on what "potentially dangerous" means to you? Are our existing policies, such as our code of conduct (https://wiki.social.coop/wiki/Code_of_conduct), sufficient to characterize potentially dangerous, or do we need to set additional policies to guide how instances are moderated by the Community Working Group?


Sieva Mon 9 Sep 2024 2:31PM

@Dynamic I would trust the Community WG on that. If the decision is not easy for them to make, we should have a broader discussion/vote here.

The code of conduct is a list of rules for people on our server; I don't think it can be applied to an instance like Threads.

Perhaps a separate guide describing when NOT to federate with an instance could be helpful; however, considering that the CWG has been doing a good job in this regard, it might not be necessary.


Dynamic Mon 9 Sep 2024 10:54PM

@Sieva

I think the lack of a clear policy on potentially data-harvesting corporate (or whatever) instances is a problem. One point raised in the current thread about the Bluesky bridge was that we don't have a policy on corporate instances. True federation with Bluesky isn't possible, but if it were on the table, the implication seems to be that of course we would federate with them, with no Limit or other restriction, because they're just another instance.

That leaves individual users who don't want to be part of the predatory internet playing whack-a-mole, trying to stay ahead by proactively blocking these kinds of instances themselves.

Or, alternatively, it means we're going to have to rehash the whole thing every time one of these comes up, which I for one really don't want to do.

So I want help finding a path forward.


Nathan Schneider Mon 9 Sep 2024 2:19PM

As many others have said, I don't think for-profit vs. non-profit is the relevant distinction. We should focus on developing and implementing our code of conduct, basing moderation decisions on behavior and policies, not economic model.


Matt Noyes Tue 10 Sep 2024 2:34AM

Wow. So no focus on economic models? "I don't believe that these words mean what you think." ;-) Obviously, profit vs non-profit is not a helpful dividing line, but clearly the intent is to resist the corporate takeover of the Fediverse, which is a reasonable concern. Ignoring their economic model would be akin to reverting to the old model of economics as severed from political, cultural, ideological, and other dimensions. Our code of conduct has a very limited scope and does not provide an adequate basis for charting a course through this conflict.


Nic Tue 10 Sep 2024 8:51AM

What about distinguishing 'companies shown to have put the pursuit of profit ahead of human, social and/or environmental factors'? (or something less wordy than that!).

I can see how Meta/Threads would be limited in that context, as they currently are, as would any pure data-slurping adtech/surveillance business. Bluesky I don't see as having crossed that line yet; it's more a fear about whatever their future business model turns out to be, afaict. But supposing X adopted the AT protocol and became a Bluesky node, then that would obviously be very different.


Dynamic Tue 10 Sep 2024 10:59AM

@Nic @Matt Noyes @Nathan Schneider

I suspect that assessment of potential harm may differ depending on which threat models we focus on. If the focus is on whether an instance engages in appropriate moderation of content, then "wait and see" for new instances in general might make more sense than if the focus is on the current internet norm of passive surveillance.

Acknowledging that Mastodon is absolutely crap when it comes to privacy, I still think there's value in trying to create spaces where people can at least know that the private posts they are making right now are not being warehoused to be sold to the highest bidder (whether for advertising, scams, AI training, or whatever). I feel confident that that isn't the case on social.coop or on volunteer-run instances. For instances run by businesses (or perhaps the better framing is instances run as businesses?), the question arises of what happens to the data in the case of a buyout, a reorganization, or simply a change in leadership.

I think our policies should be sensitive to this issue, and not put it on individual users to take precautions like screening new followers and switching to followers-only posts.

That said, as others have noted, distinguishing between for-profit and non-profit businesses might not be so helpful here. A few years back, my spouse took a job at a non-profit that shortly afterward underwent a reorganization in which most of the assets and staff were transferred to a publicly traded for-profit company.


Billy Smith Tue 10 Sep 2024 12:13PM

Instead of "for-profit" and "non-profit", what about "exploitative" vs. "supportative"?

The main question is who the technology is working for...

In a co-operative instance, the users of the instance and the owners of the instance are one and the same. We operate in a manner that benefits us and, by extension, everyone else.

In a techno-feudal instance, it's the instance-owner that makes the decisions, and everyone else on that instance has to agree with their decisions, or leave. Same as the rentier market for property.


Dynamic Tue 10 Sep 2024 5:36PM

@Billy Smith

I'll say that I have the impression that the majority of small instances have a "benevolent dictator" model and are run by volunteers, and that that's actually fine.


Evan Boehs Tue 10 Sep 2024 12:53PM

  1. Many Mastodon servers are technically, legally, for-profit, like "Fediverse Communications LLC" (https://fediverse.neat.pub/).

  2. There are fully for-profit companies that have proved to be good stewards, like Flipboard (https://about.flipboard.com/).

  3. I intend to run my own "for-profit" company on the Fediverse, at https://pinefore.com/, and as such I would need to recuse myself from this vote. But I personally believe small for-profits can act ethically and further collective goals.