Above: Dr. Diaz talks with someone who formerly trolled them
As Meta learned the hard way, avatar trolling is a core behavior in virtual worlds that needs to be dealt with on a constant basis. Surprisingly, there's not much shared research around the phenomenon, or many best practices for addressing it at a fundamental level.
Enter the Troll Project, founded by Dr. Ruth Diaz, Psy.D., someone with both an academic and experiential background on the topic -- including time as a VR Community Design Developer at Meta.
"I built a thriving and diverse community in Horizon Worlds, growing it to 6,500 users," as Dr. Diaz explains. "I focused on fostering an inclusive and supportive environment, which became a model for healthy community interactions." This was before the official launch of Meta's virtual world, but Dr. Diaz says this group wasn't adequately supported by the company:
"Meta and I parted ways due to a 180 degree change on how the app's community was supported and developed. Meta had introduced competition cycles within the community, creating a high-pressure environment that conflicted with my approach to sustainable and collaborative community building."
Since departing Meta, Dr. Diaz has been building on this approach, bolstered by fascinating in-world interviews in VRChat and other social VR apps, where players talk about their experiences with trolling -- both as the victim and the victimizer:
The Troll Project's YouTube page is an impressive compendium of often-moving interviews like these; the bulk were conducted in VRChat, but Dr. Diaz tells me the project will also feature interviews from other virtual worlds including Engage, Horizon, and Rec Room.
Mission statement from the site:
Troll behavior challenges the transformative potential of interactive technology. Trolling hotspots can indicate oversights in community design pathways.
The Troll Project explores the motivations, experiences, and impacts of trolls through immersive experiences. We are made up of professionals in psychology, community development, game design, and super-users of social-immersive applications.
Speaking of which, the Project is looking for people who want to get involved -- sign-up page here.
Thanks to Julian Reyes for first telling me about Dr. Diaz's work!
Update, June 10: Added a clarification that the project will focus on multiple virtual worlds
I admire Ruth's enthusiasm for and engagement with social VR; her work will certainly help individual people grow. I'm less optimistic about the large-scale effect of her work on toxic communities, because trolling and toxicity might be important for many virtual communities to stay stable (and relatively small) by limiting the influx of new users (and thereby avoiding the "Eternal September" problem). (I consider Second Life, VRChat, Rec Room, and Gorilla Tag examples of such communities.)
The way to grow a community beyond those limits appears to be the fragmentation of a community into smaller groups ("bubbles") with limited communication between groups. Facebook has perfected this approach, but long before Facebook (and other social media), competitive team-based multiplayer games already strictly limited communication between competing teams to limit toxicity.
Whether social VR can adopt this approach of fragmentation to grow larger communities is an open question. One approach might be that of "Walkabout Mini Golf VR": most people play it only with people who are already in their social bubble. Meta probably welcomes this approach if the social bubble of most players is defined by their Meta/Facebook-friends.
Posted by: Martin K. | Friday, June 07, 2024 at 06:02 AM
I appreciate your unusual pro-trolling position, Martin K., instead of the usual knee-jerk negative reaction. Having hosted hundreds of social VR events, I agree that a certain level of openness to where someone is coming from can turn trolling behavior into something more human than boots and bans.
But if I understand your point, you see trolling as a plus because it ensures a social metaverse of many, many teeny groups with limited contact between them. You suggest Facebook Pages represent perfection in this regard. I respectfully disagree. Facebook Pages have been shown to be a hotbed of manipulation and fake players. Facebook has come closer than many to perfecting stickiness, but not community.
You make sweeping generalizations about what has and hasn't worked. Speaking for myself, communities that have a stable base but also welcome newcomers are what keep me coming back to social VR. I don't want to be locked into a small bubble with just a few people who agree with me on everything. I enjoy the living networks I am part of that feature familiarity and newness.
Learning how to respond to behavior seen as trolling will help make social VR human and the kind of place I want to hang out in. Thanks for taking a position on this, Martin. See you 'round the metaverse; I hope we can keep talking.
Posted by: Thomas Nickel | Friday, June 07, 2024 at 01:40 PM
Yeah, I'm curious if you are basing that on any research or data, and/or whether you have expertise in this area, Martin? It's an interesting statement to propose that trolling could be stabilizing, considering I've spent decades working for online game companies, all of which have spent insane amounts of time and money building guardrails to try and dissuade and manage it. Honestly, it feels crazy that more deep-level research hasn't been done on this, and even this initial work makes me hopeful. My personal (non-expert) opinion on the idea of trolls as helping fragmentation is that there are *vastly* better ways to handle that problem, if they are indeed 'helping' that process.
Dr. Diaz (we should acknowledge her education and prior accomplishments) is doing the work to specifically understand how we can evolve beyond basic reactive community dynamics through community development and education. We can't grow if we don't try. I suppose you just said you aren't optimistic, though, and that is fair. That being said, while she is certainly very empathetic and helpful to individuals, it seems like her main driver is developing resilient community structures, and she's looking at the full spectrum.
Disclaimer that I've worked with Dr. Diaz directly, so I have a lot more exposure to her work than this article, but she has done that work on multiple platforms in addition to having extensive educational and professional experience in psychology and social science. As a game developer, I find her work extremely promising and am really glad to see someone putting serious effort into understanding trolls at a community-dynamics and psychology level. The fact that I've worked for so many companies building these environments and NEVER encountered anyone with this expertise or in-depth understanding as part of any of those efforts speaks volumes about how this has been treated as a basic requirement rather than given the time and expertise it needs to be really sorted out.
Posted by: Mike Hines | Friday, June 07, 2024 at 02:37 PM
Thomas, I'm not taking a "pro-trolling position". I'm just following the growth of some popular virtual communities over recent years based on Google Trends data and trying to understand what drives these communities. I hypothesize (A) that trolling and toxicity are part of the reason why Gorilla Tag, VRChat, and Rec Room are among the most popular social VR apps, (B) that they are part of the reason why the growth of these communities is limited, and (C) that this limited growth is "stabilizing" these communities in the sense that there is neither rapid growth nor rapid decline. (From a business perspective, stagnating community growth is very challenging; thus, this is not necessarily a positive feature.)
Is (A) controversial? Is (B) controversial? For (C), I was thinking of the exponential growth of Second Life in early 2007 and a similar growth pattern of VRChat around January 2018. According to Google Trends data, these rapid growth periods were followed by a similarly rapid decline of interest. Second Life has never reached its early popularity again. It took VRChat more than 2 years to reach the level of its early popularity again. A similar growth pattern of Usenet in the 1990s is sometimes called "Eternal September". Apparently, this kind of exponential growth is extremely challenging for virtual communities.
My hypothesis (C), then, is that the communities of Second Life, VRChat, Gorilla Tag, and Rec Room are currently not at risk of growing that rapidly because of the current levels of trolling and toxicity in those communities. On the other hand, the current sizes of these communities probably depend on some level of trolling and toxicity - without it, many of the current users would probably lose interest in those communities. In that sense, trolling and toxicity are stabilizing these communities. Again: from a business perspective, a stable, i.e., stagnating, community size is very challenging.
> But if I understand your point, you see trolling as a plus because it ensures a social metaverse of many many teeny groups with limited contact between them.
I'm afraid you didn't understand my point. I'm not trying to advocate anything, I'm just trying to understand what the role of trolling and toxicity is in popular VR communities like Gorilla Tag, VRChat, or Rec Room. Also, I'm trying to understand why Facebook was so much more successful than, say, Second Life, in terms of community growth. And then I try to extrapolate what the most likely future of popular social VR apps is going to be.
> You make sweeping generalizations about what has and hasn't worked.
I look a lot at Google Trends data and try to understand it (I'll sketch below how one can pull such a comparison). Thus, when I write about "success" or "popularity" I only refer to quantity, not quality. And quantity matters for the business success, and therefore the longevity, of social VR communities, as users of Altspace VR and Echo Arena and many other former VR communities know very well.
> Speaking for myself, communities that have a stable base but also welcome newcomers are what keep me coming back to social VR.
That sounds nice. Why don't you share the names of these VR communities? Are there social VR communities that are comparable in size to the communities of Gorilla Tag, VRChat, or Rec Room but far less toxic? I would be very interested to learn about them!
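As promised, here is a minimal sketch of pulling such a comparison. The library (the third-party pytrends package), the keywords, and the timeframe are illustrative assumptions, not my exact workflow:

```python
# Sketch: compare relative Google Trends interest for a few social VR apps.
# Assumes the third-party "pytrends" package (pip install pytrends);
# keywords and timeframe are examples only.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
keywords = ["VRChat", "Rec Room", "Gorilla Tag"]
pytrends.build_payload(keywords, timeframe="2017-01-01 2024-06-01")

# interest_over_time() returns a pandas DataFrame of relative
# search interest (0-100, scaled to the most popular term).
interest = pytrends.interest_over_time()

# Yearly means make growth-then-decline patterns easy to eyeball.
print(interest[keywords].resample("YS").mean().round(1))
```

The absolute numbers mean little; the growth and decline shapes I described above are what I read from outputs like this.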
Posted by: Martin K. | Friday, June 07, 2024 at 06:43 PM
Mike Hines wrote:
> Yeah, I'm curious if you are basing that on any research or data, and/or do you have expertise in this area Martin?
I'm just looking at Google Trends data and trying to make sense of it and of my experiences in social VR.
> It's an interesting statement to propose that trolling could be stabilizing
Wouldn't you expect VRChat and Rec Room to be more popular with regular people (as in: non-gamers) if there was less trolling and toxicity?
But if all trolls and toxic people were to magically disappear tomorrow from VRChat and Rec Room, wouldn't you expect many of the current regular users to leave the community because they enjoy, and are used to, the drama and action that comes with trolling and toxicity?
In my opinion, without trolling, there would be long-term potential for larger growth among non-gamers, but also a higher risk of business failure in the short term. In other words, it would be a much less stable situation.
> how we can evolve [...] through community development and education
Do you have success stories of how this could work at large scale, given the anonymity of the internet?
Posted by: Martin K. | Friday, June 07, 2024 at 07:41 PM
Wow, this is a FASCINATING discussion!
Dr. Diaz asked me to pop over here to read this specifically because of your initial comment @Martin, but I'm intrigued by your questions @Mike!
I was invited here to review this because you're _effectively_ talking about my research! I'm actually an online community architect of 10 years, a content creator in the social science of community, and a Ph.D. candidate at the University of Melbourne! I study the following research question, as funded by Australia's internet Safety-By-Design commission (my YT channel has more on this [1]).
> "How do online communities inoculate themselves from toxicity WITHOUT isolating themselves from the dominant context; by creating harder barriers to entry (bubbles) and encouraging homogenous group values?"
There's a LOT to unpack in that research question--that's what makes it good for a Ph.D.--but my assumptions may sound rather similar to Martin's hypothesis:
> "I'm less optimistic about the large-scale effect of her work on toxic communities because trolling and toxicity might be important for many virtual communities to stay stable (and relatively small) by limiting the influx of new users [...] I hypothesize that trolling and toxicity are A) part of the reason why [platforms] are among the most popular social VR apps, (B) in part why the growth of these communities is limited, and (C) that this limited growth is "stabilizing" these communities in the sense that there is neither rapid growth nor rapid decline." ~ Martin K.
The fun part about this is that I got my Ph.D. position based on identifying 7 major "burdens to community growth" that are directly related to your hypothesis (the burdens are outlined in source [2]). I think you might find a lot of vindication in your observations, but for reasons you may not expect ;D. The point of these burdens is that I believe there are intrinsic flaws in the social systems we use to create community ecosystems, and that communities organically patch those flaws with social interactions, rules, and cultural mores that address the 'symptoms' of those problems in some way, shape, or form.
That said, I'm quite literally the person that Dr. Diaz is doing this work FOR, so rest assured @Martin, she's already having a large-scale effect! I have a strong stake in her newly released work because she's doing really well at illustrating a key problem we have 'at scale' in the academic space. Specifically, the current academic discourse holds that 'trolling' and 'toxicity' are not necessarily one and the same. Trolls don't solely produce toxicity. And toxicity can exist in heavily governed communities that look, according to all our ways of measuring them, to be perfectly healthy and civil places [3].
For example, in Trott, Beckett, and Paech's paper, "Toxicity in the manosphere," they argue that we have to define the impacts of toxic or trolling conversations by why they happened, what became of those actions, and the scale of that contributor's impact [3]. Individual trolling contributions for clout's sake could mark someone who is going against the cultural norms of a community because it gives them attention, or because they believe the community is actually wrong about some belief. This is not necessarily 'toxic.' Instead, this is a lone act of "incivility" that we can consider simply acting up. We define incivility, then, as "an act that attacks, degrades, or calls for a community to deviate from its previously established values, goals, or practices."
This is broadly different from 'trolling' acts that come from a demonstrably more harmful source: what we'd consider 'toxic.' A toxic contribution comes from someone who was probably already harmed in some way by another community. This is a contribution whose sole intent is to explicitly harm the well-being of a community due to the 'poisoned influence' of another, already-soured community's impact on them. Definitionally, then, toxicity is "a measure of the potential harm one community's ideology may have on another, or on the health of the public sphere." These contributions are capable of harming a community's health because they are backed by entire swaths of a population who believe the same thing - a much more scalable threat.
One is a personal act of rebellion. The other is harmful cross-community spill-over whose impact causes polarization, bubbling, and worse.
A worthy illustration of this point, also from Trott et al., is that a clearly toxic community can very easily pass a sense-of-belonging survey and show that the community is perfectly 'healthy' without any troll behavior being present. After all, the bad behaviors are totally expected in that community, so there are no trolls. Everyone fell in line; it just so happens that the expected line is not a good one. This pretty much defines all of the communities in the great Reddit purges of 2017 and 2020.
Returning to this discussion, and why I think Dr. Diaz is making a massive contribution with the Troll Project: she's produced a massive database of interviews that work like mine requires to tease out the difference between harmless acts of 'incivility' and critical, toxically powered actions.
She's, like, nailing step 1 in the research process on this topic I'm working on, so I can skip it. Coincidentally, your comments also helped me validate my research! So interviewing each of you would be AMAZING as a way to help me augment mine as well ^^
It's a great point that toxic trolling can act as a way to homogenize community values and force the burden of stratification into smaller and smaller groups, but acts of incivility that we often label as trolling could use humor in order to call for change in a community.
Truly incredible and fascinating stuff!
-------------
[1] primer on my research: https://www.youtube.com/watch?v=7h8NjH-OS3g&t=1s
[2] The 7 burdens of community growth: https://www.youtube.com/watch?v=SL1YcAWd3Fo&t=700s
[3] Trott et al. (2023): Operationalizing Toxicity in the Manosphere: https://drive.google.com/file/d/1QpPCekEpPh790EwBWXcbLN_yfsHSidXy/preview?usp=drivesdk
Posted by: Samantha Venia Logan | Saturday, June 08, 2024 at 03:22 AM
@Martin -- I'll reply, but these are my own ramblings. I do not purport to really understand any of this fully and don't claim any expertise on the underlying social psychology or data.
Nothing is going to happen magically or quickly, and I don't think anyone is proposing or expecting that. Anything worthwhile will most likely take a very long time to implement and will be a gradual change that populations adapt to. But as someone who has done a ton of professional work around "how do we get the general public into VR," it feels like opposite land to think that VR is either (A) sustainably successful or (B) stable. If the goals of an industry rely on trolling behavior as their anchor, it will not survive. XR had an early (rebirth) moment in the sun, but it's currently relying on a lot of people investing a LOT of money on the longer bet of widespread usage and adoption.
I deeply believe in the potential of XR and it is my entire professional focus, but its success is certainly not guaranteed. In that pursuit, at least based on my experience, one of the most notable blockers of widespread adoption is trolling. Hardware comfort and overall user comfort are a blocking pain point, and those issues have prevented a lot of the more robust development, which is a second major reason. But trolling and new-user anxiety over interacting in spatial environments definitely feel like a major issue for widespread adoption. And without widespread adoption, at some point ALL of the things you're talking about will fail. You can't have a massive, hundreds-of-billions-of-dollars, hardware-based industry propped up by a subset of gamers and a few relatively low population VR social platforms.
It's *possible* that gaming culture could make XR moderately successful in and of itself if there were high enough saturation in that market, but people have been aggressively pursuing that hypothesis as the starting point for most of its history, with the justification that "early adopters and users of the headsets will be gamers". Personally, I think that is alienating the majority of humans who would find vast numbers of meaningful uses for XR if they didn't think of the hardware as a gaming device and if a big chunk of their exposure to it wasn't videos of people trolling in VR. I've done a lot of VR demos for people outside of the gaming sphere, and "Oh, I just thought those were gaming devices for kids" is usually the first thing people say. They then put the headset on and have powerful, perspective-changing, emotional reactions.
We need to change the narrative. We *need* the general public in XR, and I believe it is happening, just not as fast as it could be if we had better solutions for these challenges. XR and the 'metaverse' are really just variants and/or layers on our existing online communities, and just like the internet, every niche and group will still have their domain where their culture (including trolling) can continue if desired by the majority of that community, but the majority of humans aren't looking to be trolled. We need solutions for all of those groups and ways to make online communities as healthy as they want to be. That includes ways to include, engage, and otherwise incentivize and motivate people who enter a community that doesn't want trolling.
And on that note, I think understanding the motivations and reasons for trolling behavior will let us incentivize, call in, and engage with many people who are trolling. I specifically think this work is valuable because it isn't about punishing, banning, etc. It is about understanding and pulling in (at least as I understand it). That is the ONLY thing that works with anonymity.
Posted by: Mike Hines | Saturday, June 08, 2024 at 09:36 AM
@Samantha -- Great to hear about all of that work and very interested to follow it. Happy to do an interview with the qualification that I am just spouting my opinions ;)
I would type more, but I overdid it on the reply to Martin and have a massive pile of work to do. Happy to talk to you about it directly more if we do a call or something though. My contact is on my website, but I think Dr. Diaz may have already connected dots there.
Posted by: Mike Hines | Saturday, June 08, 2024 at 09:41 AM
@Mike: I agree with most of what you are saying; in particular that the "few relatively low population VR social platforms" won't get the general public in XR.
> XR and the 'metaverse' are really just variants and/or layers on our existing online communities, and just like the internet, every niche and group will still have their domain where their culture (including trolling) can continue if desired by the majority of that community
That's what I meant by "fragmentation" and "bubbles": relatively small online communities that can agree on a specific culture, or at least on a working set of minimum social rules, including rules for how to deal with members who keep violating those rules (which might include any number of attempts at understanding and pulling in those members). That scenario stands in contrast to the "big happy family" ideal that some of today's popular social VR apps are apparently pursuing.
Posted by: Martin K. | Saturday, June 08, 2024 at 11:14 AM
@Samantha: Thanks for your thoughts! I'm afraid I don't quite understand the role of the semicolon in your research question; thus, I'll just ignore it.
Also, I'm not sure I understand your definition of incivility and the precise difference between trolling and toxicity - specifically when it is about trolling with discriminatory undertones.
> Individual trolling contributions for clout's sake could mark someone who is going against the cultural norms of a community because it gives them attention, or because they believe the community is actually wrong about some belief. This is not necessarily 'toxic.' Instead, this is a lone act of "incivility" that we can consider simply acting up. We define incivility, then, as "an act that attacks, degrades, or calls for a community to deviate from its previously established values, goals, or practices."
Maybe some examples could help me to understand this better:
Is it an incivility if a male user tells a female user to make him a sandwich/to bend over because he wants to impress other male users and thinks that he should be allowed to make "jokes" like this?
Is it an incivility if a man calls a woman the "c" word and thinks that he should be allowed to do so because it is just how he usually talks in his local community?
What if someone tells one of the countless Anne Frank jokes making fun of her death because they think that it is just a "dark" joke that should be allowed?
(My thinking about such jokes is influenced by the fact that Nazi propaganda in Germany in the 1930s intentionally spread jokes about Jewish people to make racist attitudes more acceptable in German society: someone might think that they just repeat a dark joke, when in fact they are an active part of a very successful racist propaganda machinery.)
I feel I should first better understand where you draw the line between trolling and toxicity. One of my concerns is that trolling is often motivated by toxic attitudes but intentionally toned down not to cross the threshold to be sanctioned by a community. Sometimes a lone act of trolling triggers a long sequence of similar acts, which in their entirety clearly convey a toxic attitude. Sometimes the purpose of trolling is to test how far one can go with expressing toxic attitudes without push back by one's audience.
Thus, when it comes to trolling with discriminatory undertones, I honestly have no idea where one should draw a line between trolling and toxicity, but my gut feeling is that your definition might be easily exploited to spread toxic attitudes by "lone acts of incivility".
Posted by: Martin K. | Saturday, June 08, 2024 at 01:54 PM
@Martin -- I actually meant that a few VR social platforms and games can't financially prop up the industry. I think social VR platforms have been one of the few things actually pulling *some* of the general public in, but the toxicity and trolling are certainly a major block there. We all know the realities of someone trying out public VRChat worlds. That being said, I don't think the social VR platforms get us there alone by any means, even if we managed to make significant strides with the toxicity. Ultimately, XR will succeed with the general public when it solves real problems. But many of those problems are related to the various areas I mentioned (education, business, arts, performances, etc.), and without better systems to manage this, they are all at major risk of never being viable. Closed sessions, groups, and invites certainly work to a degree, but they add friction to onboarding and use, and they greatly limit spontaneous connections and natural community growth.
In my work we've called 'fragmentation' (which feels kind of negative) interest-based sorting, or enticing people into communities that align with their interests. I don't believe anyone thinks everyone should just generically hang out and be a singular community. At least I haven't heard that proposed or treated as a goal. I think there is a sense that those collective, interest-based or cultural/social groups can exist as a meta-community that is joined in its love for the overall ecosystem and its belief that the underlying infrastructure is a shared resource they all support. But I don't have direct insight into everyone's approach, obviously.
I personally don't believe we will ever eliminate all outliers who just want to tear things down too. At least not for many, many generations. Most of our systems are built on artificial scarcity and profit motive at this point, not even getting into personal traumas and other broken systems that are just hard problems to solve. So even attempting to build cohesion among people is fighting that deluge. I still deeply believe it's worth it though, because none of this is binary. It's a gradient, and the more we do to empower and build communities to be resilient, the further we shift our path along that gradient to thriving.
Also, regarding the response to Samantha -- I think it's always going to be messy, but the gradient obviously exists. All trolling is not equal in destructiveness or malicious intent. I agree that it feels very messy, and I think it will always be that way. It's the same reason that the difference between manslaughter and murder is significant from a legal perspective in most systems. Intent is still important in how you approach addressing the issue. Treating a malicious murderer the same as someone who created a risky situation that led to a fatality is ignoring the actual human and how to best approach rehabilitation, or at least attempting to avoid others following the same path. It's certainly true that both a slippery-slope offender over time and an egregious, malicious offender in a short time can reach similar levels of damage, or that over time the slippery-slope scenario can even be worse and more insidious.

That being said, I've seen multiple instances of trolls who were attention-seeking or trying to disrupt systems they saw as bad otherwise engaged and converted into allies and powerful contributors to a better community. That was particularly common during my years in MMO development. That is the reason to understand 'incivility'. Not because it can't be damaging, but because you can approach those individuals with a different set of tools and likely have a massively different outcome. That's at the core of the work Samantha and Dr. Diaz are doing, based on my understanding.
Posted by: Mike Hines | Sunday, June 09, 2024 at 10:53 AM
@Mike: Again I agree with most of what you are saying.
> I don't believe anyone thinks everyone should just generically hang out and be a singular community. At least I haven't heard that proposed or treated as a goal.
I think that some early social VR apps that started very small in 2016 felt that it was almost a necessity that most of the few users who happen to be online at any point in time can hang out or play together. The Chief Creative Officer of Rec Room Inc. (formerly Against Gravity) discussed their approach when describing their "social mission" as "Rec Room is a fun and welcoming environment for people from all walks of life" https://youtu.be/AXqe-wKwhXs?si=KUCSsnSnFm3smawn&t=312 . Since this is a mouthful, he sometimes has shortened this mission to the ideal of a "big happy family" (e.g. https://youtu.be/z1CHWG-xA-s?si=BgQ4nmNsEST9bCze&t=212 ).
With the experience of how this has been working for Rec Room in the last 5 years, this mission might appear overly optimistic and even somewhat naive. But even if you enter Rec Room today, you can still see this mission in action: there is still the "Rec Center" as a public community hub for all players, and when you enter one of the built-in worlds, you are nudged towards entering a public instance of the world with voice chat available by default (unless you deactivate it or you specify that you are younger than 13 years when creating your user profile). And to the credit of Rec Room, it is very easy to find and talk to other players in Rec Room - and not all of them are trolls.
Posted by: Martin K. | Monday, June 10, 2024 at 06:33 AM
@Martin -- Thanks for the specific points. I think the trick with Rec Room is that it is *already* an intents/interest-based sub-community. They have always had a very strong narrative and focus on youth-focused gaming. Hosting anything else on Rec Room (especially anything serious or non-game) just feels inherently off. I've never really considered it as a platform for any other type of experience because anyone not in that community will be filtered out by the branding and onboarding. Even VRChat has some of that, but it's certainly more broad. Its branding is certainly geared toward a type of interest demographic, although it's kind of hard to pin down. Roblox is obviously the same too. Altspace was more broad in application, but their onboarding and avatar style certainly trended toward playful. And things like Engage push notably toward business and therefore filter out gamers.
Essentially I don't think anyone has really succeeded with a more general public, open-ended entry funnel. Which makes sense, because it makes acquisition much harder. And of course, even if everyone had the same interests, you're still going to get sub-communities based on connections, factions, etc. So even if you have a good solution for all of these aspects, you still need to deal with self-segregating communities and how they work collectively. That's just humans.
I just don't think trolling should be considered a viable or stable solution for filtering or segmenting communities, even if it's currently acting in that way at some level. It has way too many detriments to make that acceptable, and I am convinced there are better ways to do it. We just need to be much more intentional about it. Rec Room's branding, onboarding, and core content were an attempt to do that, and it's generally made their product more stable than most in the VR sector. Their 'one big happy family' is just a well-defined subcommunity already. It's just one that embraces trolling as a core social activity (or so it seems to me - I haven't spent a lot of time on the platform because I was filtered out by the vibe), so that feels like a different problem.
Posted by: Mike Hines | Tuesday, June 11, 2024 at 10:33 AM
RE: Your questions about toxicity vs incivility:
Thanks for the questions @Martin :) I'm happy to elaborate in what I hope will be a more helpful way.
First, I think it's important to consider the fact that all community interactions will occur within a specific set of member roles - most importantly the key stakeholders who have sway and control over the community guidelines, codes of conduct, terms and conditions, and membership processes for any one community. We can view this structure as a "status-quo" system put in place to allow community builders to sway community culture as a whole. You herd cats by tilting the floor.

As new members enter this environment, they will follow a process where they find value in being in the space. In exchange for that value, and potentially a sense of belonging, they are expected to conform to specific guidelines and restrictions on their participation. Sometimes they may break a few rules, and the established members will work to right that supposed lapse in behavior as they learn. This is a very standard enculturation process in communities.

As you noted in your comment to Mike, community builders will generally use the community guidelines and codes of conduct as a way to filter out and enculturate members into that space. The role of these key stakeholders is to guide that enculturation process toward a healthy(ier) community.

We can break down this community-building process into establishing cultural beliefs that include (in order): the goals, values, experiences, practices, and artifacts that define a community. Each 'input' builds upon the others to enrich the cultures of the spaces they belong in. All members participate in fostering those inputs.

So, what do we have as a result? We have a status quo of people in power, looking to build established guidelines for these 5 inputs of community culture. The expected outputs of that culture should include a sense of purpose, membership, emotional belonging, and practice (McMillan & Chavis, 2008). Between these inputs and outputs, we have community members who see value in a space, but who all have different definitions of what the values, goals, and purpose of that community might be. The shared construction of that community is in constant flux. This is the inherent flaw and function of democracy: if all people have a voice, all voices must be considered, but no action can respect all sentiments.
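If it helps to see the shape of that model, here is a toy sketch in code. It is purely illustrative naming that just restates the framework above - not an operational measure of anything:

```python
from dataclasses import dataclass, field

# Toy restatement of the framework above: five cultural "inputs"
# (in order, each building on the previous) that members and
# stakeholders co-construct. Illustrative only.
@dataclass
class CommunityCulture:
    goals: list = field(default_factory=list)
    values: list = field(default_factory=list)
    experiences: list = field(default_factory=list)
    practices: list = field(default_factory=list)
    artifacts: list = field(default_factory=list)

# Expected outputs of a healthy culture (per the citation above).
EXPECTED_OUTPUTS = [
    "sense of purpose",
    "membership",
    "emotional belonging",
    "practice",
]
```

The point of the sketch: everyone contributes to the inputs, but each member privately holds their own version of them, which is exactly why the shared construction is in constant flux.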
Crucially, when members don't receive their perceived outputs for participating they'll usually attempt to change the way they contribute to that community's inputs. Usually, that's totally fine. You get out only what you put in.
Unfortunately, it is here that many members will act contrary to the established status quo (incivility), and we have to better judge the reasons why that happens. Some who do not feel they have gotten what they need from the space might act to secure for themselves what they are missing through harmful behaviors. These behaviors can go against policy, but that could be for good reason: something might be unfair, and protest might be necessary. Conversely, an act of incivility could also harm other members around them to selfishly get what they want out of the community. This can range from harmless behaviors that need a small step-in to deeply upsetting ones that require community action at scale.
Then there's the other, far more harmful origin of that behavior: actions that are systematically performed explicitly to harm or deviate a community from its healthy status quo, or whose intent is to bring harm to members and the public good. These behaviors may be sourced from harmful ideologies generated in toxic communities, or they may be the result of an individual who has a foot in both a toxic and a healthy community and simply can't tell the difference. Regardless, these more harmful actions allow 'toxicity' to spill over.
My research is about better telling the difference in this super-mushy grey area. Many things in this system can go wrong, but some of them are a special level of bad. Established key stakeholders could introduce a status quo that is inherently good for them but harmful to the community. Over time this can become toxic without any one individual having actually done something wrong. Conversely, new members could come in who also have a presence in toxic communities, bringing that toxicity with them. Or you can have people who seek to do harm choose to play nice for a while until they have the reputation needed to garner power, and then they 'flip.'
Or a community member could go drunk with power. That's a thing too.
We currently do not have a way to measure or understand the difference. To paraphrase Justice Potter Stewart: we cannot at this moment attempt to define what it is, but we know it when we see it.
My job is to eventually identify and predict whether actions are mere incivility or truly toxic - to put metrics in place that will help community builders identify the differences, so they know when they must listen empathetically and when to whip out the ban-hammer.
So, to answer all of your questions in one go: the difference between an act of incivility and one of toxicity is one of intent - but we have no current measures for that intent that allow us to determine, at scale, the difference between someone who is fighting the good fight and someone intentionally gnawing away at the health of a community. We can only do so case-by-case, and when large communities receive 100,000 moderation tickets per day, case-by-case is not possible.
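To make the scale problem concrete, here is a purely hypothetical sketch of the kind of triage metric I mean. Every signal and threshold below is invented for illustration - none of it is a validated measure of intent:

```python
from dataclasses import dataclass

# Hypothetical moderation-ticket triage: auto-route the clear calls,
# leave the mushy grey area to humans. All signals and thresholds
# are invented for illustration; none are validated measures.
@dataclass
class Ticket:
    prior_reports: int            # earlier reports against this member
    communities_reported_in: int  # distinct communities with reports
    targets_protected_group: bool # discriminatory undertones present?
    first_offense: bool

def triage(t: Ticket) -> str:
    # A lone, first-time act of acting-up looks like "incivility":
    # an empathetic step-in is the right first tool.
    if (t.first_offense and t.communities_reported_in <= 1
            and not t.targets_protected_group):
        return "incivility-leaning: empathetic step-in"
    # Repeated, cross-community, or discriminatory patterns look like
    # the toxic spill-over described above: escalate for review.
    if (t.prior_reports >= 3 or t.communities_reported_in >= 2
            or t.targets_protected_group):
        return "toxicity-leaning: escalate (ban-hammer review)"
    # Everything else is the grey area a human still has to judge.
    return "needs human review"

print(triage(Ticket(prior_reports=0, communities_reported_in=1,
                    targets_protected_group=False, first_offense=True)))
print(triage(Ticket(prior_reports=5, communities_reported_in=3,
                    targets_protected_group=False, first_offense=False)))
```

Even a crude sorter like this only buys moderators time at the easy ends of the spectrum; the intent question in the middle is exactly what my research is trying to measure.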
Posted by: Samantha Venia Logan | Sunday, June 16, 2024 at 08:29 PM