If you've been following the news about Facebook/Meta's metaverse project lately, you'll recall the slew of bad press when a female user was sexually assaulted in Horizon Worlds, leading the company to hastily add an avatar "boundary" system.
And if you've been following virtual world/metaverse development for any substantial amount of time, you've probably been wondering why Meta allowed this to happen at all. Understanding and preparing for avatar-to-avatar harassment, especially directed at female avatars, is a fundamental challenge. How did a company spending billions of dollars on making a metaverse platform of its own somehow miss lesson #1 from Metaverse 101?
As it turns out, Meta was warned about this many times -- by a well-known virtual world veteran who was a senior member of the Oculus team. But somehow, his warnings, recommendations, and best-practice summaries went unheeded. And definitely not put into place.
"I was literally banging the drum at Oculus Connect two years in a row," Jim Purbrick tells me, with evident frustration, even sending along the talk he gave on the subject at Facebook's own conference back in 2016. (Watch below.) "I also told every new Oculus employee I met to read My Tiny Life in addition to Ready Player One, but the message didn't reach every part of the organization, sadly."
My Tiny Life, of course, is Julian Dibbell's classic account of virtual world sexual assault... from the 1990s. Yes, the problem has been well-known and documented for that long.
Purbrick, as regular readers know, was an early developer at Linden Lab, going on to consult with CCP, the developers of Eve Online, before joining the Oculus team. He also documents virtual world/metaverse best practices on his blog here.
And when he joined Facebook's XR team, Purbrick took pains to carry over the wisdom learned from Second Life and from the knowledge base of virtual world development in general:
"I talked to [founding Linden executive] Robin Harper when I was working on this at Oculus to make sure I learned the lessons from her experience at Linden," Jim tells me, "as well as Raph [Koster] and Daniel James: the best practices have been known for a long time." (James is a fellow virtual world veteran who also worked at Facebook, until 2017.)
Purbrick left Oculus/Facebook in 2020, but not before advising the company on a system for minimizing avatar harassment:
"When I was last working on avatars I was proposing fading out avatars when they got close to avoid creepy and disturbing intersecting geometry," he tells me.
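Purbrick doesn't describe the implementation, but the idea -- fading an avatar's opacity as another avatar approaches, so models never visibly interpenetrate -- is straightforward. A minimal sketch, with hypothetical fade thresholds (the actual distances any platform uses would be a tuning decision):

```python
def avatar_alpha(distance_m: float,
                 fade_start: float = 1.5,
                 fade_end: float = 0.5) -> float:
    """Return avatar opacity in [0, 1] given distance in meters.

    Fully opaque at or beyond fade_start, fully invisible at or
    inside fade_end, linearly interpolated in between. The 1.5 m /
    0.5 m thresholds here are illustrative, not Meta's or Oculus's.
    """
    if distance_m >= fade_start:
        return 1.0
    if distance_m <= fade_end:
        return 0.0
    # Linear fade between the two thresholds.
    return (distance_m - fade_end) / (fade_start - fade_end)
```

Unlike a hard collision barrier, a fade like this degrades gracefully: nobody's movement is blocked, so it can't be weaponized to pen other avatars in -- which is exactly the blockading problem Purbrick raises below.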
By contrast, Purbrick isn't convinced Meta's barrier solution is a good one:
"I don't know the details of the personal boundary plan," as he puts it, "but it has historically been a bad idea as it allows bad actors to blockade avatars and stop free movement." (I can confirm that as well. Again, this is also Metaverse 101.) "I think we did a pretty good job with Oculus Venues, where we had the ability to implement a good set of tools and policies," he adds.
As he departed the company, Purbrick spoke directly about the topic with developers of Meta's consumer metaverse platform:
"I was talking to the Horizon team when I left Facebook and at least some of the team were aware of the issues and best practices, but the work clearly didn't get prioritized," as he puts it to me with classic British understatement.
It is truly mind-boggling, and affirms what I've heard elsewhere, that Meta's Horizon project is beset by a lack of design direction.
As for what this says about Meta, I'm thinking about the company CTO, who only last November was saying bad metaverse moderation could pose an "existential threat". But if Meta really believes that, why did it ignore best practices around virtual world moderation that have been around for literal decades -- even after it was paying someone to relate them to the team?
lol about to find and close the nearest pool in the fb world
Posted by: Jessica Pixel | Tuesday, February 08, 2022 at 05:24 PM
You were warned. You were warned by someone informed by past experiences and scholarly observations in the matter. You ignored it. Meta, this is why the news that was already bad when it occurred on a smaller, less-watched virtual world blew up twice as hard on yours - you didn't think to work on blocking/reporting or permissions tools. Now, suffer.
Also: Pool Closed Due to COVID-19.
Posted by: camilia fid3lis nee Patchouli Woollahra | Tuesday, February 08, 2022 at 06:18 PM
Do you remember when Google launched their failed virtual world Lively back in 2008? Check this out: I registered and logged into Lively, and within, I’m serious, my first five minutes, my avatar was lying on the floor of a reception area, beaten up by other avatars and bleeding out virtual blood. Lively allowed avatars to punch, kick and hit other avatars with objects and knock out other avatars, causing them to bleed and show damage on their avatar body. This was Google’s idea of a virtual world and my first five minutes of Lively. I deleted my Lively account in the sixth minute.
Posted by: Luther Weymann | Wednesday, February 09, 2022 at 02:54 AM