The most fascinating story about next-gen AI may not be about DeepSeek or whatever OpenAI is announcing, but may instead reside in a 21-year-old metaverse platform with something no LLM can boast: a simulated virtual world with terabytes of constantly evolving, meta-tagged data.
That's my take, at least, after recently talking with Linden Lab heads Brad Oberwager and Philip Rosedale. And while the company added its own official AI-powered characters last year, the user community has been creating them for many years, since long before OpenAI existed. They recently got a demo from an SL community creator who connected LLMs to an upcoming RPG, and it blew their minds:
"The whole experience that he had built using AI was really very strong," Philip tells me. "And I think that's one of those cases where, if we can just get people to him, the ones that want to [play his role-playing game], it was just awesome." (More on that game later.)
They also thought, he added: "Okay, let's modernize this and make it easily expand upon this stuff."
But doing that means exercising incredible caution: "We're running a lot of tests on characters that are infused with AI personalities," Oberwager explains. "They're sort of next gen NPCs. We ran a test, we got feedback, we took the test down. It's not like it went sideways, but some people didn't like some things, but that's the point of the test."
"Using Runway ML is actually straightforward," Berg tells me, explaining both the technical process and his artistic approach. "You start by uploading an image and describing what you want to see happen in the video. Then you wait to see how the output aligns with your vision. Personally, I’d estimate my success rate is about 25%. I define success as the output effectively conveying the emotion or sensation I’m trying to communicate.
"To make the process more efficient, I take notes on the wording and syntax that lead to favorable results for my specific goals. I’m mindful of the carbon footprint of this technology, so I prioritize producing meaningful, impactful results. If the outputs don’t show promise quickly, I end the session and reconsider the prompts or the source image. It’s easy to approach the tool like a hammer and just keep hitting anything and everything, relying on it to generate something interesting without much progress towards the original vision.
"With my paintings, for example, I was inspired by one of the more thrilling sensations in Second Life: the feeling of hopping just off the ground, hovering, and zooming closely over the landscape. I wanted to capture that same energy in my real-life paintings, so I tailored my prompts in that direction.
"Once the painting is adjusted in Photoshop, I upload it to Runway, add a text prompt I think will work, and wait for the render. It’s a bit like baking a soufflé—you hope it rises but prepare for the possibility of collapse."
"The resulting output was exciting enough to explore further. It sparked a realization about AI’s potential in art. The experience reminded me of when I first started building with prims in Second Life: the thrill of exploring a new medium of expression with an incredibly low barrier to entry.
"For me it’s a precision thing. Creating felt more precise in Second Life than painting on a canvas. Creating with AI feels more vague than painting, blurry in a way that’s difficult to focus. But this is temporary. We’re in the Daguerreotype era of AI."
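Berg's workflow above -- render, judge the output against the intended sensation, keep notes on prompt wording that works, and end the session if nothing shows promise quickly -- can be sketched as a simple iteration loop. To be clear, this is a toy illustration: `generate_video` is a stubbed stand-in, not Runway's actual API, and all names and scoring here are hypothetical.

```python
def generate_video(image, prompt):
    """Stand-in for an image-to-video render call (hypothetical; not Runway's real API).
    'Success' is faked here: prompts mentioning the target sensation score high."""
    score = 0.9 if "hover" in prompt else 0.2
    return {"prompt": prompt, "score": score}

def iterate_prompts(image, prompts, threshold=0.75, patience=3):
    """Try prompt variants, keeping notes on wording that led to favorable results.
    If several renders in a row show no promise, end the session rather than
    hammering away at the tool."""
    notes = []   # prompt wording/syntax worth reusing
    misses = 0
    for prompt in prompts:
        result = generate_video(image, prompt)
        if result["score"] >= threshold:
            # Output conveys the intended emotion/sensation: log what worked
            notes.append(prompt)
            misses = 0
        else:
            misses += 1
            if misses >= patience:
                # Outputs aren't showing promise quickly: stop and rethink
                # the prompts or the source image
                break
    return notes
```

The `patience` cutoff is the interesting design choice: it encodes Berg's point about not treating the tool "like a hammer," trading raw output volume for deliberate, lower-footprint sessions.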
All of this, of course, brings up numerous thorny topics about the clash between traditional art and gen AI. Berg is one of the few people highly qualified in both fields to discuss it with nuance. (Along with his SL art, he's an academically trained painter; on the tech side, he was a designer at IBM and more recently, worked on a visualization project for NASA, among other coolness.)
So Berg has a solid background from which to consider, for instance, the future of traditional art in the AI era, especially as it evolves:
"Although we might view the introduction of AI media through the lens of anti-AI sentiment as many do, that very sentiment could instead be viewed as a renewed appreciation of handcrafted works," he argues. "Regardless of one’s opinion of AI art, it has people talking about art and human agency in ways we haven’t in a very long time.
"The shaman who told stories by the community fire as the shadows danced on the cave walls may have taken exception to written glyphs, wondering how the human experience would be retained on cold stones. Despite the spectacle of AI, these themes and concerns are ancient. I’d be more worried if we looked at AI and rejected it wholesale than having the courage to see what it means to be human in a world in constant tension with the technology we invent."
Does that imply he plans to use gen AI in his own "official" works of art? In other words, works he'd show to the general public in a gallery setting, or even a platform like SL?
Background: Historically we've mostly featured images from SL's Flickr community for various reasons (longevity, embed features, etc.), but we also want to highlight Primfeed's fairly large virtual world community as much as possible.
In any case, please tag your favorite Primfeed artists in the comments so we can follow and/or feature them!
Cajsa Lilliehook covers the best in virtual world screenshot art and digital painting
Cursichella recently posted this delightful picture, and I immediately thought: she’s Pippi Longstocking all grown up. I love the mischievous spirit she portrays. She would definitely lead an expedition to a circus.
For more of Cursichella’s spirited pictures, click here:
I keep being shown a viral Instagram post by someone tagging major real-world logos that show up, unauthorized, in Second Life. So let's talk about it.
Will anything happen because of this one post tagging RL brands? Debatable. However, if someone were to professionally write to the legal teams at Chanel, Balenciaga, Kellogg's, McDonald's, etc., yes. Something could happen.
Only not in the way they think.
When you sign into Second Life for the first time, or whenever Linden Lab updates the Terms of Service, you have to agree before proceeding. Does anyone really sit and read the whole ToS? Probably not. But let me hit you with a couple of lines from section 6:
Linden Lab encourages the creation of original content in Second Life. You should not use copyrighted, trademarked, or celebrity material in Second Life.
So those items I showed you the past couple of months with Versace and Vuitton on them? You guessed it. They're breaking ToS.
"But, Ali. Fashion can't be copyrighted!"
No, it can't, for the most part. You can absolutely recreate a pair of Alice+Olivia jeans in the cut, style, and shape. What you can't do is take one of their embellished jeans and recreate them down to the EXACT graphic design down the legs. You could recreate them and put flowers down the legs, just not the exact ones that are designed specifically for Alice+Olivia.
You can recreate a Juicy tracksuit. You cannot put the exact Juicy logo on the butt.
Langstrath Valley shot “Windmills, Sancho? I see Giants!” which portrays the most famous scene from Don Quixote... I loved the book when I read it in high school Spanish and dragged my friend to see the statue at Plaza de España. I love the book far more than “Man of La Mancha,” the musical.
Why did I decide to hunt for Don Quixote in Second Life? Chatting on the phone with a friend, I characterized a mutual friend’s action as quixotic. I then wondered why we pronounced quixotic so differently from Quixote. I had my stream choosing music for me based on a Ben Webster tune and, rudely, The Impossible Dream began to play. I have never liked that song. The algorithmic overlords should have chosen Dulcinea. Anyway, it made me wonder about Don Quixote in Second Life. There were some lovely surprises such as Langstrath’s picture that is so true to the story.
For more of Don Quixote in the Metaverse, click here:
Or I should say, adding next-gen AI-powered NPCs, because as Brad notes, "the horse has been out of the barn" for years. But the promise and peril are far greater now that these NPCs can be more powerful than ever before:
"What we're trying to figure out is," he tells me, "how do we give people the opportunity to have these NPCs, to build these characters, without all of a sudden, you know -- the classic one that everyone says is, 'Without a bunch of Nazis running around.'"
Read it all here, and don't miss the fascinating (if a little creepy!) story of the LLM-powered surfer NPC that Brad once met -- it's totally gnarly.
... We need to drive sales, retention, and engagement across the board but especially in MR. And Horizon Worlds on mobile absolutely has to break out for our long term plans to have a chance. If you don't feel the weight of history on you then you aren't paying attention. This year likely determines whether this entire effort will go down as the work of visionaries or a legendary misadventure...
You don't need big teams to do great work. In fact, it may make it harder. One trend I've observed the last couple of years is that our smaller teams often go faster and achieve better results than our more generously funded teams. Not only that, they are much happier! In small teams there is no risk of falling into bad habits like design by committee.
Start strongly boosting our metaverse KPIs or get fired this year.
... because the only time the leaders of a large company talk about the value of small teams is when a huge staff is about to become a tinier one. Also, I've personally heard multiple insiders talk about a coming cut, recently -- and have been hearing about Reality Labs' "design by committee" problem since roughly 2019.
It's tragic that Meta's staff is now bearing the brunt of the fundamental mistakes that Zuckerberg and Bosworth themselves made over the last 10 years -- especially since these were foreseeable errors at the time.
To highlight just a few I've followed closely:
Zuckerberg/Bosworth spent billions to mass-market VR without researching (or knowing) why VR tends to make many women literally nauseous:
[Microsoft's] danah boyd is not an obscure researcher, but frequently cited in mainstream media and tech news sites. So when she ended her 2014 essay with a call for researchers across Silicon Valley to follow up on her initial findings, I assumed this would immediately happen.
It did not.
Reached while writing Making a Metaverse That Matters, she told me that few if any VR industry members contacted her after the essay was published. Neither did they even follow up with her in 2017, when a study published in Experimental Brain Research found that when women volunteers played a game in an Oculus VR headset, 78 percent of them experienced nausea. “To my knowledge,” she told me, “[Oculus and Meta] did not pursue any of those research questions.”
Over the years, I’ve asked several senior Meta staffers about this, [including Zuckerberg/Bosworth's PR team] and have received no adequate reply.
Zuckerberg/Bosworth did not follow the advice of former Facebook VP / Second Life co-founder Cory Ondrejka, Jim Purbrick, and other virtual world veterans they hired.
If you've been following the news about Facebook/Meta's metaverse project lately, you'll recall the slew of bad press when a female user was sexually assaulted in Horizon Worlds, leading the company to hastily add an avatar "boundary" system...
"I was literally banging the drum at Oculus Connect two years in a row," Jim Purbrick tells me, with evident frustration, even sending along the talk he gave on the subject at Facebook's own conference back in 2016. "I also told every new Oculus employee I met to read My Tiny Life in addition to Ready Player One, but the message didn't reach every part of the organization, sadly."
To the statement, "I believe Meta will successfully build the metaverse" (above), only 50% of employees answered yes. In November 2021, 77% of staff answered in the affirmative... 56% say their own CEO has not explained the metaverse clearly.
In a painful irony, Bosworth pinged me on Twitter when I pointed out that Meta seemed to be walking away from the Metaverse... in a statement that actually shows little knowledge of the concept. Worse, his own employees will now have to labor under the gun of that misapprehension.
Maloe Vansant has been a favorite artist of mine for years, something I said the last time I wrote about her. It’s as true now as then. Pictures like this are the reason why. The woman’s face is so beautiful, but in the texture of the skin we see the breakdown and deterioration. There are scratches and breaks in the surface and a growth of gold daubs of paint, flecks that flake off, a kind of beautiful decay. The title is “Sweet Dreams Are Made of This” and it immediately reminds me of the music video that frequently zoomed in on Annie Lennox and her castmates’ faces, but most often on her face. Even without the Eurythmics connection, though, this picture is extraordinary.
For more of the breathtaking imagery from Maloe Vansant, click here: