Originally published on my Patreon
Second Life’s long-awaited mobile app launched last June to Premium subscribers, and is showing decent adoption so far, already attracting about 1 in 3 of them. While development and design were led by a team at Linden Lab, the company had a secret weapon in its creation: Sine Wave Entertainment, the developer of Unity-based virtual worlds Sinespace and Breakroom.
“It wasn't very stable,” Sine Wave CEO Adam Frisby allows, showing me the very first test version of the app (above), which his team slapped together in two weeks. “Crashed every sixty seconds on the device, but we got one of the Welcome Islands rendering, and some of the avatar content. It was quite a while before we then actually officially started on this.”
While Adam is not with Linden Lab, his history with Second Life stretches back nearly 20 years, to when he was better known as content creator and land baron Adam Zaius. His engineer Joshua May, better known in SL as Karsten Rutlledge (creator of the popular SL game Greedy Greedy), also drove mobile development on Sine Wave’s side.
Once the mobile app project was launched, this early test version was scrapped. Sine Wave’s team stretched across the globe, with Adam in Australia, Joshua in the US, and development and QA support from Sine Wave’s office in Shanghai. (“Poor Miklos and Kristoff have been running thousands of QA tests every week. And sometimes that includes the rest of our QA team too, particularly for scalability testing -- Louie and his team.”)
Even with all that development over nearly 2 years, Second Life’s mobile app is far from complete:
"It's going to take a while before we get mobile exactly the same as the desktop experience (and some of that may prove impossible or impractical),” as Adam puts it. “But there's definite resolve from Linden Lab to make it happen, and a lot of the effort up to now has been focusing on the hardest problems first -- just getting the world to render at more than 20 frames per second on device took an insane amount of work.
For the first time, Adam tells the story of creating Second Life’s Unity-based mobile app from Sine Wave’s point of view, shares all the strange and hilarious bugs which popped up along the way -- plus offers tips for both users and SL creators who want to get the most from their iOS/Android experience:
If you enjoy reading this interview, please consider joining my Patreon for free or subscribing for premium benefits.
Wagner James Au: So when did you get the offer to work on the app, and how did it come about?
Adam Frisby: It was a very roundabout process actually; Philip introduced me to Brad a few years ago, not too long after Brad started running things at Linden Lab. We got talking, and we showed him some of the work we'd been doing on mobile, and it sort of started from there. It wasn't exactly one firm step - we chatted for a long while first, making sure this would work for both our teams and our respective long-term goals.
Obviously it helped a lot that we had experience with SL’s technical architecture, back from my OpenSim days; and over a decade of experience since with Unity. Plus we’re still working on Sinespace and Breakroom, so we’ve got the virtual worlds experience to boot.
WJA: What year did you guys actually start working on it?
Adam F: We started properly back in October '22, so we've been working on it for a bit over a year and a half. We did a little bit of prototyping before then though.
WJA: What kind of prototypes?
Adam F: Oh just getting scenes to load in Unity, proving the frame rate could be good enough, that kind of thing. SL is a rather unique thing in terms of rendering; avatars frequently have higher requirements than even the most graphics intensive PS5 games.
Adam F: So this was the very first "actual real avatar" rendering we did - a lot was janky. Much of the work in mobile has been "let's get it vaguely looking right", then "mostly right", then finally "bug for bug accurate with the desktop SL viewer".
Initially there was a lot of hope from both teams that we'd be able to approximate in tricky places -- but one of the lessons we learned the hard way was that you just can’t do that, so a lot of work has gone into documenting exactly how the official viewer works. Obviously, with over 20 years of development by a range of developers over that time, there wasn't a lot of accurate, up-to-date documentation about how avatars really work; a lot of what we've done has been figuring that out.
That early avatar for example was relying on a bunch of assumptions about things like skinning matrices which didn't actually hold up when we threw more content at it -- often we'd get a solution, we'd test it against our 20-30 test avatars, declare victory, then find a popular bit of content which violated all our expectations. In many respects, this has been a humbling project at times.
We have to pair that with other constraints too - one of the biggest challenges we have with mobile is just how few device resources we have to work with. While mobile GPUs are surprisingly capable, and the CPUs performant enough, mobile RAM limits are like stepping back in time 20-25 years, back to when SL first launched.
Depending on your device, we're fitting everything into somewhere around 500 to 800MB of RAM.
On the biggest devices, like say an iPad Pro, we might get 2GB reliably. That’s about an eighth of SL’s current minimum spec, let alone the recommended one; and we have to fit all our code and built-in resources too. It’s a really tight fit, and we have to account for every single kilobyte used.
That's not helped by the sheer volume of content the average SL scene might have - most other virtual worlds (or games even!) will have a few hundred unique assets visible at a time - with SL, our first real intensive tests (where we'd teleport to a complex sim) showed upwards of 50,000 unique assets loaded at once.
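To make that budgeting pressure concrete, here's a toy sketch (Python, purely illustrative -- not Sine Wave's actual cache) of the kind of bookkeeping a hard RAM ceiling forces: every asset's size is tracked, and the least recently used ones get evicted the moment the budget is blown.

```python
from collections import OrderedDict

class AssetBudget:
    """Toy model of a hard RAM budget: record every asset's size and
    evict the least recently used entries once the ceiling is hit.
    The default figure and the eviction policy are illustrative only."""

    def __init__(self, budget_bytes=600 * 1024 * 1024):  # ~600MB, a mid-range device
        self.budget = budget_bytes
        self.used = 0
        self.assets = OrderedDict()  # asset_id -> size in bytes

    def load(self, asset_id, size_bytes):
        if asset_id in self.assets:
            self.assets.move_to_end(asset_id)  # touched: mark as recently used
            return
        self.assets[asset_id] = size_bytes
        self.used += size_bytes
        while self.used > self.budget and len(self.assets) > 1:
            _, evicted = self.assets.popitem(last=False)  # drop the oldest asset
            self.used -= evicted
```

With tens of thousands of unique assets in view, even a scheme this simple has to run constantly, which is part of why every kilobyte matters.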
WJA: What are some specific examples of popular content that wouldn't work well on mobile at first?
Adam F: There's currently a lot of things we're still working on - deformers have been an ongoing challenge. There's a bunch of ways they can be loaded differently in the official viewer, and we've not quite figured that out 100% yet. That's not a sign to creators "don't use these!", just an area where we're still spending quite a lot of time figuring things out. These kinds of problems are going to need time to fix.
Another known big one is rapidly updating content; we do quite a bit of work on "batching" (a technical term, meaning we take lots of individual tasks and try to organize them into batch jobs that can run faster), and rapidly updating content breaks those batches in ways we don't currently handle, which can lead to a low update frequency.
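For readers unfamiliar with the term, here's a minimal Python sketch of the batching idea -- the data layout is invented for illustration and isn't the viewer's real code, but it shows why one rapidly changing object forces work to be redone for everything it was grouped with.

```python
from collections import defaultdict

def build_batches(objects):
    """Group objects that share a material, so each group can be
    submitted as a single job instead of thousands of individual ones."""
    batches = defaultdict(list)
    for obj in objects:
        batches[obj["material"]].append(obj)
    return batches

def on_object_changed(changed, batches):
    """A rapidly updating object invalidates its whole batch: everything
    sharing its material gets regrouped, not just the object that moved --
    which is why fast-updating content can drag the update rate down."""
    affected = batches.pop(changed["material"])
    batches.update(build_batches(affected))  # redo the work for the whole group
    return batches

scene = [{"id": i, "material": "brick" if i % 2 else "glass"} for i in range(6)]
batches = build_batches(scene)                  # two batches: "brick" and "glass"
batches = on_object_changed(scene[0], batches)  # the whole "glass" batch is rebuilt
```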
Finally, we know we've had a lot of requests to add PBR support to mobile - I had a great chat with the lead PBR developer, and we've figured out a way to get that in, without sacrificing too much extra RAM (always a challenge!), so we have that in the pipeline at the moment as well. It'll be a while before that hits devices though.
WJA: So here's a naive question: SL already runs on another 3D engine, so how is the 3D content created to run on the Unity engine?
Adam F: Generally it's all loaded at runtime; so we've got a bunch of code that adapts things to work in Unity's way of doing things. Some of that is pretty fundamental - i.e., in SL the Z axis is up/down, in Unity it's the Y axis.
For others, we do things like converting the original JPEG2000 textures into device-native formats before we can load them.
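As a rough illustration of the axis difference Adam mentions: SL's world is right-handed with Z pointing up, while Unity is left-handed with Y up, and swapping those two axes handles both changes at once. This is one common way to do the conversion, not necessarily how the mobile viewer does it.

```python
def sl_to_unity_position(p):
    """Map an (x, y, z) position from SL's right-handed, Z-up space to
    Unity's left-handed, Y-up space. Swapping Y and Z both puts up on
    the Y axis and flips handedness; rotations and normals need their
    own similar treatment, omitted here."""
    x, y, z = p
    return (x, z, y)

# A point 5 metres above the ground in SL ends up with y = 5 in Unity.
assert sl_to_unity_position((10.0, 20.0, 5.0)) == (10.0, 5.0, 20.0)
```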
WJA: So Unity translates SL's 3D data live on the fly?
Adam F: Not really, no, more we translate it for Unity, in mostly real time -- Unity itself is mainly just a display engine to us. It gives us good cross device support, so we don't need to reinvent the wheel for every device that's out there that we want to run on.
WJA: What's the advantage of doing that, rather than porting the SL desktop engine to mobile or building a new one?
Adam F: It'd be a total rebuild job in all likelihood, and then we'd be stuck building and maintaining an engine across both iOS and Android, as well as making it functional for users to be able to actually do things with it.
WJA: It would perform better though, right?
Adam F: Not really. In a perfect scenario, yes. Something hyper-optimized for a single use case will in theory perform better than a generalist solution; that’s always going to be true.
But you'll end up using a lot more resources getting there, and staying there - you have to stay current with iOS and Android versions; if you want to use the latest and greatest features, or even ship to the latest devices, you have to spend a lot of time just doing pure maintenance work, rebuilding for the latest graphics APIs when the vendors insist on it (such as OpenGLES to Vulkan, or Metal), and so on.
And the gain? Mostly academic. On the devices we're targeting, we're getting good frame rates - on the iOS side, we're almost always at the 30fps frame cap (uncapped it can go to 200fps fairly often). Android is a bit of a mixed bag, and has more variability, but on flagship devices, the results are similar; and we have plenty of room to maneuver still for optimizing performance further (including making it more accessible on lower end devices).
How much better would a dedicated engine be? Nowhere near enough to justify the added and ongoing work. Plus of course, that ignores the fact that the device manufacturers themselves optimize for Unity. They bug test new releases with Unity apps, they work with Unity to add support for new things - so that carries us forward, without having to do anything ourselves, that’s a big win. I like big wins.
WJA: So now SL is on Unity, should going to consoles and Steam be easier?
Adam F: Consoles yeah - obviously there's a lot of functionality needing to be added before that could be considered, but yeah - Unity has great support for a whole host of devices besides mobile, so going there is something to explore in the future, but it’s not something I’ve discussed in detail with Linden Lab yet. Still, it could be a possibility. Mobile first though.
Steam on the other hand, that could be done today; but that's something you'd need to ask LL about directly. Chances are it’s very messy, since there are payments and so on that have to be accounted for.
WJA: What were some of the weirdest challenges you all had developing the app?
Adam F: There's a lot of weird stuff about SL at a technical level, obviously the intensity of content is a challenge, but there's a lot of "stuff built on stuff in a different way" that makes reimplementing things challenging.
Take for example Bento - the skeletal update that added a whole host of new bones. Well, it turns out those bones behave differently to the old ones. The way the positioning is calculated, and whether bone scale affects a joint, is quite different under the hood. We have a whole little system set up to implement things in two different ways.
Or eyes. The SL "Ruth" base eyes have corrupted rigging information - the data in the files is junk and cannot be used. On top of that, getting the eyes positioned right requires arcane magic, and we’re certain the current implementation is still a bit off.
Speaking of Ruth, the Ruth avatar is actually a great example of some of SLs history.
SL was initially implemented back in the days before we had standards for how character animation should work in games -- I recall Half Life ended up setting a lot of the modern precedent, but suffice to say, SL's Ruth avatar stores rigging information in a way which took a little while to figure out, because it doesn’t have that genealogy.
Normally meshes in games and virtual worlds store information about which bone an individual vertex is weighted to - usually there's a limit like four bones per vertex.
SL doesn't do that. Instead SL stores just one value per vertex, but blends between that bone and the next bone in the chain - the result is similar, but the implementation is not. It's definitely an anachronism -- but it also shows that in building this viewer, we're engaged in some software archaeology too.
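Reading between the lines of Adam's description, that single value might expand into the conventional (bone, weight) pairs something like this -- an interpretation for illustration, not the viewer's actual code:

```python
import math

def expand_sl_weight(value):
    """Expand SL's single per-vertex skinning value into the (bone index,
    weight) pairs most engines expect. Assumed interpretation: the integer
    part selects a joint, and the fractional part blends toward the next
    joint in the chain."""
    joint = int(math.floor(value))
    blend = value - joint
    return [(joint, 1.0 - blend), (joint + 1, blend)]

# e.g. 3.25 -> 75% on joint 3, 25% on joint 4
assert expand_sl_weight(3.25) == [(3, 0.75), (4, 0.25)]
```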
There's twenty years of weird and wonderful design decisions to figure out, and to reiterate, lots of content uses these oddities, so unless we break things in the exact same way, content doesn't look right.
You should have seen some of the early deformers. I feel sorry for Grumpity, who seems to find all the avatar bugs -- her avatar is often the first to break and the last to be fixed, but it has been tremendously helpful.
WJA: What's your advice to SL content creators who want to make items for SL mobile users?
Adam F: That's an interesting one; half of me says don't change - we're helping build the mobile viewer to support your content, not the other way around; the other half says "efficiency would let us do more".
It's very tricky answering that, because part of what makes SL special is the same content that would make a tech artist at a game studio scream (or cry). People really do invest in extremely high detail content, and that’s something traditional real-time 3D isn’t good at.
The catch is, supporting that means we have to make tradeoffs - whether that's draw distance, frame rates, battery life, et cetera. (And sometimes those tradeoffs make me cringe a little - we need to spend a bit more time on transparency, for example.)
It's obviously hard to be an impartial observer when you're working on The Official Mobile Viewer; people will naturally adapt to what we do, and decisions we make will have consequences. But I don't feel comfortable dictating to the tens of thousands of creators how they design their content - and especially not at this early stage. I'd much rather SL creators do what they like, and we figure out how we can make that work as best we can.
That said, let me provide some tips for those who do want to follow them: microtriangles are difficult for us to handle and result in lots of overdraw; atlasing textures also helps keep draw counts down - but if you have a choice, focus on keeping the total texture count low (and re-using textures) rather than on atlasing.
Migrate to BOM and use alphas; the older solutions don’t play as nicely (in theory) on mobile as the modern ones do. The other one is a big ask, but it would make our lives a lot easier - please include real LODs, or let SL calculate them, when you upload meshes. Each LOD level should be ~50% of the polycount of the next; if you just stuff the same mesh into each LOD level, we can’t use them and have to start doing our own thing. That said, if even one of the LOD meshes is appropriate, we’ll use it - so if you must, inject just a “real low detail version” into the lowest LOD level. We’d much prefer to render a real avatar than an outline, but that comes down to memory and rendering budgets, and giving us options lets us make better choices.
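For creators who want to sanity-check their uploads against that advice, here's a small illustrative script that flags LOD levels which aren't meaningfully reduced. The ~50% target comes from Adam's tip above; the tolerance threshold is arbitrary and the script is only an illustration, not an official SL tool.

```python
def check_lod_chain(triangle_counts):
    """Flag LOD levels that aren't meaningfully reduced. triangle_counts
    is ordered from the highest-detail LOD down to the lowest. Levels
    should roughly halve each step; 0.75 is an arbitrary tolerance."""
    problems = []
    for i in range(1, len(triangle_counts)):
        ratio = triangle_counts[i] / triangle_counts[i - 1]
        if ratio > 0.75:
            problems.append(f"LOD {i} still has {ratio:.0%} of LOD {i - 1}'s triangles")
    return problems

# A healthy chain roughly halves each step:
assert check_lod_chain([12000, 6200, 3100, 1500]) == []
# Stuffing the same mesh into every slot gets flagged at every step:
assert len(check_lod_chain([12000, 12000, 12000, 12000])) == 3
```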
WJA: What should users expect with mobile updates, and how can they help improve it?
Adam F: So the mobile viewer is still experimental - I think everyone knows that; we're not done yet, not by a long shot. But by tackling the hardest parts first, you'll see us adding more and more functionality as time goes on - obvious stuff like HUDs and scripted interaction are high priority, and we know they're missing right now.
We also know about the commonly reported issues - avatars not loading or looking weird. Those are some of our highest priority items, and the next few updates aim to help with that; we've already pushed a few updates, but some avatars are stubborn.
The best way to help is to continue sending in feedback and bug reports. We've been reading every single comment, criticism, bug report and bit of feedback out there (the feedback site, forums, Reddit -- you name it) -- we know there's still a lot to do, but it helps us guide our priorities.
The other thing that really helps us is sending bug reports using the in-app bug reporter. It doesn't seem like much, but it gives us a lot of technical information besides your comment -- and we can use that to reproduce faults more quickly and accurately; we invested a lot of time into that tooling and it really does help out.
Questions for Adam? Please post in comments below! And if you enjoyed reading this article, please consider joining my Patreon or subscribing for premium benefits.
Sounds like quite a struggle for the Linden Lab and Sine Wave teams. Makes it all the more impressive that Alina Lyvette created the Android app Lumiya years ago all by herself.
Posted by: Julia Benmergui | Thursday, August 15, 2024 at 04:48 AM
Agree, Julia. That Lumiya story would be the most amazing read, if ever written.
Posted by: Lumiya-fan | Thursday, August 15, 2024 at 05:47 AM
I've been trying to get back in contact with Alina for years! If anyone knows her, please let her know!
Posted by: Wagner James Au | Thursday, August 15, 2024 at 09:21 AM
I still use Lumiya regularly. While graphics are a bit limited distance-wise, they are amazingly clear. I also love that you can use HUDs (which I think at the moment don't work on the LL mobile app). Mesh bodies don't display properly, but they still display. What I don't understand is why LL didn't buy the Lumiya creator out and build on to what she created. Could this have saved them thousands of hours of work and dollars? I'm sure there is a story behind this that I would love to hear. I hope it was not due to hubris on LL's part...
Posted by: Kaylee West | Saturday, August 17, 2024 at 11:26 PM
Do you think Second Life on mobile can (or will) change the first experience for new users?
Second Life mobile needs a simple tutorial like IMVU or inZoi (inZoi character studio doesn't even need a tutorial). And I think it needs to be easy to change avatars, take screenshots and videos, and share them on social networking sites.
Posted by: Sanny | Saturday, August 24, 2024 at 07:24 PM