This is a pretty cool Vision Pro demo built with Unity PolySpatial, Unity's suite of technologies for Apple's headset. "It's basically play-in-reality," as creator Michael Becker enthuses.
It's also a glimpse into the state of virtual world/game development on Apple Vision Pro. While Unity seems to be the most powerful engine for Vision Pro, Unreal and the open-source Godot engine show promise -- though neither of those two currently offers mixed reality capabilities on the device.
I was recently chatting with a colleague, a developer at a top game publisher who's experimenting with these engines for projects on the Apple Vision Pro; they have some guarded enthusiasm for Unreal on AVP down the road.
The basic TL;DR: Unreal has quite a lot of potential, but Unity PolySpatial is by far the current preference for most Vision Pro applications. For the full download, strap in; this gets pretty technical:
“Unreal has the potential to be a seriously cool development pathway,” as they put it. “It’s not quite ready for prime time at the moment, due to a fickle, undocumented, ever-changing engine setup process and a lack of clarity around available features.
"For example, while Passthrough can currently be hacked, it is in all likelihood not going to be officially available if you’re working in Unreal due to the fact the plugin is an OpenXR wrapper around the Metal API, which otherwise doesn’t allow for passthrough.
"I would expect other ARKit-specific features like Foveated Rendering and plane/mesh detection to be unavailable for the same reason. That said, hand tracking does seem to be fully mapped to OpenXR and accessible in Unreal, which is great. Overall there is currently a lot of ambiguity around what is and isn’t available, but that will hopefully change with better documentation and resources down the line.
“In comparison, Unity is still the current state of the art for engine-based AVP dev - it’s Apple’s only officially endorsed game engine for visionOS development, and you can easily tell. Unlike the Unreal OpenXR plugin, the Unity visionOS package, as well as PolySpatial, provides a way to leverage virtually all of the device’s features, in immersive mode (VR) as well as full and shared space (AR). In addition, tools like play-to-device and in-editor XR room simulation make iteration much less of a pain; they mean you don’t have to compile and deploy a build to the device every time you want to test a minute change. Both engines have their strengths, as is the case outside of XR.
"The Unreal path won’t be for everything / everyone - namely the lack of certain mixed reality features will be noticeable for quite a few advanced use cases. But if you need an immersive app with all of Unreal's graphics prowess, it's incredible to wield Unreal when building for what’s easily the strongest portable XR wearable of this hardware generation.
"At the moment, though, the development experience in Unity for Vision Pro is better, they estimate, by "a significant margin. User friendly, capable, and most importantly, robust. Unity/Polyspatial is production-ready on the Apple Vision Pro; Unreal isn't yet (but we're all very excited to see it mature)."
More about PolySpatial in the video above. Given that Apple was working early on with developers of the Unity-based Rec Room, I'm not surprised that engine has the lead (so far).
Video demo credit: Michael Becker