A @6d_ai experiment, shows 6D visual scans are *also* slam maps where coords = real-world coords. Phone is tracked & rendered in the scan! Content placed in a scan aligns with the real-world. Critical for authoring of AR content when you can’t be at the physical site & mixed ARVR pic.twitter.com/5npt7MBQyW
— Matt Miesnieks (@mattmiesnieks) September 24, 2019
Remember this demo from 6D.AI, a startup that's creating a 3D mirror world of real life from the input of standard smartphones? Here's the latest demo from CEO Matt Miesnieks, showing that the virtual world (on the laptop) has a near-exact overlay with its real-world counterpart, and is updated/mirrored in real time. So much so, the demo suggests, that you could walk through the real-life location simply by navigating your position in the virtual version.
"It’s accurate to about an inch, plus or minus," Miesnieks tells me, which he attributes to "[an] error in ARKit tracking." (I.E. from the attached iPhone he's using.)
Does this mean that this technology could turn a whole real-life environment into a one-to-one 3D game/virtual world space?
"Yes, that’s exactly right," Matt Miesnieks tells me. "No special hardware, just scan it all in from any newish iPhone or Android phone. Coordinates are persistent so anything left in place will remain in place (and be visible to anyone in the space)."
Sorry, no dev kit yet, but as someone at 6D tweets to me, "We think we’ve figured out the approach we want to take with regard to productization. Now some work still to do."