Watch this new tech demo from DeepMotion, the AI company that taught NPCs how to argue and how to dribble basketballs: an AI "stuntman" agent that can run through a physics-enabled breakable wall -- and adjusts its momentum based on the wall's strength.
"[W]hat's changing each iteration is the strength of the wall," DeepMotion's Anna Ploegh tells me, "by way of adjusting the mass of each brick. So the running movement is responding each time to a difference in the wall obstacle. Over each iteration, that mass of the bricks increases until it is too difficult for the character to jump through."
This is why each crash in the demo is different, as is the way the AI agent falls. That contrasts with the technique often used in videogames to simulate a similar effect, where an NPC is abruptly switched into a physics-driven ragdoll (i.e., ragdoll physics). Here's why:
"The entire motion sequence is simulated," as Anna puts it, "so the character is an active ragdoll that is performing learned skills, rather than just transitioning to ragdoll for the falling. In the first iteration you can see how, if physically feasible, the character can continue performing the running motion. The character can perform gradations of various motions and interactions with its environment with fully simulated, procedural animations!"
So while we're still a long way off from convincingly simulating AIs that truly look and act like human beings, at least we're making good progress on getting them to die like us.
Look and act like a human being - who continually runs into a wall that may or may not be breakable, for no apparent reason. Can appreciate the simulation for its actual tech as it's rather neat, but can't shake the feeling there's a metaphor lurking.
Might just be the urge to teach it to go around - or nick a ladder (hey, there is always a ladder or pallet or....)
Posted by: sirhc desantis | Thursday, May 16, 2019 at 04:01 PM