Hot steppers

Watch this M.C. Escher-like simulation teach 4,000 robots how to walk

Researchers from ETH Zurich collaborated with Nvidia to train an algorithm for real-world application.

Anybotics

Thousands of robots marching in identical patterns across an Escher-style landscape may sound like nightmare fuel, but we promise, it’s all for science. The video you see here is part of a simulation called “ANYmals” that researchers at ETH Zurich and GPU maker Nvidia devised for training individual robot dogs.

Robotic Systems Lab: Legged Robotics at ETH Zürich

The simulation, described in a recent paper, is part of what researchers are calling a “novel game-inspired curriculum” designed to teach dog-like robots to move across different kinds of terrain so that they can overcome the same obstacles in real life.



In the simulation, the robots tackle slopes, steps, and steep drops.

Whenever a challenge was completed, a harder one was introduced, prompting the algorithm to become more sophisticated.
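That escalating structure is the essence of curriculum learning. The sketch below is purely illustrative (the function names, levels, and pass threshold are invented for this example, not taken from the paper): training repeats on one terrain level until the policy clears a success threshold, then moves to a harder one.

```python
# Hypothetical sketch of a game-inspired curriculum: a harder terrain
# level is introduced only once the current one is mastered.

def run_curriculum(train_on, levels, pass_score=0.8):
    """Advance through terrain levels, repeating each until the
    policy's success rate clears pass_score."""
    history = []
    for level in levels:
        score = 0.0
        while score < pass_score:
            # One training round on this terrain; returns a success rate.
            score = train_on(level)
        history.append((level, score))
    return history

def make_trainer():
    """Toy stand-in for a learner whose skill improves each round."""
    skill = {"value": 0.0}
    def train_on(level):
        skill["value"] = min(1.0, skill["value"] + 0.3)
        return skill["value"]
    return train_on

print(run_curriculum(make_trainer(), ["slopes", "steps", "steep drops"]))
```

The design choice mirrors the article's description: difficulty is gated on demonstrated competence rather than a fixed schedule, so the learner is never thrown at terrain far beyond its current ability.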

“AI has shown promise for training robots to do real-world tasks that cannot easily be written into software, or that require some sort of adaptation. The ability to grasp awkward, slippery, or unfamiliar objects, for instance, is not something that can be written into lines of code.”

Will Knight, Wired

Nvidia

The virtual robots were trained through reinforcement learning — an AI method based on constant positive/negative feedback. Whenever a robot moved its legs, the algorithm evaluated how this movement affected its ability to walk and adjusted from there. Specialized Nvidia chips allowed researchers to complete the project in “less than one-hundredth the time that’s normally required.”

Wired
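The feedback loop described above can be boiled down to a tiny example. This is not the paper's algorithm — it's a minimal value-estimation sketch with invented action names and reward values — but it shows the same mechanism: try a movement, score its effect on walking, and adjust the estimate.

```python
import random

# Minimal, illustrative reinforcement-learning loop: each step the
# policy picks a leg movement, receives a noisy reward reflecting how
# well that movement helps it walk, and nudges its value estimate.

def train(steps=2000, lr=0.1, eps=0.2, seed=0):
    rng = random.Random(seed)
    actions = ["short_step", "long_step"]
    # Hidden "true" usefulness of each movement (made up for the demo).
    true_reward = {"short_step": 0.2, "long_step": 1.0}
    value = {a: 0.0 for a in actions}  # learned estimates
    for _ in range(steps):
        # Epsilon-greedy: usually exploit the best-looking action,
        # occasionally explore a random one.
        if rng.random() < eps:
            a = rng.choice(actions)
        else:
            a = max(actions, key=value.get)
        r = true_reward[a] + rng.gauss(0, 0.1)  # noisy feedback
        value[a] += lr * (r - value[a])         # adjust from feedback
    return value

v = train()
print(max(v, key=v.get))  # the higher-valued movement wins out
```

Running thousands of such loops in parallel on GPUs is what made the hundredfold speedup possible: every simulated robot is another copy of this try-score-adjust cycle.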


The combination of simulation, AI, and specialized chips could broaden the scope of robotics by expanding what robots can be trained to do. Training could extend to more nuanced and specialized tasks like harvesting crops or sewing clothes. For a more in-depth look at the simulation and its implications for the future, head over here.
