
1X robotics company showcases its androids driven by neural networks


1X showcased the capabilities of its robots, sharing details on the various behaviors its androids have learned purely from data.

The actions in the video above are all controlled by a single vision-based neural network that emits commands at 10Hz. The network consumes camera images and issues commands to control the driving, arms, gripper, torso, and head. That is all: there are no video speedups, no teleoperation, and no computer graphics.

Everything is controlled by the neural network, with full autonomy, running at 1x speed.
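For readers who want a concrete picture of what a 10Hz vision-to-command loop looks like, here is a minimal, hypothetical sketch in Python. The VisionPolicy model, the action layout, and the get_camera_image / send_commands hooks are illustrative assumptions, not 1X's actual architecture or API.

```python
import time

import numpy as np
import torch

CONTROL_HZ = 10                   # commands are emitted at 10Hz
PERIOD = 1.0 / CONTROL_HZ

# Hypothetical action layout mirroring the description above:
# base driving, two arms, gripper, torso, and head, packed into one vector.
ACTION_DIM = 2 + 14 + 2 + 1 + 2   # illustrative sizes only


class VisionPolicy(torch.nn.Module):
    """Toy stand-in for a vision-based policy: camera image in, action vector out."""

    def __init__(self) -> None:
        super().__init__()
        self.backbone = torch.nn.Sequential(
            torch.nn.Conv2d(3, 16, kernel_size=8, stride=4),
            torch.nn.ReLU(),
            torch.nn.AdaptiveAvgPool2d(1),
            torch.nn.Flatten(),
        )
        self.head = torch.nn.Linear(16, ACTION_DIM)

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(image))


def control_loop(policy, get_camera_image, send_commands, steps=100):
    """Run the perception-to-action loop at a fixed 10Hz rate."""
    policy.eval()
    for _ in range(steps):
        start = time.monotonic()
        frame = get_camera_image()                              # HxWx3 uint8 image
        obs = torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            action = policy(obs).squeeze(0).numpy()
        send_commands(action)                                   # drive, arms, gripper, torso, head
        time.sleep(max(0.0, PERIOD - (time.monotonic() - start)))


if __name__ == "__main__":
    # Stub hardware hooks so the sketch runs standalone.
    camera = lambda: np.random.randint(0, 255, size=(120, 160, 3), dtype=np.uint8)
    actuate = lambda a: print("action:", np.round(a, 2))
    control_loop(VisionPolicy(), camera, actuate, steps=5)
```

The design point mirrored here is the one 1X emphasizes: a single network closes the loop from pixels to every actuator, with no scripted per-joint trajectories in between.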

Next generation

To generate the behaviors, thirty EVE robots were used to assemble a high-quality, diverse dataset of demonstrations. The data was then used to train a “base model” that covers a broad range of physical behaviors, from everyday tasks such as tidying homes and picking up objects to interacting with other people (or robots).

The base model is then fine-tuned for more specific capabilities (for example, opening doors) and refined further for a particular task (opening this type of door), a strategy that allows additional, related skills to be added within minutes of data collection and training on a desktop GPU.
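As a rough illustration of that recipe, the sketch below reuses the toy VisionPolicy from the earlier snippet and adapts it to a new skill by behavioral cloning on a small demonstration set, freezing the shared backbone so only the task head is trained. The function name, dataset shapes, and training settings are assumptions for illustration; 1X has not published its training code.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


def finetune_skill(base_policy, demos, epochs=20, lr=1e-3):
    """Behavioral-cloning fine-tune: freeze the backbone, adapt only the action head."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    policy = base_policy.to(device)

    # Keep the general visuomotor features learned from the large base dataset.
    for param in policy.backbone.parameters():
        param.requires_grad = False

    optimizer = torch.optim.Adam(policy.head.parameters(), lr=lr)
    loader = DataLoader(demos, batch_size=32, shuffle=True)

    for _ in range(epochs):
        for images, actions in loader:
            pred = policy(images.to(device))
            loss = torch.nn.functional.mse_loss(pred, actions.to(device))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return policy


# Example: a few minutes' worth of (image, action) demonstrations for one new skill,
# faked here with random tensors matching the toy policy's input and output shapes.
demos = TensorDataset(torch.rand(200, 3, 120, 160), torch.rand(200, 21))
skill_policy = finetune_skill(VisionPolicy(), demos)
```

Freezing the shared backbone is one common way to make a fine-tune this cheap: only a small head is updated, which is why a handful of demonstrations and a desktop GPU can be enough for a new, related skill.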

1X celebrated the achievements of its android operators, who represent the next generation of “Software 2.0 Engineers”: people who advance robot capabilities through data rather than by writing code.

The company believes its ability to teach robots new skills is no longer limited by the number or availability of AI engineers, giving it the flexibility to meet customer demand.

Image: 1X
