We first began developing robotic simulators in 2009 with a Java-based simulator as part of the “Myoelectric Training Tool”. In 2014, we also built a basic simulator of the physical Bento Arm that was developed to work with our Robot Operating System (ROS) based software. These early simulators only allowed for movement of the arm and did not include interactive environments or objects, so their applications were limited. More recently, with the release of better physics engines, we decided to try to build a fully interactive virtual training environment that could complement the physical Bento Arm as a platform for myoelectric training and research.

The latest iteration of our software is called the Virtual Bento Arm and was developed in Unity (Version 2020.1.0b5) using Visual Studio Express 2015 as the editor for C# scripts. The software includes a realistic simulation of the Bento Arm that can be viewed on an LCD monitor or laptop display, adjustable camera views, and six tasks, five of which include objects that the arm can interact with as part of a virtual environment. The software includes a standalone mode that allows the virtual arm to be controlled with a keyboard, as well as an externally driven mode in which the virtual arm can be controlled via external software (e.g., through your own custom C#/Python/MATLAB script). The Unity project files for the Virtual Bento Arm are now available open source on GitHub, and you can also try out the externally driven mode by downloading the latest release of brachI/Oplexus.
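To illustrate what driving the virtual arm from your own script might look like, here is a minimal Python sketch. Note that the message format, transport (JSON over UDP), port number, and joint names below are all hypothetical assumptions for illustration only; the actual interface for the externally driven mode is defined in the project files on GitHub and in brachI/Oplexus.

```python
import json
import socket

# Hypothetical joint names for the Bento Arm's degrees of freedom;
# the real interface may use different identifiers and units.
def make_joint_command(shoulder, elbow, wrist_rot, wrist_flex, hand):
    """Pack target joint positions (assumed degrees) into a JSON string."""
    return json.dumps({
        "shoulder": shoulder,
        "elbow": elbow,
        "wrist_rotation": wrist_rot,
        "wrist_flexion": wrist_flex,
        "hand_open_close": hand,
    })

def send_command(message, host="127.0.0.1", port=9000):
    """Send a command to the (assumed) UDP port the simulator listens on."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (host, port))

if __name__ == "__main__":
    # Move the virtual arm to an example pose.
    send_command(make_joint_command(10.0, 45.0, 0.0, 0.0, 50.0))
```

The same pattern would apply from C# or MATLAB: build a command message in whatever format the release documentation specifies, then send it over the exposed channel each control cycle.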

Click HERE to try out a demo version of the Virtual Bento Arm in your web browser.

NOTE 1: The condensed title ‘Virtual Bento’ is used in the logo and title text where space is limited, but in long-form text we also sometimes use the terms ‘Virtual Bento Arm’ or ‘Virtual Bento Arm Environment’ to refer to the same software.


We would like to thank the University of Alberta (UofA), Alberta Machine Intelligence Institute (Amii), Bionic Limbs for Improved Natural Control (BLINC) Lab, Sensory Motor Adaptive Rehabilitation Technology (SMART) Network, Alberta Innovates, Canada Research Chairs (CRC) program, Canada Foundation for Innovation (CFI), Canada CIFAR AI Chairs program, and the Natural Sciences and Engineering Research Council of Canada (NSERC) for their continued support of this project.