Development on the HANDi Hand began in the summer of 2014, when the need for an inexpensive, sensorized, multi-articulated hand became apparent in the BLINC Lab. Originally conceived as a hand to pair with the Bento Arm, the HANDi Hand has since evolved into a full-blown project of its own. The original inspiration for the finger mechanism came from the InMoov hand, but the design has substantially deviated from and improved upon it over the course of two functional prototypes. This summer we will build our third prototype, in which we will evaluate an alternate finger drive mechanism.

We are still working hard to improve the HANDi Hand. If our next design revision resolves some lingering issues, we are tentatively planning a hardware open-source release for Fall 2017, followed shortly after by a software release in Winter 2018.

The HANDi Hand comprises six Hitec HS-35HD radio-controlled servomotors and custom 3D-printed parts. The 3D-printed parts are designed to print in PLA on commonly available RepRap 3D printers. Potentiometers integrated into the knuckles of the fingers provide position and velocity sensing, and force-sensitive resistors embedded in the fingertips sense contact and grip pressure. A USB camera in the palm provides machine learning controllers with context-sensitive information about the objects being grasped.


The HANDi Hand can be controlled by any microcontroller board with PWM outputs in the 0–5 V range, such as an Arduino development board. For our initial testing we have been using an Arduino Mega and custom software that includes mapping functionality, allowing control interfaces such as joysticks or muscle signals to be easily mapped to the finger movements of the hand. In the future we would like to develop a fully integrated palm controller.

For more information about the software and arms that are compatible with the HANDi Hand, please see our BLINCdev development guide.


We would like to thank the Alberta Machine Intelligence Institute (Amii) and the University of Alberta for their continued support in this project.