Saturday, July 31, 2021

Flowstone Workshop 10 Part 3

How To Really Program A Humanoid Robot – Part 3


Welcome to the FlowStone Workshop number 10, part three, where we provide a beginner’s guide to computer programming using the FlowStone graphical programming language. In the last two issues (parts one and two) we looked at what makes up a humanoid, as well as the different ways to program them. We built a Hovis Eco humanoid from Dongbu Robot (courtesy of RobotShop) and tested the stock software. In part three we will program the humanoid using FlowStone, add some IK, a MIDI keyboard and some additional sensors like gyros and pressure sensors.


Before we start, I’m proud to announce a new software platform that we will be using from now on. It’s made exclusively for the robotics market by DSPRobotics in collaboration with RobotShop. This new platform is called Flowbotics Studio V2. Some of you may be familiar with the name as we used it a while back for our robot arm example program for PLTW (Project Lead the Way). The new Flowbotics Studio is a totally different animal, however. Flowbotics Studio V2 has an application center on top where you can browse various professionally made robotics apps. The initial design is for the LynxMotion range of robots: Quadrupeds, Hexapods, Bipeds and Two Wheeled Rovers, as well as for general robot sequencing. It will encompass many of the other popular hardware lines as time goes by.

There is one special feature, though: FlowStone is inside, and all of the apps have been written in FlowStone! Therefore you can, with a click of the mouse, drop into the source code at any time and modify the projects to make your own creations, add additional sensors, etc. This means that for just about any robotics hardware you can buy, you will have some pre-made source code that you can use as a basis for your own programs. So, all of our examples from now on will use FlowStone from inside Flowbotics Studio. Flowbotics Studio will be available exclusively from RobotShop by the time you receive this magazine, at a cost of around $39.00 USD!

The Flowbotics 32-channel servo sequencer.


In the last issue of Robot we built a Hovis Eco robot and tested the stock software; now it’s time to see if we can take control of our robot’s servos from inside FlowStone. The Hovis Eco robot has a cool, undocumented feature that allows you to send it raw servo data via its serial input and gain full control of the robot’s servos when you put it in remote control mode. This works very reliably; we have had it connected for literally several hours without a single issue, which is more than can be said of some of the other platforms we tested.

Since the Hovis uses Herkulex servos from Dongbu Robot, we first needed to understand the servo protocol. These servos are made specifically for robotics and are very similar to the Dynamixel servos we have used before. The cool thing about these servos is that they are intelligent and know the torque and temperature of each servo. This means that if you make a mistake and overload a servo, it automatically switches off, preventing the servo from burning out! The downside is that they are more complicated to communicate with, but since we’ve done the hard part you needn’t worry about this. In order to control all of the servos together in sync, we used a feature that allows you to send a long hexadecimal string of servo data to control all 20 servos in this robot. The protocol follows this structure:


• Header (0xFF, 0xFF)

• Packet Size

• Servo ID

• Command

• Check Sum1

• Check Sum2

• Data

We used some Ruby code within FlowStone to generate the correct data sequences to program a Herkulex servo engine.
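The servo engine itself lives inside FlowStone, but the packet arithmetic behind it can be sketched in a few lines of plain Ruby. The sketch below is our own illustration based on the published Dongbu Herkulex datasheet, not the article’s actual module; the S_JOG command value (0x06), broadcast ID (0xFE) and per-servo data layout are our reading of that datasheet.

```ruby
# Minimal sketch of Herkulex packet generation (our reading of the Dongbu
# Herkulex datasheet, not the article's actual FlowStone servo engine).
# Packet layout: header, size, ID, command, checksum1, checksum2, data.
def herkulex_packet(id, cmd, data)
  size = 7 + data.length                            # total packet length
  cs1  = ([size, id, cmd] + data).reduce(0, :^) & 0xFE
  cs2  = ~cs1 & 0xFE
  [0xFF, 0xFF, size, id, cmd, cs1, cs2] + data
end

# S_JOG (assumed command 0x06) moves every servo in one packet: a shared
# playtime byte, then position LSB/MSB, SET flags and ID for each servo.
def s_jog_all(positions, playtime)
  data = [playtime]
  positions.each_with_index do |pos, id|
    data += [pos & 0xFF, (pos >> 8) & 0xFF, 0x00, id]
  end
  herkulex_packet(0xFE, 0x06, data)                 # 0xFE = broadcast ID
end
```

For example, `s_jog_all(Array.new(20, 512), 60)` would build one long string of bytes moving all 20 servos to centre, which is the kind of “all servos in sync” message described above.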

Now that we had control over the servos, we needed a user interface to adjust them, and here’s the really cool part: Flowbotics Studio comes with a generic 32-channel servo sequencer, so all we needed to do was replace the LynxMotion SSC-32 servo controller engine with our new Herkulex servo engine, and we now have a full pattern sequencer for our robot! If time permitted we could even read the servo positions back from the robot, so that we could position it with the servo torque off and make creating sequences even quicker (this is what the manufacturer does in its own stock software). But for now we have to manually position each joint using the sliders on screen. Doing this in Flowbotics Studio took only about 20 minutes, demonstrating the power of this new platform.


The next job was to add some control from our 88-note MIDI keyboard. Using a MIDI keyboard is a great way to get some dynamic control. All too often, remote robotics controllers use digital switches that don’t allow you much sensitivity in controlling the robot’s movement. What we wanted to do was move our robot by literally playing the piano notes: if you pressed hard to make a loud note, the robot would move quickly; if you played softly, the robot would move slowly.

To do this we added a MIDI input module in our FlowStone code and decoded the notes played into discrete movements while using the velocity sensitivity (loudness) to control the speed. Since FlowStone fully understands MIDI, there are many MIDI modules you can use to do this. We used the MIDI Split module to decode the note and velocity data for us.
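As a standalone illustration of what the MIDI Split module does for us (this Ruby stand-in is ours, not the module’s implementation), a note-on message is three bytes whose status nibble, note number and velocity come straight from the MIDI standard:

```ruby
# Illustrative stand-in for the MIDI Split module: pull the note number
# and velocity out of a raw note-on message (status 0x9n per the MIDI
# spec; velocity 0 conventionally means note-off).
def decode_note_on(bytes)
  status, note, velocity = bytes
  return nil unless (status & 0xF0) == 0x90 && velocity > 0
  { note: note, velocity: velocity,
    speed: velocity / 127.0 }   # loudness mapped to a 0..1 speed factor
end
```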

There were three ways we used the MIDI data to control our robot:

The MIDI keyboard used to control the robot!

• To move to a set position with a variable speed based on the note velocity.

• To move to a position proportional to the note velocity.

• To play a sequence pattern (e.g., walking) at a speed proportional to the note velocity.

For number one and number two, we just added values to the servo positions and changed the global servo speed to match the velocity of the note played. This worked incredibly well and gave us a real feeling of being in control of our robot’s emotions. Having worked with robotics for a few years now, this was a real first and became quite entertaining, especially when the robot was interacting with other people while we were controlling it. I guess this is similar to Hollywood animatronics on a budget!
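As a rough sketch of those first two mappings (our own helper, assuming a 512-centre servo range; not code from the article):

```ruby
# Sketch of MIDI uses 1 and 2: the note velocity (0..127) always sets the
# global servo speed; with proportional mode the target position itself
# also scales with velocity. NEUTRAL = 512 is an assumed centre position.
NEUTRAL = 512

def apply_note(note_target, velocity, proportional: false)
  speed  = velocity / 127.0          # global speed factor from loudness
  target = if proportional
             NEUTRAL + ((note_target - NEUTRAL) * speed).round
           else
             note_target             # fixed position, variable speed
           end
  { target: target, speed: speed }
end
```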

For number three, playing sequences, we used a special feature of Flowbotics Studio: the user programming area in the FlowStone code. This is nothing more than a module with a pre-made scripting area based on Ruby, with some predefined keywords to control the Flowbotics pattern sequencer.

Decoding the MIDI data in FlowStone.

The predefined keywords are:

goto{pattern name}

speed{0.25, 0.5, 1, 2, 4 or 8}

Here’s how this is written in the actual Ruby code in FlowStone:

Ruby Script for the Herkulex Servo Protocol.

Here you can see that a trigger starts the ‘ToWalk’ pattern, so the robot gets into a position that is ready to walk. Once that pattern is finished, the ‘Walk’ pattern is started and looped indefinitely until a stop command is triggered. Then the ‘FromWalk’ pattern is played to bring the robot back to a home position. The speed of playback is also controlled by adding the ‘@Speed’ input variable to the playback-speed keyword. You can create very complicated routines with this simple scripting.
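Since the script itself only appears as a figure, here is a hedged reconstruction of that routine. The goto and speed keywords are the real ones; the stand-in definitions at the top are ours, just recording calls so the control flow can run outside Flowbotics Studio (in the real sequencer they start and pace the patterns).

```ruby
# Hedged reconstruction of the walk routine described above. These two
# stand-ins replace the Flowbotics sequencer keywords for illustration.
$calls = []
def goto(pattern)
  $calls << "goto:#{pattern}"
end

def speed(factor)
  $calls << "speed:#{factor}"   # 0.25, 0.5, 1, 2, 4 or 8
end

def start_walking(speed_factor)
  speed(speed_factor)           # playback speed, e.g. from note velocity
  goto("ToWalk")                # get into a ready-to-walk stance
  goto("Walk")                  # looped until a stop is triggered
end

def stop_walking
  goto("FromWalk")              # bring the robot back to home position
end
```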

Using the Biped app to add IK to the legs.


Simple Ruby scripting to control sequencer patterns.

The next project was to add some inverse kinematics to the legs of the robot and see if we could get it to balance by using some gyros and pressure sensors. For this we used another Flowbotics project and modified it with our Herkulex servo engine. This time we used the Biped program that comes with Flowbotics; this already has the inverse kinematics built in, so it was just a case of wiring the servos in the correct order to get the legs moving correctly.

In order to get the robot to balance, we added a Phidgets three-axis USB gyro and some piezo pressure sensors connected to a Phidgets 8/8/8, and attached the sensors to the feet of the robot. The gyro would tell FlowStone about the movement but not about the robot’s balance state, so the pressure sensors gave that feedback. The aim was to move the ground like a rocking boat and keep the robot upright. This proved to be relatively simple, as the IK took care of controlling the legs and all we had to do was add a correction in the right direction, proportional to the detected movement.
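The proportional correction described above can be sketched in a few lines. This is purely illustrative: the gain values, sensor units and clamping range below are made-up numbers, not values from our project.

```ruby
# Hypothetical sketch of the proportional balance correction: the gyro
# rate and the left/right foot-pressure difference drive a joint offset
# fed into the IK target. Gains and limits are made-up values.
K_GYRO  = 0.8   # assumed gain on the gyro rate
K_PRESS = 0.3   # assumed gain on the pressure imbalance

def balance_offset(gyro_rate, press_left, press_right)
  correction  = -K_GYRO * gyro_rate                   # lean against motion
  correction += -K_PRESS * (press_left - press_right) # refine from the feet
  correction.clamp(-15.0, 15.0)   # limit to a safe joint offset (degrees)
end
```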

Piezo pressure sensors.

One thing that we discovered was a neater way of detecting the foot pressure without the need for pressure sensors. We did this by using the servos’ torque feedback to calculate the robot’s current balance. For example, if the robot rocked onto its toes, the torque would increase in the foot of the robot. It worked the same with side-to-side movement, when the torque would increase in the ankle. Maybe this is an idea that the robot manufacturers could take on board, in combination with a gyro, to give their robots better balance.


We have shown a few different ways to get dynamic control over humanoid robots. The key seems to be bypassing the on-board robot controller and using a more powerful PC or laptop to send telemetry data to the robot for control. This way you can add whatever sensors you like. You could even add things like speech synthesis, webcam facial recognition and gestures found only on the most expensive robots, and you can do this for little or even no cost. There is really no limit to what you can do. I can certainly recommend the Hovis Eco Robot as a great research platform because it is very open and, more importantly, it is reliable. Moreover, there is something definitely more endearing about a robot with a body shell and a head with eyes. As always, any source code will be available to download from the DSPRobotics site and will be in Flowbotics Studio format from now on.