
Flowstone Workshop 7


Robot arm and Tic-Tac-Toe board.

Welcome to FlowStone workshop number 7, where we give a beginner's guide to computer programming using the free FlowStone graphical programming language. In this issue we are going to look at programming a robot arm to play Tic-Tac-Toe!

This project was inspired by the Miami-Dade University Robotics Club when they asked for our help with the project. The basic idea was to have a robot arm play Tic-Tac-Toe against a human opponent. We split this project into three main programming tasks: 1) moving the robot arm, 2) robot vision, and 3) the Tic-Tac-Toe algorithm.


• CrustCrawler Smart Robot Arm

• AX-12A dual gripper

• Homemade Tic-Tac-Toe board and pieces

• PC Webcam

• Red and blue colored paper

Obviously this is quite a big project, so for this article we are going to focus on the first two tasks and leave the actual Tic-Tac-Toe algorithm for another issue. There are lots of examples of these algorithms on the internet if you are interested; the fun part is moving the arm and seeing the board with robot vision.


There are lots of advantages to programming Tic-Tac-Toe from a robotics point of view that make this project a little easier. First, the board has only nine squares, making programming simpler than for a chess board, for example. Second, there are only two types of playing pieces, an ‘O’ and an ‘X’, so we can use color to differentiate them. Finally, the robot will only play with one shape of piece, so they can all be picked up by the same gripper. In our case we will use Blue for the robot and Red for the human opponent. Also, for simplicity we will place the webcam directly above the board. This could be static or mounted on the arm; for fun we’ll put it on the arm and move the arm above the board in between each move to watch the game, hopefully intimidating the opponent in the process.


Adding vision to our CrustCrawler Robot Arm creates a whole world of possibilities and programming challenges. Fortunately the FlowStone programming language has some pretty powerful video processing algorithms already built in. So the first thing we did was set up the webcam in the FlowStone software using the webcam module, and then drew four lines onto the video image to represent the board squares so that we could align the camera correctly with the board. We did this by using the bitmap create module, which creates a new bitmap for each frame of video that you can draw onto. We then drew the four dashed red lines onto the live video feed using the line module.
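Since FlowStone embeds Ruby, the geometry behind those four grid lines can be sketched in a few lines of standalone Ruby. This is only an illustration of the calculation, not the article's actual module wiring, and the 640x480 frame size is an assumption:

```ruby
# Sketch: endpoints of the four dashed grid lines for a 3x3 board overlay.
# Assumes a 640x480 webcam frame; in FlowStone the line module would draw these.
WIDTH  = 640
HEIGHT = 480

# Two vertical and two horizontal lines at the 1/3 and 2/3 marks of the frame.
def grid_lines(w, h)
  vertical   = [1, 2].map { |i| [[w * i / 3, 0], [w * i / 3, h]] }
  horizontal = [1, 2].map { |i| [[0, h * i / 3], [w, h * i / 3]] }
  vertical + horizontal
end

grid_lines(WIDTH, HEIGHT).each { |line| p line }
```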

Splitting up the webcam image.

We then split the raw video image up into nine separate mini images and placed them on the user interface in the correct order as shown in the Split Image photo.
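The split itself is just a matter of cutting the frame into a 3x3 grid of rectangles. Here is a minimal Ruby sketch of that calculation (frame size again assumed; FlowStone's own modules handle the actual cropping):

```ruby
# Sketch: rectangles (x, y, w, h) for splitting one frame into nine mini images,
# returned in reading order (left to right, top to bottom).
def split_into_squares(w, h)
  cw, ch = w / 3, h / 3           # width and height of each mini image
  (0...3).flat_map do |row|
    (0...3).map { |col| [col * cw, row * ch, cw, ch] }
  end
end

split_into_squares(640, 480).each { |rect| p rect }
```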

Simulation using test Bitmap.

We could then use the color detect module to check for red or blue in each mini image. To speed the development process up and get consistent colors we first simulated the video image by making a bitmap image in a graphics package with some Blue ‘O’s and Red ‘X’s, then tweaked the color detect thresholds until it was detecting the shades of color correctly.
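The threshold tweaking boils down to comparing a color sample from each mini image against red and blue limits. The sketch below shows the idea in plain Ruby; the threshold values are purely illustrative, not the ones we arrived at in FlowStone:

```ruby
# Sketch: classify one mini image from an average (r, g, b) color sample.
# Threshold values are illustrative placeholders, not the tuned values.
def classify_square(r, g, b)
  return 'X' if r > 150 && g < 100 && b < 100   # predominantly red  -> human 'X'
  return 'O' if b > 150 && r < 100 && g < 100   # predominantly blue -> robot 'O'
  '-'                                           # neither color found -> empty
end
```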

Here you can see the simulated image on the left and the image processing on the right. When a color is detected FlowStone highlights the area and draws a circle in the middle to show the center. We then decoded the presence of Red or Blue into a simple text string with either ‘O’, ‘-‘, or ‘X’:

These text strings were added together to make an array that would be sent to the Tic-Tac-Toe algorithm, which will ultimately tell the robot where to move. Now that we knew it worked in principle with a simulated image, we had to make a board with colored pieces that the robot could see for real. To do this we used some square wooden dowel and stuck colored paper on the top of each piece (we made the paper by simply printing colored sheets on our photo printer).
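Assembling the nine per-square results into the board array can be sketched like this in Ruby. The detection values here are made up for illustration; in the real patch they come from the nine color detect modules:

```ruby
# Sketch: turn nine per-square color detections into the board array
# fed to the Tic-Tac-Toe algorithm. Detections here are illustrative.
detections = [:blue, :none, :red,
              :none, :blue, :none,
              :red,  :none, :none]

board = detections.map do |d|
  case d
  when :blue then 'O'   # robot piece
  when :red  then 'X'   # human piece
  else            '-'   # empty square
  end
end

# Print the board three squares per row for a quick visual check.
puts board.each_slice(3).map { |row| row.join }.join("\n")
```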

We added an extra sample button so we could take a photo of the board and use that as a test image while setting up the color detection. As you can see from the image, the color detection relies heavily on lighting, so if you don’t have good coverage of white light across the whole board the colors are not detected correctly. So we need to add some external lighting to the rig. One idea is to put an LED torch next to the webcam on the robot arm for that sci-fi look!

So now we have it seeing the board and detecting the colors correctly, the next step is to make our robot arm pick up and move the pieces.


Our robot has a maximum of five Blue pieces and nine squares on which it can place them. So we need to program a sequence of movements for each of these locations, plus an overhead view location from which it can look at the board.

Data processing showing ‘O’s & ‘X’s detected.

Next we need to make five locations where we can place the Blue pieces ready for the game. To make sure these are the same each time, you can mark a spot on the board for each of these five locations. One idea to make the programming easier is to have one ‘Home’ position where the robot moves each piece just before placing it on the board. The robot arm picks up the next piece and places it at the ‘Home’ position (five ‘get piece’ sequences), then one of nine separate sequences places that piece on the correct square (nine ‘place piece’ sequences). Otherwise we would have 5 x 9 = 45 sequences to program if each move went straight from pickup to the correct square!
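The arithmetic behind that saving, and how a full move is stitched together from two short sequences, can be sketched in Ruby. The sequence names are hypothetical labels, not the ones used in the project file:

```ruby
# Sketch: why a shared 'Home' position cuts the programming work.
GET_SEQUENCES   = 5   # one per stored piece: pickup -> Home
PLACE_SEQUENCES = 9   # one per board square: Home -> square

direct   = GET_SEQUENCES * PLACE_SEQUENCES   # 45 if each pickup went straight to a square
via_home = GET_SEQUENCES + PLACE_SEQUENCES   # 14 when every move passes through Home

# A full move is then just a 'get' sequence followed by a 'place' sequence.
# (Sequence names are hypothetical labels for illustration.)
def move(piece_index, square_index)
  ["get_#{piece_index}", "place_#{square_index}"]
end

puts "direct: #{direct}, via home: #{via_home}"
```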


The CrustCrawler servos are programmed using a derivative of the RS485 protocol where each servo has its own unique address; this has the added benefit of only three wires in the entire system, as the servos are daisy-chained.

Image processing using color detection.
Modified robot arm head with webcam and light.

The protocol for these servos is well defined and is made up of various hexadecimal words. To make this simple we have programmed it using FlowStone’s Ruby Code module, so that all you need to do is send it an array of servo positions and FlowStone does the rest!
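To give a flavor of what that Ruby module assembles, here is a sketch of one servo command packet. The framing follows the published Dynamixel Protocol 1.0 used by AX-12 series servos (0xFF 0xFF header, ID, length, instruction, parameters, ones'-complement checksum), with the goal position register address taken from the AX-12 datasheet; it is a standalone illustration, not the project's actual module:

```ruby
# Sketch: build a Dynamixel Protocol 1.0 'write goal position' packet.
WRITE_DATA    = 0x03   # write instruction
GOAL_POSITION = 0x1E   # goal position register (low byte) on the AX-12

def goal_position_packet(id, position)
  params   = [GOAL_POSITION, position & 0xFF, (position >> 8) & 0xFF]
  body     = [id, params.length + 2, WRITE_DATA] + params
  checksum = ~body.sum & 0xFF            # ones' complement of the byte sum
  [0xFF, 0xFF] + body + [checksum]       # two-byte header, body, checksum
end

# Center servo 1 (position 512 of 0..1023).
p goal_position_packet(1, 512).map { |b| format('%02X', b) }
```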

Ruby Code embedded in FlowStone.

To create the movement sequences we used some onscreen sliders to position the arm at each step of the movement sequence. These positions were then stored in an array. Once completed, simply playing back this list of positions at set time intervals moves the arm as it was programmed. We had to do this 15 times to make our complete list of locations.
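The record-and-replay idea can be sketched as a small Ruby class. The timing and the actual servo send are stand-ins for the FlowStone modules, and the class name and position values are made up for illustration:

```ruby
# Sketch: record slider positions as steps, then replay them in order.
class Sequence
  def initialize
    @steps = []
  end

  # Snapshot the current slider values (one value per servo joint).
  def record(positions)
    @steps << positions.dup
  end

  # Replay each stored step, yielding it to whatever drives the servos
  # (in the real patch, one step is sent per timer tick).
  def play
    @steps.each { |positions| yield positions }
  end

  def length
    @steps.length
  end
end

seq = Sequence.new
seq.record([512, 300, 700])   # step 1: illustrative joint positions
seq.record([512, 350, 650])   # step 2
seq.play { |positions| p positions }
```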


Programming a robot arm to play Tic-Tac-Toe is no easy feat, but using FlowStone puts it within the reach of a competent hobbyist or student. This is only the beginning: the next step would be to add the Tic-Tac-Toe algorithm so that you can complete the project and try to beat your robot arm. As always, the source code of this project is available on the DSPRobotics Examples site.