Mecanum wheels are very useful for increasing the maneuverability of a robot. These wheels come with 45º rollers that move independently and allow the robot to move forward, backward, sideways, diagonally, or spin in place. But please don't confuse them with omnidirectional wheels. Mecanum wheels are different from omnidirectional wheels, even though the result is almost the same. Depending on which wheels rotate in which direction, the robot changes its heading or spins in place.
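To make the "which wheels rotate in which direction" part concrete, here is a minimal sketch of the usual mecanum mixing formula. This is my own illustration, not something from the kit's documentation; the signs depend on the roller orientation and your axis convention.

```python
def mecanum_wheel_speeds(vx, vy, wz, wheel_radius=0.03, half_length=0.1, half_width=0.1):
    """Inverse kinematics for a 4-wheel mecanum base (one common convention).

    vx: forward speed (m/s), vy: sideways speed (m/s, positive to the left),
    wz: rotation speed (rad/s). Returns wheel angular speeds (rad/s) in the
    order front-left, front-right, rear-left, rear-right.
    """
    k = half_length + half_width
    fl = (vx - vy - k * wz) / wheel_radius
    fr = (vx + vy + k * wz) / wheel_radius
    rl = (vx + vy - k * wz) / wheel_radius
    rr = (vx - vy + k * wz) / wheel_radius
    return fl, fr, rl, rr
```

Driving straight ahead spins all four wheels equally, while a pure sideways command spins the diagonal pairs in opposite directions, which is exactly how the 45º rollers produce strafing.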
This kit includes an aluminum chassis with four motors and mecanum wheels. The chassis provides enough space to add sensors such as a LiDAR or a camera for computer vision, as well as a computer such as a Raspberry Pi or an Nvidia Jetson, and batteries.
The motors come with encoders attached. With encoders, you can add an IMU sensor and create a map of the room. This makes it easier to turn this chassis into an autonomous robot able to navigate smooth surfaces in an apartment, a building, or on a factory floor.
Mecanum wheels have some disadvantages compared with the "normal" wheels we use on our cars or shopping carts. They tend to wander side to side when the robot tries to negotiate an inclined floor. Mecanum wheels are also known for losing traction.
The kit has a reasonable price ($75.99 in the U.S.).
Nvidia DeepStream Integration with Azure IoT Central [image credit]
Nvidia DeepStream is a set of tools capable of analyzing video/image streams in real time and handling multi-sensor processing. Azure IoT Central is a cloud computing platform from Microsoft that provides access to servers, storage, networking, and software over the internet. In the tutorial Nvidia DeepStream Integration with Azure IoT Central, Paul DeCarlo combines these two technologies and shows us how to enable remote interaction and telemetry for DeepStream on Nvidia Jetson devices using Microsoft's cloud.
This combination of software and hardware can be useful if you're running a robot on an Nvidia Jetson. We can apply the lessons from the tutorial and build a monitoring application for a robot running AI software and ROS.
I try to optimize my work, whether it is about CAD designs for 3D-printed parts, learning new things, or writing software. I'm new to writing Python scripts, and sometimes the syntax gives me headaches. The video below explains common mistakes programmers make when writing Python scripts. It helped me understand why I had so many indentation errors.
One of the mistakes: mixing the Tab key and spaces for indentation. Corey's recommendation is to let an IDE handle the indentation.
Using an IDE to write Python scripts is the easiest way. Usually, I write ROS nodes in Python via SSH. All these nodes run on a Raspberry Pi. I use 'nano' to create and edit the Python files over the SSH connection. I cannot use a Python IDE via SSH because the connection doesn't provide GUI resources.
When you're working on robots and don't have many hardware and software resources at your service, you have to find solutions. Here are one recommendation and one idea for programmers who write Python scripts via SSH:
1. don't mix 'tab' and 'space' indentation in the nano editor: This is the usual mistake I make by reflex. Sometimes I'm lost in writing the program and I mix the Tab and space keys for indentation. I changed nano's tab size to 4 spaces, and everything works without syntax mistakes as long as I use only the Tab key for indentation.
Step 1: Go to your home directory and type the command: sudo nano /etc/nanorc
Step 2: Navigate into the configuration file until the line with #set tabsize 8
Step 3: Remove the # and change 8 to 4
Step 4: Press Ctrl + O to save the file, then Ctrl + X to close it
2. replace the nano editor with an IDE and use Git: This is just an idea: write the Python script on my Windows PC, commit it to Git, and then connect via SSH and clone the repository. This way, I would reduce syntax mistakes. At the same time, if I deliver only a single new line of code, it will take a bit more time to check whether it works or not. I haven't tested this method. If someone uses it on a Raspberry Pi, please leave a comment with its advantages and disadvantages.
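The tab/space pitfall from point 1 is easy to reproduce outside any editor: Python 3 rejects source code whose indentation mixes tabs and spaces ambiguously, raising a TabError at compile time. A small sketch:

```python
# A tiny function body indented first with a tab, then with eight spaces.
mixed_source = "def f():\n\tx = 1\n        return x\n"

try:
    compile(mixed_source, "<mixed>", "exec")
    outcome = "compiled fine"
except TabError:
    outcome = "TabError: inconsistent use of tabs and spaces"

print(outcome)  # Python 3 refuses the ambiguous indentation
```

Setting nano's tab size to 4 and sticking to the Tab key, as in point 1, avoids exactly this situation.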
In this tutorial, you will learn how to configure your Raspberry Pi 3B+ board to run Ubuntu MATE 18.04.2 (codename Bionic) and the 12th ROS distribution (codename Melodic). But we study robotics on this blog, and we need ROS to read data from sensors and control motor drivers, so I have also added the steps to install the Arduino IDE and rosserial to run ROS nodes on Arduino boards.
Since Melodic is a new ROS distribution, setting up the ROS environment is half of the battle. Not all the ROS packages available in Kinetic have been migrated to Melodic yet.
My guess is that all the packages available in Kinetic will be migrated to Melodic, since this distribution has long-term support. If you want to start learning about ROS, I am pretty sure that all the packages you are going to use are already available in Melodic. Otherwise, if you think Melodic is not what you need at this moment, you can consider installing ROS Kinetic. In this tutorial, you can find all the steps to install ROS Kinetic on Raspberry Pi 3 and Ubuntu MATE 16.04.
Migrating a Kinetic package to Melodic needs some additional steps. The package has to be built from source (clone it from GitHub into your catkin workspace), and the dependencies in the package.xml file of the migrated package have to be checked.
Melodic comes with improvements including C++14 instead of the C++11 used in Kinetic, OpenCV support with a minimum version of 3.2, and rviz and urdf changes. These four are just a part of all the improvements in Melodic. The full list of improvements and the migration guide can be consulted here.
Since we've talked about migrated packages, ROS gives us some insight with a list of the packages available in Melodic. The list is here.
Ubuntu MATE (Bionic) installation
The idea behind Linux, and implicitly behind Ubuntu MATE, is that you customize it to your needs. The real difference between the versions is the packages that come with each of them.
For this installation, as for other projects, I run Ubuntu MATE on the Pi rather than Raspbian. Raspbian is a very good Raspberry Pi operating system, but installing ROS and new packages on it becomes a pain. MATE consumes more resources than Raspbian, but in the end everything works, and I can use the GPIO and USB ports to control robots.
I work with both MATE 16.04 and MATE 18.04. If you're building robots, MATE is just a platform to host the frameworks that run your algorithms, and it has little impact on what we're doing with the Pi. If you're using it for multimedia, the latest MATE version is definitely impressive and comes with more utilities.
For this tutorial, I use a new Raspberry Pi 3B+, and the first step to make it useful is to install an Ubuntu MATE image.
Download Ubuntu MATE 18.04.2 for Raspberry Pi 3B+
Download the image archive from here. The image runs on the Raspberry Pi 2B, 3B, and 3B+.
If you are a beginner in ROS and want to learn how to create a ROS Service, this tutorial is for you. I describe all the steps in detail, starting with creating the package and going all the way to calling the service and printing the results.
The goal of this tutorial is to add additional information to the material support for ROS Service available on the ROS wiki page.
A few words about ROS Services
a ROS Service operates in much the same way as a remote procedure call;
a ROS Service has inputs and outputs; the inputs and outputs are defined similarly to ROS Messages;
a ROS Service can take, for example, a number as input and return a char as output;
usually, you should define a ROS Service when something needs to be done only occasionally. For example, I'll create a ROS Service for a robot of mine that takes a picture when it encounters an obstacle;
Software and programming language:
I’m using ROS Kinetic version 1.12.14
The operating system is Linux Ubuntu 16.04 LTS
Programming language: Python
Below, I detail all the steps needed to implement a practical example of a ROS Service. In this example, the service takes a random number as input and returns an ON or OFF text as output.
1. Create ROS package
I want to run this exercise from the very beginning, so the first step is to create a package to store the service files. This is not a requirement if you already have a package and just want to create a ROS Service.
Navigate to the src directory of your workspace and type the command below:
catkin_create_pkg ros_service rospy
Once the above command finishes, you should have a new ROS package called ros_service. With that done, we can start working on the ROS Service itself, beginning with the srv directory and the service definition file.
2. Create ‘srv’ directory and ‘srv file’
In this step, we create the directory where the service definition file with the inputs and outputs will live. Navigate to the package directory where you want to create the ROS Service:
In the srv directory create a file called ServiceExample.srv and write the below three lines:
int32 onezero
---
string turn
The three dashes mark the end of the input(s) and the beginning of the output(s).
With these three lines in the 'srv file', the service definition is complete. Next, it's time to update the two files needed to generate the code and class definitions for working with the service we defined.
3. Update package.xml and CMakeLists.txt
Open the package.xml file and add the lines below. If you declared the rospy dependency when you created the ROS package, you should already have the rospy build and run dependencies in the file. Otherwise, add all the lines below to package.xml:
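As a sketch, for a Kinetic-era package (package format 1) that defines a service, the dependency lines usually look like this:

```xml
<build_depend>rospy</build_depend>
<build_depend>message_generation</build_depend>
<run_depend>rospy</run_depend>
<run_depend>message_runtime</run_depend>
```

message_generation is needed at build time to generate the service classes, and message_runtime is needed when the node runs.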
We're done with the package.xml file; next up is CMakeLists.txt. In this file, we have to add the dependencies and the service file. (Info: most of these lists should already exist in the file as comments.)
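As a sketch, the service-related entries in CMakeLists.txt typically look like the following (most of them are already present as comments in the generated file, so you mainly uncomment and adjust them):

```cmake
find_package(catkin REQUIRED COMPONENTS
  rospy
  message_generation
)

add_service_files(
  FILES
  ServiceExample.srv
)

generate_messages()

catkin_package(
  CATKIN_DEPENDS message_runtime
)
```

add_service_files registers the .srv file, and generate_messages produces the Python classes we import later.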
Now we have to run the catkin_make command to generate the classes used to interact with the service.
4. Write callService.py
In the srv directory, create a new file called callService.py. Make this file executable using the command "chmod +x callService.py". The file implements a simple server that accepts one or zero as input and returns ON or OFF as output.
#import the code generated by catkin:
#we need ServiceExample for the service type,
#and ServiceExampleResponse for the response type, both from the ServiceExample.srv file
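Putting the comments above together, a minimal sketch of the server could look like this. The mapping logic is kept in a plain function so it can be read and tested without ROS installed, and the node and service names are my own choices:

```python
#!/usr/bin/env python

def onezero_to_turn(onezero):
    """Map the service input (1 or 0) to the text output (ON or OFF)."""
    return 'ON' if onezero == 1 else 'OFF'

def run_server():
    # import the code generated by catkin:
    # ServiceExample is the service type, ServiceExampleResponse is the
    # response type; both come from the ServiceExample.srv file
    import rospy
    from ros_service.srv import ServiceExample, ServiceExampleResponse

    def handle_request(request):
        return ServiceExampleResponse(onezero_to_turn(request.onezero))

    rospy.init_node('service_example_server')
    rospy.Service('service_example', ServiceExample, handle_request)
    rospy.spin()  # keep the node alive until shutdown
```

On the robot, call run_server() from the script's main section, then test it from another terminal with rosservice call.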
I have written three articles so far about the design, electronics, programming, and assembly of the RoboBioca robot. This is the fourth article in the series, and in it I will describe the whole project.
I created this robot waiter to help us promote our sea buckthorn organic juice. The robot consists of a rotating base that manages the juice shots and a robotic arm that takes one shot at a time and serves it to the customer.
The construction lasted 5 weeks, during which I replaced my old printer with a new one, printed dozens of components, and had several nights when I slept no more than 4 hours. The result of this marathon of creating RoboBioca felt sensational to me. Also, the reaction of the visitors at the BIOFACH 2019 Nuremberg trade fair who came into contact with the robot was far beyond my expectations.
I was quite anxious about the servo motors used in the robotic arm. I was afraid the servos would not last more than 20-30 juice shots. The surprise was that the robotic arm held up, served over 300 juice shots, and is still working.
Hardware and Software Technologies
The software side is covered by the Blynk IoT application and an Arduino sketch. I use Blynk to send commands to the robot. I installed it on an Android tablet, but it works just as easily on my smartphone. Communication between the tablet and the robot is via Bluetooth.
The hardware part is more complex and includes a robot arm and a rotating platform for plastic cups. The robot arm is a 6-axis SainSmart kit that imitates an industrial design. I modified the initial kit's end-effector and improved its performance.
The steps to build RoboBioca
I made a CAD design of the rotating base and the robot claw. The robotic arm was not designed in CAD; I did not think a CAD design would help me finish the project, since the arm already existed physically and worked. The next step was to divide the project into several small parts that could be built and tested independently of the other modules. I finished building each part and then assembled the complete robot.
The claw had to:
detect if the plastic cup was grabbed (I use a limit switch sensor for this);
detect when the plastic cup is taken over by the customer;
The two robotic claws are driven by a single servo motor. Actually, only one of the two claws is attached to the servo motor shaft; through a gear wheel system, the rotation of the claw attached to the servo drives the movement of the second claw.
One of the claws of the end-effector hosts a limit switch sensor. The sensor is used to detect the plastic cup. In other words, if there is a plastic cup, the sensor is closed; if the plastic cup is missing, the sensor is open. This way, I know whether or not there is a plastic cup between the claws. In addition, I use the same sensor to know when the plastic cup is taken by the customer. After the customer lifts the juice shot, I know the plastic cup is gone and the arm can return to the standby position or pick up another juice shot.
The beginning of 2019 found me working on a platform for promoting sea buckthorn organic juice. The first finished part of the platform was a 3D-printed robotic claw. This claw is attached to a SainSmart 6-axis robotic arm, the classic robot arm that reproduces an industrial design.
The robotic claw is used to:
grab a small plastic cup with juice;
detect if the plastic cup was grabbed (I use a limit switch sensor for this);
detect when the plastic cup is taken over;
The design
The application is not very complex. I need a robot claw that carefully handles a plastic juice cup without breaking it or losing it on the way. I analyzed the steps, actions, and worst-case scenarios, and I decided to stay with the classic design for the robot claw. Two claws actuated by a servo motor are all I need to grab and move a plastic cup. A more complex design with several claws and many degrees of freedom would be too much.
I have a few constraints in accomplishing the above steps. First, the plastic cup has a height of 5.2 cm. The claws do not have to cover a large surface, so there should be enough space for picking up the cup. Also, I can't stop the cup from sliding between the claws by gluing an abrasive material to the inner surface, because the plastic cup should not resist when a person takes it.
Robotic claw: CAD design and 3D printed
the claws have a thickness of 1 cm, which is about 20% of the height of the plastic cup. I could have made them thinner, but I needed space to add the limit switch sensor for sensing the plastic cup.
gluing a rubber piece to the inner surface of the claw is excluded. The rubber would have prevented the plastic cup from slipping, but it would have made it very difficult to take the cup from the claws; I would have risked the robot arm being pulled up together with the cup. The solution was to use a high-torque servo motor able to hold the position of the claws. SainSmart uses SG90 servo motors for the 6-axis robotic arm's gripper, and these servos cannot hold anything between the claws.
If you've come to this article, you're probably at the same point I was before I made my stepper motor move without noise, vibration, or skipped steps. That's why, in this article, I will go deep enough for you to know how to control a stepper motor (for example, a NEMA 17 or another stepper motor size) and how to fix the problems that may arise.
When a stepper motor is used in a 3D printer, a team of electronics and programming experts has most likely done the calculations and put everything together so it works perfectly. When I try to use the same stepper motor in a hobby project (after a normal 8-hour workday), it's quite different. Moreover, things get complicated if you have a low budget and just want to use the stepper motor to make a robot or rotate a cup holder, as I did. All of these situations can give you big headaches.
Let's start at the beginning, and I'll tell you in a few words where the whole story started. I could write a long story in this article, but I have a lot to explain to you about stepper motors.
How to Control NEMA 17 Stepper Motor with Arduino and A4988 Driver
How I Start With The NEMA 17 Stepper Motor
First of all, my comments are based on NEMA motors. Like many others, I had heard about NEMA stepper motors and thought 'NEMA' was the brand of the motors. Totally wrong! NEMA is a standard that describes stepper motors. For example, NEMA 17 is a stepper motor with a 1.7 x 1.7-inch faceplate. In other words, NEMA 17 is a size, not a series of stepper motors.
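To make the step-angle idea concrete: most (though not all) NEMA 17 motors step 1.8º, i.e. 200 full steps per revolution, and a driver such as the A4988 can split each full step into microsteps. A small sketch of the resulting pulse count (the function name is mine):

```python
FULL_STEPS_PER_REV = 200  # a 1.8-degree-per-step motor; common, but not implied by "NEMA 17"

def steps_for_angle(angle_degrees, microsteps=16):
    """How many step pulses the driver needs to rotate the shaft by angle_degrees."""
    return round(FULL_STEPS_PER_REV * microsteps * angle_degrees / 360.0)
```

For example, a quarter turn at 1/16 microstepping takes 800 pulses.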
In this tutorial, I go deep enough for you to be informed about how to control servo motors with Arduino. I use both software and hardware PWM resources, because this is the right way to control one or more servo motors. After reading this article, you should be able to control your hobby servos without noise or vibration. In addition, there is a section at the end of the article where I describe how I fixed several MG996R servos, followed by a conclusion.
Before going into the details, I want to tell you something. I know how hard it is to produce great work and then put it out into the world. But no matter how much has been written on forums and blogs about controlling hobby servo motors like the MG996R, almost all of these tutorials show how to control a servo running freely, without applying torque. A servo changes its behavior under torque; if it is not controlled in the right way, it vibrates, makes noise, and rotates randomly. That is how I came to write this tutorial. I want to describe in detail the solutions I tried for controlling the servos, as well as the solution that smoothly controls many hobby servos like the MG996R while torque is applied; in my case, the servos actuate a robotic arm.
RoboBioca Robot Arm
The MG996R is the main servo motor used to move the SainSmart 6-axis robot arm. Of the arm's 6 axes, four are actuated by the MG996R; only two axes use the SG90 servo to move the end effector.
I used this robot arm to build a robot waiter able to grab and handle a small plastic cup. Because I needed something different for my project, I could not use the default configuration of the arm. I replaced the SG90 servo motors with a single servo motor with a metal gearbox, plus new claws. In this way, I reduced the number of degrees of freedom of the arm from 6 to 5.
In this project, I used the Blynk application to send commands to the robot. The communication between the Android tablet running Blynk and the robot is done wirelessly using a Bluetooth connection.
And as if I didn't have enough problems to solve, the Bluetooth connection interferes with the servo motors: they make noise and rotate randomly while the Bluetooth connection is active. So, an extra problem to be solved.
The power supply
A stable 6V power supply, as required by the servo motors, is mandatory. I used an adjustable LTC3780 DC buck/boost converter module, fed by a power adapter (12V, 3A). I used a digital multimeter to set the converter's output voltage as close to 6V as possible.
First try
The first attempt was to control the servo motors with the Arduino Sensor Shield V5. The biggest problem with the Arduino UNO is that only two pins (pin 9 and pin 10) can be used for PWM while running the Servo2 library. The PWM pins are used for the control signal of the servo motors. Unlike DC motors, servos use the PWM signal to set the position of the shaft rather than its speed.
This is the Servo library distributed with Arduino 0016 and earlier. It can drive up to two servos using pins 9 and 10 on a standard board or 11 and 12 on a Mega. Other pins won't work.
I had two servo motors working well when controlled with PWM, and that was all. I left them and made another try.
Second try
With five servos and only two usable PWM pins, I tried another way to control the servo motors. I gave up the Servo2 library and used capacitors together with the Servo library. This attempt turned out more successful than the previous one. From the solutions found on the Internet, I used a 470uF 50V electrolytic capacitor for each servo. It works, but not as I want it to: shaking and vibrations no longer occur in most of the servo positions, but the random rotations still continued.
Third try
The third attempt was successful. I used the Adafruit 16-Channel PWM/Servo Driver to control all five servos. The board is an I2C-controlled PWM driver with a built-in clock and 12-bit resolution for each servo, which means a resolution of about 4us at a 60Hz update rate.
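The "about 4us resolution at 60Hz" figure follows directly from the 12-bit counter: one PWM period (1/60 s) is divided into 4096 ticks. A quick sketch of the arithmetic, plus a helper for converting a servo pulse width into ticks (the helper names are mine, not from the Adafruit library):

```python
PWM_FREQUENCY_HZ = 60
TICKS_PER_PERIOD = 4096  # 12-bit counter

def tick_length_us(frequency_hz=PWM_FREQUENCY_HZ):
    """Length of one counter tick in microseconds."""
    return 1_000_000.0 / frequency_hz / TICKS_PER_PERIOD

def pulse_us_to_ticks(pulse_us, frequency_hz=PWM_FREQUENCY_HZ):
    """Convert a servo pulse width in microseconds to a 0-4095 tick count."""
    return round(pulse_us / tick_length_us(frequency_hz))
```

At 60Hz one tick is about 4.07us, so a typical 1.5ms center pulse corresponds to roughly 369 ticks.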
The first failed servo motor was the one that rotated the robot arm. It failed quite easily: I noticed some smoke being emitted during operation. The second failed servo was at the base. This one no longer recognized its position; regardless of the command for a particular position, it changed its position randomly. The cause was not the Bluetooth module, because I tried positioning the servo motor without the communication module attached.
The first servo failed after about 30 minutes of operation. The second was friendlier and failed after about another 2 hours of operation. During this time, I used the robot arm to test different positions.
So I found myself with two failed MG996R servo motors just as I started working to solve the two problems that had actually raised my blood pressure: random shakes (vibrations) and noise.
Initially, I thought I was doing something wrong and that's why the servo motors broke so fast. After a search on Google, it looks like I'm not the only one who has worked with the MG996R and broken them in the testing phase. Finally, I ordered some new servos and started working to see where I went wrong and how to fix it.
I ordered three MG996R servo motors. They came quickly, and I started to test them. It's not the first time I've bought something very cheap (the price for an MG996R is around €6/$7) that did not work. But this time was something special: none of the three servo motors worked.
With three new servo motors that did not work, I thought it would be better to look for an alternative. I searched for alternatives of the same size that could fit into the SainSmart robotic arm and found the Power HD 1501MG servo. It is the same size as the MG996R, with higher torque.
Of the five MG996R servo motors used in the upgraded version of the robot arm (the original version of the arm uses 4 MG996R), three are left. Two MG996R servos were replaced by 1501MGs. One MG996R servo drives the shoulder yaw, another the wrist yaw, and a third the robot claw.
How I fixed the MG996R
Any of the methods below requires training in electronics and some experience with screws, gearboxes, and grease. Take great care when you work inside the servo; you do it at your own risk.
Two of the servo motors I fixed just by removing the screws. I commanded them to move back and forth between positions 30 and 160 with the gearbox cover off, then I put the cover back on... and it works. It sounds a little stupid, but this is what I did, and it works. I suspect the problem was the position of the gears inside the gearbox: most likely, the DC motor was rotating without engaging all of them. There may have been assembly errors or transport shocks.
One of the new servos had bigger problems. One gear was missing one or two teeth, and besides that, I found a piece of plastic inside the gearbox. Fortunately, I had a burned servo to take spare parts from; I mounted all the pieces back and it worked.
MG996R missing teeth
Conclusion on the hobby servos
move the position of your servo by hand as little as possible. Doing so can quickly destroy the gearbox and make the servo fail;
the servo motors used in hobby projects work well between positions 10 and 170. If you command a position below 10 or above 170, the servo makes noise and starts to vibrate;
the power supply of the servo must be as close as possible to 6V. A higher voltage can burn the servo, and a lower voltage makes it run erratically. Also, please do not try to power servos directly from the Arduino board or any other controller used in your project; you will have many surprises;
if you work with multiple servo motors at once, for example in a robot arm, you should program them one by one. If one of the servos receives an incorrect command and moves to an undesired position, it has a big impact on the other servos: the robot arm can hit the work table or other objects around it and break a servo's gearbox;
if you do not have a stable 6V power supply source, it is necessary to use an electrolytic capacitor across the power supply wires of the servo motor;
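The safe-range advice from the list above can be enforced with a small guard before every position command (a sketch; the 10 and 170 limits are the ones from my own tests):

```python
SERVO_MIN_POS = 10
SERVO_MAX_POS = 170

def safe_servo_position(position):
    """Clamp a requested position to the range hobby servos handle well."""
    return max(SERVO_MIN_POS, min(SERVO_MAX_POS, position))
```

Running every commanded angle through this guard keeps a buggy command from driving the servo into the noisy, vibrating end zones.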
More Powerful Alternatives To Raspberry Pi 3 (B/B+)
I'm deeply involved in developing an autonomous robot platform for outdoor work. The robot will work in the summer, during the day, for a few hours at a time, which results in high temperatures for all its components. My main concern is the electronic parts. I have plans to install a cooling system for them, but I'm not sure I can keep the temperatures below an acceptable threshold.
Besides that, I want to reduce the temperature of the electronics by using parts that do not operate under stress: low stress results in less heat generated by the component. Thus, I've started looking for Raspberry Pi 3 alternatives to run ROS and communicate with the robot.
I use the Raspberry Pi 3 to run ROS Kinetic and to monitor and control the robot. The Raspberry Pi is a pretty stable platform and can work continuously for weeks without a problem. Its replacement must provide just as much stability in operation, community support, and resources.
The budget for this change is not greater than €150. My research led to the next list (the list will be updated once new alternatives appear).
The Jetson TK1/TX1/TX2 and the Intel NUC are out of the question for the moment. Any of the three Jetson variants or the Intel computer costs a few hundred euros. It is worth investing money in such a board if you're running ROS together with computer vision applications; otherwise, I see no reason to spend hundreds of euros just to run ROS nodes for sensors and navigation algorithms.
Added on 19.December.2018 The Renegade has a form factor compatible with the Raspberry Pi 3 and can run ROS without much effort thanks to its 1.4GHz ARM Cortex-A53 processor and 4GB of DDR4. It has a price of $80.00 (the version with 4GB DDR4), but the final price will increase by a few dollars considering that it has no WiFi; a USB dongle is needed if you want to use this function.
The XU4 has a competitive price of about €63, and its specifications list a Cortex-A15 processor that can reach 2GHz, along with 2GB of LPDDR3. The only minus of this board is the lack of a built-in WiFi module; I need an extra €5 for a WiFi module to have an Internet connection.
The Asus Tinker Board runs ROS nodes on a quad-core ARM SoC at 1.8GHz with 2GB of RAM. It has a built-in WiFi module and a price of around €49. I have some doubts about the operating system: in Amazon reviews, other users report stability issues with the Android and Debian images. This makes me think twice before making a decision. It is very important to me to use a single-board computer for months without interruption; an unstable operating system can lead to a large number of reboots and a lot of downtime.
The Rock64 comes in several variants; the strongest one has 4GB of RAM and an ARM Cortex-A53 64-bit processor. The board can run a full version of Ubuntu or Debian Linux. The price is also good considering the performance: around €38. The only thing that concerns me is community support; an active community could save me a lot of hours fixing issues.