If you are a beginner in ROS and want to learn how to create a ROS service, this tutorial is for you. I describe all the steps in detail, starting with creating the package and ending with calling the service and printing the results.
The goal of this tutorial is to supplement the material on ROS Services available on the ROS wiki page.
A few words about ROS Services
a ROS Service operates much like a remote procedure call;
a ROS Service has inputs and outputs; the inputs and outputs are defined similarly to ROS Messages;
a ROS Service can take, for example, a number as input and a string as output;
usually, you should define a ROS Service when a task needs to be done only occasionally. For example, I'll create a ROS Service for a robot of mine that will take a picture when it encounters an obstacle;
Software and programming language:
I’m using ROS Kinetic version 1.12.14
The operating system is Linux Ubuntu 16.04 LTS
Programming language: Python
Below, I detail all the steps to implement a practical example of a ROS Service. In this example, I'll use a ROS Service that takes a number as input and returns an ON or OFF text as output.
1. Create ROS package
I want to run this exercise from the beginning, and the first step is to create a package to store the service files. This is not a requirement if you already have a package and just want to create a ROS Service.
Navigate to the src directory of your workspace and type the below command:
catkin_create_pkg ros_service rospy
After running the above command, you should have a new ROS package called ros_service. The first step is finished; next, we will start creating the ROS Service itself, beginning with the srv directory and the service definition file.
2. Create ‘srv’ directory and ‘srv file’
In this step, we create the directory where the service definition file with the inputs and outputs will live. Navigate to the package directory where you want to create the ROS Service:
In the srv directory create a file called ServiceExample.srv and write the below three lines:
int32 onezero
---
string turn
The three dashes mark the end of the input(s) and the beginning of the output(s).
With these three lines in the 'srv file', the service definition is complete. Next, it's time to update the two files needed to generate the code and class definitions for the service we defined.
3. Update package.xml and CMakeLists.txt
Open the package.xml file and add the lines below. If you declared the rospy dependency when you created the ROS package, you should already have the rospy build and run dependencies in the file. Otherwise, add all the below lines to package.xml:
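For reference, the dependency entries typically needed for service generation look like this (a sketch based on the standard catkin service workflow; depending on your package format version, the run dependencies may be written as run_depend instead of exec_depend):

```xml
<build_depend>rospy</build_depend>
<build_depend>message_generation</build_depend>
<exec_depend>rospy</exec_depend>
<exec_depend>message_runtime</exec_depend>
```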
We are done with the package.xml file; next is CMakeLists.txt. In this file, we have to add the dependencies and the service file. (Info: most of these lists should already be present as comments.)
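The CMakeLists.txt additions typically look like this (a sketch following the standard catkin service workflow; adjust the names if your package differs):

```cmake
find_package(catkin REQUIRED COMPONENTS
  rospy
  message_generation
)

add_service_files(
  FILES
  ServiceExample.srv
)

# No extra message dependencies are needed for plain int32/string fields
generate_messages()

catkin_package(
  CATKIN_DEPENDS message_runtime
)
```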
Now we have to run the catkin_make command to generate the classes used to interact with the service.
4. Write callService.py
In the srv directory, create a new file called callService.py. Make this file executable using the command "chmod +x callService.py". The file implements a simple server that accepts one or zero as input and returns ON or OFF as output.
#import the code generated by catkin.
#we need ServiceExample for the request type,
#and ServiceExampleResponse for the response type, both generated from the ServiceExample.srv file
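As a sketch of what callService.py can contain (assuming the package is named ros_service and contains the ServiceExample.srv file defined earlier; the rospy imports are placed inside the functions so the decision logic can be read and tested on its own, and this must run inside a sourced catkin workspace):

```python
#!/usr/bin/env python
# Sketch of callService.py for the ros_service package defined above.

def onezero_to_text(onezero):
    # Pure decision logic: 1 -> ON, anything else -> OFF.
    return 'ON' if onezero == 1 else 'OFF'

def start_server():
    # The imports below come from the code generated by catkin_make:
    # ServiceExample is the service type and ServiceExampleResponse
    # wraps the 'turn' output field from ServiceExample.srv.
    import rospy
    from ros_service.srv import ServiceExample, ServiceExampleResponse

    def handle(request):
        # request.onezero is the int32 input defined in the srv file
        return ServiceExampleResponse(onezero_to_text(request.onezero))

    rospy.init_node('service_example_server')
    rospy.Service('service_example', ServiceExample, handle)
    rospy.spin()   # keep the node alive and answer incoming calls

def call_service(onezero):
    # Client side: wait for the server, call it, and return the text.
    import rospy
    from ros_service.srv import ServiceExample
    rospy.wait_for_service('service_example')
    proxy = rospy.ServiceProxy('service_example', ServiceExample)
    return proxy(onezero).turn
```

With roscore and start_server() running, call_service(1) returns 'ON' and call_service(0) returns 'OFF', which you can print from a second script or terminal.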
I have written three articles so far about the design, electronics, programming, and assembly of the RoboBioca robot. This is the fourth article in the series, and here I will describe the whole project.
I created this robot waiter to help us promote our sea buckthorn organic juice. The robot consists of a rotating base that manages the juice shots and a robotic arm that takes one shot at a time and serves it to the customer.
The construction lasted 5 weeks, during which I replaced my old printer with a new one, printed dozens of components, and had several nights when I slept no more than 4 hours. The result of this marathon of creating RoboBioca was sensational for me. Also, the reaction of visitors at the BIOFACH 2019 Nuremberg trade fair who came into contact with the robot was far beyond my expectations.
I was most worried about the servo motors used in the robotic arm. I was afraid that the servos would not last more than 20-30 juice shots. The surprise was that the robotic arm held up, served over 300 juice shots, and is still working.
Hardware and Software Technologies
The software side is covered by the IoT application Blynk and an Arduino sketch. I use Blynk to send commands to the robot. I installed it on an Android tablet, but it works just as well on my smartphone. Communication between the tablet and the robot is done via Bluetooth.
The hardware side is more complex and includes a robot arm and a rotating platform for plastic cups. The robot arm is a SainSmart 6-axis kit that imitates an industrial design. I modified the initial version of the kit, changing the end-effector and improving its performance.
The steps to build RoboBioca
I made a CAD design of the rotating base and the robot claw. The robotic arm was not designed in CAD; I did not consider that a CAD design would help me finish the project, since the arm already existed physically and worked. The next step was to divide the project into several small parts that could be built and tested independently of the other modules. I finished building each part and then assembled the complete robot.
The two robotic claws are driven by a single servo motor. Actually, only one of the two claws is attached to the servo motor shaft. Through a gear wheel system, the rotation of the claw attached to the servo drives the movement of the second claw.
One of the claws of the end-effector hosts a limit switch sensor. The sensor is used to detect the plastic cup. In other words, if there is a plastic cup, the sensor is closed; if the plastic cup is missing, the sensor is open. In this way, I know whether or not there is a plastic cup between the claws. In addition, I use the same sensor to know when the plastic cup is taken by the customer. After the customer lifts the juice shot, I know that the plastic cup is gone and the arm can return to the standby position or pick up another juice shot.
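The claw-and-sensor logic described above can be sketched as a tiny state machine (a sketch only; the state names are mine, not taken from the robot's firmware, which runs on Arduino):

```python
# Sketch: cup-handling logic driven by the limit switch in the claw.
# sensor_closed == True means a cup is pressed against the switch.

STANDBY, SERVING = 'standby', 'serving'

def next_state(state, sensor_closed):
    """Advance the waiter's state based on the limit switch reading."""
    if state == STANDBY and sensor_closed:
        return SERVING            # a cup was grabbed, go serve it
    if state == SERVING and not sensor_closed:
        return STANDBY            # the customer took the cup
    return state                  # nothing changed, keep the state
```

The same reading thus answers two different questions, "did I grab a cup?" and "did the customer take it?", depending on which state the robot is in.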
The beginning of 2019 found me working on a platform for promoting sea buckthorn organic juice. The first finished part of the platform was a 3D printed robotic claw. This claw is attached to a SainSmart 6-axis robotic arm, the classic robot arm that reproduces an industrial design.
The robotic claw is used to:
grab a small plastic cup with juice;
detect if the plastic cup was grabbed. I use a limit switch sensor to detect the plastic cup;
detect when the plastic cup is taken over;
The design
The application is not very complex. I need a robot claw that carefully handles a plastic juice cup without breaking it or losing it on the way. I analyzed the steps, actions, and worst-case scenarios, and I decided to stay with the classic design for the robot claw. Two claws actuated by a servo motor are all I need to grab and move a plastic cup. A more complex design with several claws and many degrees of freedom would be too much.
I have a few constraints in accomplishing the above steps. First, the plastic cup has a height of 5.2 cm. The claws will not have to cover a large surface, so there should be enough space for picking up the cup. Also, I can't prevent the cup from sliding between the claws by gluing an abrasive material to the inner surface: the plastic cup should not resist when a person takes it.
Robotic claw: CAD design and 3D printed
the claws have a thickness of 1 cm, which is about 20% of the height of the plastic cup. I could have made them thinner, but I needed space to add the limit switch sensor for sensing the plastic cup;
gluing a rubber piece to the inner surface of the claw is excluded. The rubber would have prevented the plastic cup from slipping, but it would have made it very difficult to take the cup out of the claws; I would have risked the robot arm being pulled up together with the cup. The solution was a high-torque servo motor able to maintain the position of the claws. SainSmart uses SG-90 servo motors for the gripper of the 6-axis robotic arm, and these cannot hold anything between the claws.
If you've come to this article, you're probably at the same point I was before I made my stepper motor move without noise, vibration, or skipped steps. That's why, in this article, I will go into enough depth to show how to control a stepper motor (for example, a NEMA 17 or another stepper motor size) and how to fix the problems that may arise.
When a stepper motor is used in a 3D printer, most likely a team of electronics and programming experts makes the calculations and puts everything together so it works perfectly. Instead, when I try to use the same stepper motor in a hobby project (after a normal 8-hour workday), it's quite different. Moreover, things get complicated if you are on a low budget and just want to use the stepper motor to make a robot or rotate a cup holder, as I did. All of these situations can give you big headaches.
Let's start at the beginning, and I'll tell you in a few words where the whole story started. I could write a long story in this article, but I have a lot to explain to you about stepper motors.
How to Control NEMA 17 Stepper Motor with Arduino and A4988 Driver
How I Start With The NEMA 17 Stepper Motor
First of all, a comment about NEMA motors. Like many others, I had heard about NEMA stepper motors and thought 'NEMA' was the brand of the motors. Totally wrong! NEMA is a standard that describes stepper motors. For example, NEMA 17 is a stepper motor with a 1.7 x 1.7-inch faceplate. In other words, NEMA 17 is a size, not a series of stepper motors.
In this tutorial, I go into enough depth for you to understand how to control servo motors with Arduino. I use both software and hardware PWM resources, because this is the right way to control one or more servo motors. After reading this article, you should be able to control your hobby servos without noise or vibration. In addition, there is a section at the end of the article where I describe how I fixed several MG996R servos, followed by a conclusion.
Before going into the details, I want to tell you something. I know how hard it is to produce great work and then put it out into the world. But no matter how much has been written on forums and blogs about controlling hobby servo motors like the MG996R, almost all of these tutorials cover controlling the servo while it runs freely, without applying torque. A servo changes its behavior under torque. If it is not controlled the right way, it vibrates, makes noise, and rotates randomly. That is how I came to write this tutorial. I want to describe in detail the solutions I tried for controlling the servos, as well as the solution that smoothly controls many hobby servos like the MG996R while applying torque; in my case, the servos actuate a robotic arm.
RoboBioca Robot Arm
The MG996R is the main servo motor used to move the SainSmart 6-axis robot arm. Of the arm's six axes, four are actuated by the MG996R. Only two axes use the SG90 servo to move the end effector.
I used this robot arm to build a robot waiter able to grab and handle a small plastic cup. Because I needed something different for my project, I could not use the default configuration of the arm. I replaced the SG90 servo motors with a single servo motor with a metal gearbox, plus the claws. In this way, I reduced the number of degrees of freedom of the arm from 6 to 5.
In this project, I used the Blynk application to send commands to the robot. The communication between the Android tablet running Blynk and the robot is done wirelessly using a Bluetooth connection.
And as if I did not have enough problems to solve, the Bluetooth connection interferes with the servo motors. They make noise and move randomly while the Bluetooth connection is active. So, an extra problem to solve.
The power supply
A stable 6V power supply, as required by the servo motors, is mandatory. I used an adjustable LTC3780 DC buck/boost converter module, fed by a 12V, 3A power adapter. I used a digital multimeter to set the converter output as close to 6V as possible.
First try
The first attempt was to control the servo motors with the Arduino Sensor Shield V5. The biggest problem with the Arduino UNO is that only two pins (pin 9 and pin 10) can be used for PWM while running the Servo2 library. The PWM pins carry the control signal of the servo motors. Unlike DC motors, servos use the PWM signal to set the position, rather than the speed, of the servo shaft.
This is the Servo library distributed with Arduino 0016 and earlier. It can drive up to two servos using pins 9 and 10 on a standard board, or 11 and 12 on a Mega. Other pins won't work.
I had two servo motors working well when controlled with PWM, and that was all. I left them and made another attempt.
Second try
With five servos and only two usable PWM pins, I tried another way to control the servo motors. I gave up the Servo2 library and used capacitors together with the Servo library. This attempt turned out more successful than the previous one. Among the solutions found on the Internet, I used a 470uF 50V electrolytic capacitor for each servo. It works, but not the way I wanted: shaking and vibration no longer occur in most servo positions, but the random rotations continued.
Third try
The third attempt was successful. I used the Adafruit 16-Channel PWM/Servo driver to control all five servos. The driver is an I2C-controlled PWM chip with a built-in clock and 12-bit resolution for each servo, which means about 4us resolution at a 60Hz update rate.
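The "about 4us at 60Hz" figure follows directly from the driver's 12-bit counter, and the same arithmetic converts a servo pulse width into counter ticks. A minimal sketch (the function names are mine; only the 60 Hz rate and the 4096-tick resolution come from the driver's specification):

```python
# Sketch: convert a servo pulse width to ticks of a 12-bit PWM driver
# such as the Adafruit 16-Channel PWM/Servo driver.

def tick_length_us(freq_hz=60, resolution=4096):
    """Duration of one counter tick in microseconds."""
    period_us = 1_000_000 / freq_hz          # 16666.67 us at 60 Hz
    return period_us / resolution            # ~4.07 us per tick

def pulse_us_to_ticks(pulse_us, freq_hz=60, resolution=4096):
    """How many ticks the channel must stay high for a given pulse width."""
    return round(pulse_us / tick_length_us(freq_hz, resolution))
```

For example, a 1.5 ms center pulse works out to about 369 ticks at 60 Hz, which is the kind of value you end up writing to the driver's channel registers.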
The first servo motor to fail was the one that rotated the robot arm. It failed quite easily: I noticed some smoke being emitted during operation. The second failed servo was at the base. This one no longer recognized its position: regardless of the command for a particular position, it randomly changed position. The cause was not the Bluetooth module; I tried positioning the servo motor without the communication module attached.
The first servo failed after about 30 minutes of operation. The second was friendlier and failed after about another 2 hours of operation. During this time, I used the robot arm to test different positions.
So I found myself with two failed MG996R servo motors just as I started working to solve two problems that had actually raised my blood pressure: random shakes (vibrations) and noise.
Initially, I thought I was doing something wrong and that was why the servo motors broke so fast. After a Google search, it looks like I'm not the only one who has worked with the MG996R and broken them in the testing phase. Finally, I ordered some new servos and started working to see where I went wrong and how to fix it.
I ordered three MG996R servo motors. They came quickly and I started to test them. It's not the first time I've bought something very cheap (an MG996R costs around €6/$7) that did not work. But this time was very special for me: of the three servo motors, none of them worked.
With three new servo motors that did not work, I thought it would be better to look for an alternative of the same size that could fit the SainSmart robotic arm. I found the Power HD 1501MG servo. It is the same size as the MG996R, with higher torque.
Of the five MG996R servo motors used in the upgraded version of the robot arm (the original version of the arm uses four), three remain. Two MG996R servos were replaced by 1501MGs. One MG996R servo drives the shoulder yaw, another the wrist yaw, and a third the robot claw.
How I fixed the MG996R
Any of the methods below requires some training in electronics and some experience with screws, gearboxes, and grease. Take great care when you work inside a servo; you do it at your own risk.
Two of the servo motors I fixed just by removing the screws. I commanded them to move back and forth from position 30 to 160 without the gearbox cover on, then put the cover back... and it works. It sounds a little stupid, but that is what I did, and it works. I suspect the problem was the position of the gears inside the gearbox: most likely, the DC motor was rotating without engaging all of them. There may have been assembly errors or transport shocks.
One of the new servos had bigger problems. One gear was missing one or two teeth. Besides that, I found a piece of plastic inside the gearbox. Fortunately, I had a burned servo to take spare parts from; I mounted all the pieces back, and it worked.
MG996R with missing teeth
Conclusion on the hobby servos
try to change the position of your servo by hand as little as possible. This can quickly destroy the gearbox and make the servo fail;
the servo motors used in hobby projects work well between positions 10 and 170. If you command a position below 10 or above 170, the servo makes noise and starts to vibrate;
the servo power supply must be as close as possible to 6V. A higher voltage can burn the servo, and a lower voltage makes it move randomly. Also, do not try to power servos directly from the Arduino board or any other controller used in your project; you will have many surprises;
if you work with multiple servo motors at once, for example in a robot arm, you will want to program them one by one. If one of the servos receives an incorrect command and moves to an undesired position, it has a big impact on the other servos: the robot arm can hit the work table or other objects around it and break a servo gearbox;
if you do not have a stable 6V power supply, it is necessary to use an electrolytic capacitor across the power supply wires of the servo motor;
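The safe-range advice above can be captured in a small helper that clamps every command before it is sent to a servo. This is a minimal sketch; the 10 to 170 limits are the ones from my observations, so treat them as assumptions and tune them for your own servos:

```python
# Sketch: clamp servo position commands to a safe range before sending them.
# The 10-170 degree limits come from the observations above; adjust as needed.

SAFE_MIN = 10
SAFE_MAX = 170

def safe_angle(angle):
    """Return the commanded angle limited to the safe range."""
    return max(SAFE_MIN, min(SAFE_MAX, angle))
```

Routing every position command through such a function is a cheap way to avoid the noise, vibration, and gearbox damage described above.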
More Powerful Alternatives To Raspberry Pi 3 (B/B+)
I'm quite involved in developing an autonomous robot platform for outdoor work. The robot will work in the summer, during the day, for a few hours at a time, which results in high temperatures for all its components. My main concern is the electronic parts. I have plans to install a cooling system for them, but I'm not sure I can keep the temperatures at an acceptable threshold.
Besides, I want to reduce the temperature of the electronics by using parts that do not run under stress. Low stress means less heat generated by the component. Thus, I came to look for Raspberry Pi 3 alternatives to run ROS and communicate with the robot.
I use the Raspberry Pi 3 to run ROS Kinetic and to monitor and control the robot. The Raspberry Pi is a pretty stable platform and can work continuously for weeks without a problem. Its replacement must offer the same stability in operation, community support, and resources.
The budget for this change is no greater than €150. My research led to the list below (the list will be updated as new alternatives appear).
The Jetson TK1/TX1/TX2 and the Intel NUC are out of the question for the moment. Any of the three Jetson variants or the Intel computer costs a few hundred euros. It is worth investing that money in such a board if you are running ROS together with computer vision applications. Otherwise, I see no reason to spend hundreds of euros to run ROS nodes for sensors and navigation algorithms.
Added on 19 December 2018: the Renegade is form-factor compatible with the Raspberry Pi 3 and can run ROS without much effort thanks to its 1.4GHz ARM Cortex-A53 processor and 4GB of DDR4. It is priced at $80.00 (the 4GB DDR4 version), but the final price will increase by a few dollars considering that it has no WiFi; a USB dongle is needed if you want to use that function.
The XU4 has a competitive price of about €63, and its specification list includes a Cortex-A15 processor running at up to 2GHz and 2GB of LPDDR3. The only minus of this board is the lack of a built-in WiFi module; I need an extra €5 for a WiFi module to have an Internet connection.
The Asus Tinker runs ROS nodes on a quad-core 1.8GHz ARM SoC with 2GB of RAM. It has a built-in WiFi module and a price of around €49. I have some doubts about the operating system: in Amazon reviews, other users report stability issues with the Android and Debian images. This makes me think twice before making a decision. It is very important to me to run a single-board computer for months without interruption, and an unstable operating system can lead to a large number of reboots and downtime.
The Rock64 comes in several variants; the strongest has 4GB of RAM and an ARM Cortex-A53 64-bit processor. The board can run a full version of Ubuntu or Debian. The price is also good considering the performance: around €38. The only thing that concerns me is community support. An active community could save me a lot of hours fixing issues.
I have a Raspberry Pi 3 running ROS Kinetic, and I use it to control an autonomous robot. The plan is to improve my robot by adding computer vision capabilities. The Raspberry Pi has the resources to build intelligent robots, and the community helps me fix a lot of problems. These are the reasons why, for the moment, I am not replacing it with another single-board computer with more advanced hardware. The bad thing is that the Pi 3 has limited capabilities for graphics applications such as rviz or for running computer vision applications.
My idea (I hope it is a good one and will work) is to use the Pi 3 and a Linux computer on the same network to exchange data with each other while I run rviz and the Gazebo simulator on the PC.
The first step in implementing my idea is to set up ROS Kinetic to communicate between the Pi 3 and the remote Linux computer. Below are the steps I took to make both computers talk to each other.
Step 1: I checked that everything was okay with ROS Kinetic on my Linux PC. I installed it some time ago using the steps described here.
Step 2: I reinstalled a Linux image and ROS Kinetic on the Raspberry Pi 3 board. It took some time, since I chose to install ROS on Raspbian Stretch Lite. This operating system is what I need for my robot: it has no desktop applications or GUI of any kind.
Step 3: At this step, I pay attention to the IP address of the ROS master node (the Pi) and the IP address of the other ROS node (the Linux PC).
on the Raspberry Pi 3, type the following command:
Navigate to the end of the file and add these two lines:
#The IP address of the master node = the IP address of the Raspberry Pi
#The IP address of this machine = the IP address of the Raspberry Pi
on the Linux PC, type the following command:
Navigate to the end of the file and add these lines:
#The IP address for the Master node = the IP address of Raspberry Pi
#The IP address for your Linux PC
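Putting the comments above together, the ~/.bashrc additions look roughly like this (a sketch; 192.168.1.10 and 192.168.1.20 are placeholder addresses for the Pi and the PC on your own network):

```shell
# On the Raspberry Pi 3 (the ROS master), append to ~/.bashrc:
export ROS_MASTER_URI=http://192.168.1.10:11311   # the Pi's own address
export ROS_IP=192.168.1.10

# On the Linux PC, append to ~/.bashrc:
export ROS_MASTER_URI=http://192.168.1.10:11311   # still points at the Pi
export ROS_IP=192.168.1.20                        # the PC's own address
```

Remember to source ~/.bashrc (or open a new terminal) on each machine after editing.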
These are all the steps to make two Linux computers communicate and share nodes, topics, and services.
I want to control an autonomous robot with a Raspberry Pi 3 board and ROS Kinetic. The Pi 3 will be connected to another Linux PC used for monitoring and control settings. The setup for the two computers is described in this article.
Due to the Pi's limited resources in terms of processor and memory, I am forced to use them more efficiently. The first step is to install an operating system like Raspbian Stretch Lite on the Pi. Interaction with this system is done through typed commands: it has no GUI or the other software included with the Desktop version. Theoretically speaking, it is a perfect operating system if you do not want to stress the Pi board with tasks you do not need anyway.
The system uses the Raspberry Pi board as the master, while the PC is the slave. This configuration means running roscore on the robot instead of on the remote PC. The PC is used to see the messages coming from the Pi or to send manual corrections back to the robot.
I installed the ROS Kinetic version with no GUI tools on Raspbian Stretch Lite, and I list all the steps below.
Step 1: Download and install Raspbian Stretch Lite
The installation steps for Raspbian Lite are described here.
Step 2: Connect via SSH to the Pi and run the below commands:
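For reference, the key commands follow the ROS wiki's "ROSberryPi" source-install recipe for Kinetic on Raspbian Stretch (a sketch; check the wiki page for the current repository key and package versions before running them):

```shell
# Add the ROS package repository and its key
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu stretch main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
sudo apt-get update && sudo apt-get upgrade -y

# Bootstrap dependencies and rosdep
sudo apt-get install -y python-rosdep python-rosinstall-generator python-wstool python-rosinstall build-essential cmake
sudo rosdep init && rosdep update

# Build ROS Kinetic from source (ros_comm only: no GUI tools)
mkdir -p ~/ros_catkin_ws && cd ~/ros_catkin_ws
rosinstall_generator ros_comm --rosdistro kinetic --deps --wet-only --tar > kinetic-ros_comm-wet.rosinstall
wstool init src kinetic-ros_comm-wet.rosinstall
rosdep install -y --from-paths src --ignore-src --rosdistro kinetic -r --os=debian:stretch
sudo ./src/catkin/bin/catkin_make_isolated --install -DCMAKE_BUILD_TYPE=Release --install-space /opt/ros/kinetic
```

The ros_comm variant is exactly the "no GUI tools" installation mentioned above; the source build is what makes the process take several hours on the Pi.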
Step 3 (optional): The installation process takes several hours and, sincerely, I do not want to repeat it too soon. I decided to clone the memory card as soon as I finished the initial installation. Here are the steps needed to clone the memory card.
You can find additional information here and here.
A robotic arm may seem complicated to build and control. It involves learning how to program a microcontroller to drive servo motors through repetitive tasks. But you can learn to do it quickly using robotic arm kits.
I've seen a lot of robotic arm kits around the web in the last year, but the ones below are my current favorites. The robotic arms in this article have 4 or 6 degrees of freedom, to suit any project.
4DOF Robot Arm with Remote Control PS2
The robotic arm kit from Banggood is controlled with two PS2 joysticks. It's a simple way to control the arm and does not involve running advanced code on the Arduino board.
The range of applications for such a kit is small compared to a programmable kit, but at a price of $39.99 it is a good start for school students. It comes with a manual and a guide for installing the code on the Arduino board.
LewanSoul LeArm 6DOF
This robot arm is made entirely of metal (aluminum) and can lift a weight of about 250 grams.
A Bluetooth module is added to the main board of the robot so you can control the arm with a smartphone or tablet. There is also an application that simulates all the joints of the robot arm, so you can move it with a push on the touchscreen. This is for the case when you do not want to use wires to control your arm. Otherwise, you can use a wired remote control to move the arm on all 6 axes.
It is neither the cheapest 6-axis robotic arm nor the most expensive. It has a price of $129.99 on Amazon. The price does not include transport costs.
6-Axis Desktop Robotic Arm
At a price of $174.99, SainSmart offers a robotic arm made from simple components available to anyone, such as PVC pipe. Such an approach is very good for DIY users, who can easily change the structure of the arm. The arm can be used for applications like pick and place, palletizing, and more.
The robotic arm requires an external power supply, other than the 5V DC from Arduino Uno.
Another good part of this kit is the documentation. Besides the wiki, you can find a lot of projects that use this robotic arm for different applications, like pick-and-place or object detection.
uArm Swift Pro
The range of applications for the uArm Swift Pro is large compared to the other kits. With a repeatability of 0.2mm and a maximum payload of 500g, the arm is suitable for anything from pick-and-place applications to 3D printing.
This is not a cheap kit. It has a price of $1,129.95 on Sparkfun. The arm is open-source and controlled by an Arduino Mega 2560 board. For documentation, you can access this link.
Coincidentally or not, after 10 years of ROS 1, the Open Source Robotics Foundation has launched a new version called ROS 2. ROS 2 (code name "Ardent Apalone"; Apalone is a genus of turtles in the family Trionychidae) was officially released at the end of 2017. The release of the new ROS has gone a little unnoticed by the usual ROS users, which is understandable since there are few articles online about it.
So in this article, I will try to describe why Ardent Apalone appeared and which of the gaps left by ROS 1 the new ROS 2 version will cover.
Before going into the subject, I will remind you that ROS (the Robot Operating System) is not an operating system as we usually know it. ROS (or ROS 1) is a solution designed to be hosted by an operating system such as Linux; or, as most people call it, a meta-operating system. And of course, it is designed for robots.
Like ROS 1, ROS 2 is a network of nodes that allows the communication and exchange of information between the components of a robot. So far, nothing new; everything is the same as we know it today.
One of the reasons behind launching a completely new version (ROS 2), rather than improving ROS 1, is the significant changes to the framework. The team that developed ROS 2 chose to implement the new changes safely in a new framework; they did not want to alter ROS 1 and risk the performance and stability of its current releases. From my point of view, it is a wise decision, especially because there is a plan to let ROS 1 nodes work together with ROS 2 nodes on the same robot. So there will be no significant changes for systems that will run both ROS variants.
Below I made a list of the new features of ROS 2.
Three compatible operating systems
One piece of news is that, besides Linux, ROS 2 is compatible with Windows 10 and Mac OS X 10.12 (Sierra). While OS X support is not new (ROS 1 was officially compatible with OS X as an experimental target), Windows support is something new for ROS.
Real-time support
ROS 1 was not designed for real-time applications. The goal of ROS 1 was to create a simple system that could be reused on various platforms. In other words, using ROS led to a significant reduction in robot development time.
A real-time system must update periodically to meet deadlines; the tolerance for errors is very low in these systems.
The example below is used by the ROS team to describe a situation when a system needs real-time support.
A classic example of a controls problem commonly solved by real-time computing is balancing an inverted pendulum. If the controller blocked for an unexpectedly long amount of time, the pendulum would fall down or go unstable. But if the controller reliably updates at a rate faster than the motor controlling the pendulum can operate, the pendulum will successfully react to sensor data to balance the pendulum. [source]
In other words, real-time support is about computation delivered at the correct time, not about raw performance. A system that fails to send a response in time is as bad as one that gives a wrong response. This new feature is very useful in safety- and mission-critical applications such as autonomous robots and space systems.
Distributed discovery
This new feature simplifies, in some way, the communication between nodes: the nodes in ROS 2 do not need a master node to exchange messages. If you run a node written in C++ and another written in Python (a talker and a listener), the nodes will identify each other and start communicating automatically. You may be wondering how the nodes find each other if there is no master node to register them. In ROS 2, the role of the master node is taken over by the ROS_DOMAIN_ID environment variable: when a ROS 2 node is launched, it announces its presence on the network to the other nodes that share the same ROS domain.
Node lifecycle management
Managed nodes are scoped within a state machine of a finite amount of states. These states can be changed by invoking a transition id which indicates the succeeding consecutive state. [source]
The most important thing is that a managed node presents a known interface and executes according to a known life cycle state machine. This means the developer can choose how the life cycle functionality is managed.
Security
ROS 1 had no security issues to speak of, because security simply did not exist in it. With ROS 2, we can finally talk about security: ROS 2 replaces the ROS 1 transport layer with an industry-standard transport layer that includes security, the Data Distribution Service (DDS).