
Ray Kurzweil: We Can Control AI Before It Controls Us

Ray Kurzweil is one of the greatest thinkers of our time, and I almost always agree with what he says about robots and AI. This time I'll take a different position and explain why I disagree with what he said about how we can control AI.

By creating safeguards and standards in advance, we can better defend against negative consequences. As an example, Kurzweil points to the 1975 Asilomar Conference, a meeting that sought to define the ethical boundaries of biotech research before it reached its full potential. He believes a similar approach might work for AI and other exponential technologies.

Kurzweil brings safeguards and standards into the discussion, something like “the ethical boundaries of biotech research.” But how many of you could do biotech research in a garage or at home? Few, if any, of you have the resources for that. So garage researchers are not a significant danger in biotech, and ethical boundaries can work there.

AI is different: any of us can build AI at home, and any of us can put AI on a robot at low cost.

So “creating safeguards and standards in advance” is just a beginning, an addition to the three laws of robotics. We are still far from being able to protect humans from artificial intelligence.

Ray Kurzweil: We Can Control AI Before It Controls Us

Multitasking on Arduino: Use millis() Instead of delay()

When you call the delay() function in your sketch, the program stops and waits for the given time before moving on to the next line of code. During this dead time you can't read input data from the sensors or update the outputs.

The delay() function is easy to use, but good only if you don’t have something else going on during the delay. Otherwise, you have to use millis().

Millis() makes a real difference in your project when you have to run multiple actions simultaneously. It's the function that lets you do multitasking on an Arduino.

It's pretty simple to work with the delay() function: it accepts a single number as an argument, representing the time in milliseconds. Using millis() takes a little bit of extra work compared to delay().

Calling the millis() function in an Arduino sketch returns the number of milliseconds that have elapsed since the board started running the current program.

Below is an example of the millis() function used with the HC-SR04 ultrasonic sensor to run an autonomous robot able to detect and avoid obstacles.
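Since the original code isn't reproduced here, here is a minimal sketch of the idea; the pin numbers, the 100 ms measurement interval, and the 20 cm threshold are only illustrative:

    // Ping an HC-SR04 every 100 ms using millis(), so the rest of loop()
    // keeps running between measurements instead of being blocked.
    const int TRIG_PIN = 9;                   // illustrative wiring
    const int ECHO_PIN = 10;
    const unsigned long PING_INTERVAL = 100;  // ms between measurements

    unsigned long lastPing = 0;

    void setup() {
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      Serial.begin(9600);
    }

    void loop() {
      unsigned long now = millis();
      if (now - lastPing >= PING_INTERVAL) {
        lastPing = now;

        // Send a 10 µs trigger pulse and time the echo.
        digitalWrite(TRIG_PIN, LOW);
        delayMicroseconds(2);
        digitalWrite(TRIG_PIN, HIGH);
        delayMicroseconds(10);
        digitalWrite(TRIG_PIN, LOW);
        long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
        long distanceCm = duration / 58;                 // rough conversion to cm

        if (distanceCm > 0 && distanceCm < 20) {
          // Obstacle closer than ~20 cm: steer away (motor code omitted).
          Serial.println("Obstacle! Avoiding...");
        }
      }

      // Motor control and other sensors keep being serviced here on every pass of loop().
    }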

And here is how the same logic looks when the delay() function is used instead:
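Again only a sketch for comparison, not the original code; note that nothing else can run while delay() is waiting:

    // Same measurement, but paced with delay(): the whole program freezes
    // between readings, so no other inputs or outputs are serviced.
    const int TRIG_PIN = 9;
    const int ECHO_PIN = 10;

    void setup() {
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      Serial.begin(9600);
    }

    void loop() {
      digitalWrite(TRIG_PIN, LOW);
      delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH);
      delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      long duration = pulseIn(ECHO_PIN, HIGH, 30000);
      long distanceCm = duration / 58;

      if (distanceCm > 0 && distanceCm < 20) {
        Serial.println("Obstacle! Avoiding...");
      }

      delay(100);  // dead time: sensors and motors are ignored until this returns
    }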

Accerion’s navigation sensor for autonomous robots

Accerion is not a well-known name in robotics, but I think it will become famous soon. This start-up designs a sensor that can determine the accurate position of a mobile robot without any infrastructure outside the robot.

The sensor works indoors as well as outdoors. In general, it is designed to make autonomous navigation easier and cheaper.

Because the sensor doesn't require additional infrastructure, it is an advantage for robots working in vast areas (robots made for agriculture), as well as for indoor spaces with dynamic environments where humans and machines move continuously (robots made for logistics).

The sensor is scheduled to appear on the market in 2017.

This is a test of the sensor under disturbance:

Open-source machine vision board for DIY robots runs Linux and Arduino

The board is called Livera and combines an OV9712 webcam sensor, an MPU-6050 accelerometer, an Arduino-like microcontroller (Atmel 32u4), an ARM9 Hi3518, and a Wi-Fi router.

The machine vision board runs Linux and OpenCV for computer vision applications. You have to pay $39 for the basic board or $49 for the version with a Motor Driver board included.

The makers provide libraries with machine-vision-related APIs such as:

  • 720p HD video and photo capture and processing
  • OpenCV-enabled color sensing and object tracking
  • Wi-Fi wireless control (this can work with the IoT)
  • Image capture and data recording onto the onboard SD card
  • Mobile observation and control from custom apps (web-based and native)
  • Programmable and open source

This is a great way to get started with Arduino, embedded Linux, and the OpenCV library.
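I don't have the Livera libraries at hand, but to show the kind of processing the board advertises, here is a generic OpenCV color-tracking sketch in C++; the HSV range for a red object and the camera index are just examples:

    // Generic OpenCV color tracking (not the Livera API): grab frames, threshold a
    // color in HSV space, and print the centre of the largest blob.
    #include <opencv2/opencv.hpp>
    #include <algorithm>
    #include <iostream>
    #include <vector>

    int main() {
      cv::VideoCapture cap(0);                 // default camera
      if (!cap.isOpened()) return 1;

      cv::Mat frame, hsv, mask;
      while (cap.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        // Keep only pixels in an example red hue range; tune for your object.
        cv::inRange(hsv, cv::Scalar(0, 120, 70), cv::Scalar(10, 255, 255), mask);

        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        if (contours.empty()) continue;

        // Assume the largest contour is the tracked object and report its centre.
        auto largest = std::max_element(contours.begin(), contours.end(),
            [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
              return cv::contourArea(a) < cv::contourArea(b);
            });
        cv::Rect box = cv::boundingRect(*largest);
        std::cout << "object at " << box.x + box.width / 2 << ", "
                  << box.y + box.height / 2 << std::endl;
      }
      return 0;
    }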

Hicat.livera – Start making your first machine vision robot

Mood-Detecting Sensor Could Help Machines Respond to Emotions

We are entering the era of robots that sense emotions. EQ-Radio is a device able to detect whether a person is excited, happy, angry, or sad.

The new device, named “EQ-Radio,” is 87 percent accurate at detecting whether a person is excited, happy, angry or sad—all without on-body sensors or facial-recognition software.

“We picture EQ-Radio being used in entertainment, consumer behavior, and healthcare,” says the study’s lead researcher, graduate student Mingmin Zhao. “For example,” he says, “smart homes could use information about your emotions to adjust the music or even suggest that you get some fresh air if you’ve been sad for a few days.” Zhao adds that remote emotion monitoring could eventually be used to diagnose or track conditions like depression and anxiety.

Instead of relying on on-body sensors, EQ-Radio emits radio signals that reflect off a person’s body and back to the device. Its algorithms can detect individual heartbeats from these radio echoes with an accuracy comparable to on-body ECG monitors.

We view this work as the next step in trying to develop computers that can understand us better at an emotional level and potentially interact with us similarly to how we interact with other human beings.

Mood-Detecting Sensor Could Help Machines Respond to Emotions

Tutorial: Setting up an Apache Web Server – Raspberry Pi

If you have an Internet-connected robot, you can see real-time data from its sensors using IoT platforms. But what if you want your own IoT application that stores the sensor data and turns it into graphics and reports? Then you need a database and some code to display the data.

The Raspberry Pi is made for the Internet, and an Apache server, a MySQL database, and the PHP programming language are the software needed to store and serve the data from your robot.

If you need some instructions to install Apache and PHP on your Pi, this tutorial makes things easier.
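To give a feel for the robot side of such a setup, here is a rough Arduino-style sketch that sends one sensor reading to a hypothetical insert.php script on the Pi; the WiFi library (WiFi101, WiFiNINA, ESP8266WiFi, ...), the network credentials, the Pi's address, and the script name all depend on your setup and are only placeholders:

    // Illustrative only: push one analog reading per minute to a PHP script on the Pi,
    // which would INSERT it into MySQL. The endpoint insert.php is hypothetical.
    #include <SPI.h>
    #include <WiFi.h>   // swap for the WiFi library that matches your board

    const char* WIFI_SSID = "your-network";
    const char* WIFI_PASS = "your-password";
    const char* PI_ADDRESS = "192.168.1.10";   // example address of the Raspberry Pi

    WiFiClient client;

    void setup() {
      Serial.begin(9600);
      WiFi.begin(WIFI_SSID, WIFI_PASS);
      while (WiFi.status() != WL_CONNECTED) {
        delay(500);
      }
    }

    void loop() {
      int reading = analogRead(A0);            // whatever sensor you want to log

      if (client.connect(PI_ADDRESS, 80)) {
        // A simple HTTP GET carrying the value as a query parameter.
        client.print("GET /insert.php?value=");
        client.print(reading);
        client.println(" HTTP/1.1");
        client.print("Host: ");
        client.println(PI_ADDRESS);
        client.println("Connection: close");
        client.println();
        client.stop();
      }

      delay(60000);                            // one reading per minute
    }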

Tutorial: Setting up an Apache Web Server – Raspberry Pi

BEST DEAL OF THE DAY: U-blox NEO-6M 25 x 25mm GPS Module (50% OFF)

This U-blox NEO-6M module is compatible with Raspberry Pi 2, 3, B+, and Arduino. The old price is $41.99, while the new one is $20.99.

The U-blox NEO-6M module has an under-one-second time-to-first-fix for hot and aided starts, support for anti-jamming technology, support for SBAS (WAAS, EGNOS, MSAS, GAGAN), a 50-channel positioning engine with over 2 million effective correlators, and a 5 Hz position update rate.
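On the Arduino side, a common way to read the module is over a serial connection with the TinyGPS++ library. Here is a minimal sketch; the RX/TX pins and the 9600 baud rate are assumptions based on typical NEO-6M wiring and defaults:

    // Read NMEA sentences from the NEO-6M over SoftwareSerial and parse them
    // with TinyGPS++, printing the position whenever it updates.
    #include <SoftwareSerial.h>
    #include <TinyGPS++.h>

    SoftwareSerial gpsSerial(4, 3);   // RX, TX (adjust to your wiring)
    TinyGPSPlus gps;

    void setup() {
      Serial.begin(9600);
      gpsSerial.begin(9600);          // NEO-6M default baud rate
    }

    void loop() {
      // Feed every incoming NMEA character to the parser.
      while (gpsSerial.available()) {
        gps.encode(gpsSerial.read());
      }

      if (gps.location.isUpdated()) {
        Serial.print("lat: ");
        Serial.print(gps.location.lat(), 6);
        Serial.print("  lng: ");
        Serial.println(gps.location.lng(), 6);
      }
    }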

Gowoops U-blox NEO-6M 25 x 25mm GPS Module with TTL Ceramic Passive Antenna For Raspberry Pi 2 3 B+ MCU Arduino

Open-Source Outdoor Terrain Mapping Software

Péter Fankhauser, a Ph.D. student in robotics at ETH Zurich, released a library and a ROS package for outdoor terrain navigation: grid_map and elevation_mapping. Both resources are open-source and useful for understanding the environment when a robot navigates rough terrain.

Grid Map:

This is a C++ library with ROS interface to manage two-dimensional grid maps with multiple data layers. It is designed for mobile robotic mapping to store data such as elevation, variance, color, friction coefficient, foothold quality, surface normal, traversability etc. It is used in the Robot-Centric Elevation Mapping package designed for rough terrain navigation.

Features:

  • Multi-layered: Developed for universal 2.5-dimensional grid mapping with support for any number of layers.
  • Efficient map re-positioning: Data storage is implemented as two-dimensional circular buffer. This allows for non-destructive shifting of the map’s position (e.g. to follow the robot) without copying data in memory.
  • Based on Eigen: Grid map data is stored as Eigen data types. Users can apply available Eigen algorithms directly to the map data for versatile and efficient data manipulation.
  • Convenience functions: Several helper methods allow for convenient and memory safe cell data access. For example, iterator functions for rectangular, circular, polygonal regions and lines are implemented.
  • ROS interface: Grid maps can be directly converted to and from ROS message types such as PointCloud2, OccupancyGrid, GridCells, and our custom GridMap message.
  • OpenCV interface: Grid maps can be seamlessly converted from and to OpenCV image types to make use of the tools provided by OpenCV.
  • Visualizations: The grid_map_rviz_plugin renders grid maps as 3d surface plots (height maps) in RViz. Additionally, the grid_map_visualization package helps to visualize grid maps as point clouds, occupancy grids, grid cells etc.
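A minimal usage sketch, modeled on the library's documented demos; the layer name, the geometry, and the synthetic elevation values are only examples:

    // Build a small grid map with one "elevation" layer and fill every cell,
    // following the pattern of the grid_map demos.
    #include <grid_map_core/grid_map_core.hpp>
    #include <cmath>

    int main() {
      grid_map::GridMap map({"elevation"});
      map.setFrameId("map");
      // 3 m x 3 m map with 5 cm resolution, centred at the origin.
      map.setGeometry(grid_map::Length(3.0, 3.0), 0.05);

      // Iterate over all cells and write an example elevation value.
      for (grid_map::GridMapIterator it(map); !it.isPastEnd(); ++it) {
        grid_map::Position position;
        map.getPosition(*it, position);
        map.at("elevation", *it) = 0.1 * std::sin(position.x()) * std::cos(position.y());
      }
      return 0;
    }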

Robot-Centric Elevation Mapping:

This is a ROS package developed for elevation mapping with a mobile robot. The software is designed for (local) navigation tasks with robots which are equipped with a pose estimation (e.g. IMU & odometry) and a distance sensor (e.g. kinect, laser range sensor, stereo camera). The provided elevation map is limited around the robot and reflects the pose uncertainty that is aggregated through the motion of the robot (robot-centric mapping). This method is developed to explicitly handle drift of the robot pose estimation.

Demonstration of the software: Mapping capabilities of the quadrupedal robot ANYmal in a forest:

Finally, LIDAR-Lite v3 is Back For Pre-orders

Glad to hear that LIDAR-Lite v3 is back on the market for pre-orders. It's expected to be ready for delivery in mid-October. This version keeps the same line as the first and second ones: a small, low-power, laser-based optical ranging sensor with a 40-meter range.

The price is $149.99.

Dimensions: 20 x 48 x 40 mm (0.8 x 1.9 x 1.6 inches)

Features:

  • Range: 0–40 m (laser emitter)
  • Accuracy: ±2.5 cm at distances greater than 1 m
  • Power: 4.75–5 V DC; 6 V max
  • Current consumption: 105 mA idle; 130 mA continuous
  • Rep rate: 1–500 Hz
  • Laser wavelength/peak power: 905 nm / 1.3 W
  • Beam divergence: 4 mrad × 2 mrad
  • Optical aperture: 12.5 mm
  • Interface: I2C or PWM (see the example sketch below)
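As a starting point, here is a bare-bones Arduino I2C read based on the register map used by earlier LIDAR-Lite versions (address 0x62, write 0x04 to register 0x00, then read two bytes from 0x8f); verify it against the v3 documentation, or simply use Garmin's Arduino library:

    // Bare-bones distance read over I2C, following the register map of earlier
    // LIDAR-Lite versions; check the v3 datasheet before relying on it.
    #include <Wire.h>

    const uint8_t LIDAR_ADDR = 0x62;

    void setup() {
      Serial.begin(9600);
      Wire.begin();
    }

    void loop() {
      // Trigger a measurement with receiver bias correction.
      Wire.beginTransmission(LIDAR_ADDR);
      Wire.write(0x00);
      Wire.write(0x04);
      Wire.endTransmission();
      delay(20);                        // give the device time to acquire

      // Read the two distance bytes (0x8f = register 0x0f with auto-increment bit).
      Wire.beginTransmission(LIDAR_ADDR);
      Wire.write(0x8f);
      Wire.endTransmission();
      Wire.requestFrom(LIDAR_ADDR, (uint8_t)2);
      if (Wire.available() >= 2) {
        int distanceCm = (Wire.read() << 8) | Wire.read();
        Serial.println(distanceCm);     // distance in centimetres
      }

      delay(100);
    }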

LIDAR-Lite v3