
Whether you think we’re ready or not, the robot revolution is coming. These smart, self-learning machines are already in our homes, in our offices and on the high street. But what are some of the technologies supporting this surge? And what’s coming next?


Think of any industry and it’s likely some form of robotics is involved. Warehouse robots are fulfilling orders while drones are delivering packages. Robotic arms are performing surgery while robotic marking systems are grading school papers.

Today robots are everywhere. We’ve reached a point where some robotic systems, such as ATMs and self-service checkouts, are so commonplace they’re barely even considered robots anymore.

The fact is we’ve witnessed huge advancements in the field over the past decade. So it’s not too hard to envisage a near future where these technologies transform society even more profoundly. Tomorrow’s world will be automated, digitised and connected.

But what are the technologies underpinning this revolution? Mechanics, intelligence and connectivity are all playing a key part in shaping our future. We take a look at how these facets are coming together to create a robot-rich society.

Robotics

In 1961 General Motors debuted Unimate – the world’s first industrial robot. It weighed 1,225 kilograms and was used to help automate the die-casting process.

Since then robots have been transforming industrial manufacturing by performing human tasks far more efficiently. Jobs involving heavy lifting and repetitive motions are now handled by machines.

And today robots are a huge part of the non-industrial world as well. Consumer robots, enterprise robots, military robots and autonomous vehicles are all making huge strides in transforming their respective sectors.

Behind-the-scenes tech

Key to this rise has been the advancement of sensing technologies. Low-range sensors and basic-resolution cameras have given way to smaller, more powerful tools. With modern motion and vision systems, robots can achieve greater levels of autonomy as they accomplish tasks with faster speed and better accuracy.

Some robotic technologies in use today include:

Robot vision

Light detection and ranging (LIDAR) systems: Light is bounced off nearby surfaces, and the time each pulse takes to return is used to build a 3D map of the environment 

2D vision: Flat, calibrated vision that allows the robot to measure length and width, but not height

3D vision: Normally two or more cameras that provide X, Y and Z axis information for greater geometry detail
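The LIDAR entry above relies on a simple time-of-flight calculation: a light pulse travels to a surface and back, so half the round-trip time multiplied by the speed of light gives the distance. A minimal sketch of that calculation:

```python
# Minimal sketch of LIDAR time-of-flight ranging: a pulse of light is
# emitted, bounces off a surface, and the round-trip time gives distance.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to a surface from a pulse's round-trip time.

    Halved, because the light travels there and back.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after ~66.7 nanoseconds hit a surface ~10 m away.
print(round(distance_from_echo(66.7e-9), 2))  # 10.0
```

A real LIDAR unit fires many thousands of such pulses per second across a sweep of angles, assembling the individual range readings into the 3D point cloud mentioned above.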

Robot sound

Microphone: An analogue-to-digital conversion unit combined with digital-signal processing can be used to detect the emotional state conveyed in a human’s voice

Robot touch

Capacitive sensors: Capacitive sensing technology, already used in touchscreen devices, is helping robots feel different surfaces

Robot perception

Proximity sensors: Can be based on capacitive, inductive, optical, sonar, ultrasonic and fibre-optic technologies

Robots in motion

Robots come in many forms, but the most popular one in fiction is the humanoid – designed to resemble the human body. Yet despite impressive developments from engineering companies such as Boston Dynamics, we’re still many years away from robots with fluid, human-like movement.

That being said, we’ve come a long way since the mechanisms that inspired the ‘robot dance’. Here are just a few examples of exciting developments in natural robotic motion.

Insect motion

Researchers at Tokyo Institute of Technology have found new ways of controlling multi-legged robots. They propose that, by using a two-level controller, a network of non-linear oscillators can be manipulated to produce diverse gaits and postures. To put it simply: they’ve created robots that move like insects.

They demonstrated this ability to generate natural walking patterns by using a six-legged ant-like robot. Researchers found that the robot would execute gaits that it hadn’t been explicitly designed to produce, but that were commonly seen in real-world insects. Unlike most developments that aim to replicate biology, these researchers have managed to reverse-engineer natural movement.
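The coupled-oscillator idea can be illustrated with a toy central pattern generator. This is a generic sketch, not the Tokyo Tech controller: each leg is assigned a phase oscillator, and a coupling term pulls every leg toward the phase offsets of the alternating tripod gait common in six-legged insects.

```python
import math

# Toy central pattern generator: six phase oscillators, one per leg,
# coupled so that odd and even legs settle half a cycle apart -- the
# alternating tripod gait. Not the actual Tokyo Tech model.
N_LEGS = 6
DT = 0.01            # integration step, seconds
OMEGA = 2 * math.pi  # one stride per second

# Desired phase offsets relative to leg 0: legs 0,2,4 vs legs 1,3,5.
TARGET = [0.0 if i % 2 == 0 else math.pi for i in range(N_LEGS)]

phases = [0.1 * i for i in range(N_LEGS)]  # arbitrary starting phases

for _ in range(2000):  # let the network settle for 20 simulated seconds
    new = []
    for i in range(N_LEGS):
        # Coupling pulls each oscillator toward its target offset from leg 0.
        coupling = math.sin((phases[0] + TARGET[i]) - phases[i])
        new.append(phases[i] + DT * (OMEGA + 2.0 * coupling))
    phases = new

# After settling, odd and even legs are ~pi radians apart (antiphase).
offset = (phases[1] - phases[0]) % (2 * math.pi)
print(round(offset, 1))  # ~3.1, i.e. half a cycle
```

The appeal of this architecture, as the research above suggests, is that stable gaits emerge from the dynamics of the coupled network rather than being scripted leg by leg.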

Spider motion

Scientists have built a bionic spider that can roll and crawl. Called the BionicWheelBot, the robot was inspired by the flic-flac spider, known for its ability to change the way it moves depending on the ground conditions. On uneven ground the spider walks normally on its legs. Faced with smooth ground, however, it spreads its legs and moves by performing somersaults.

Despite having eight legs, the robot version is considerably less terrifying. To walk, it uses all its legs. To roll, it tucks six of its legs in and uses 15 motors and the remaining two legs to spring forward.

Artificial intelligence

Even with basic intelligence robots have shown they have the capacity to contribute huge productivity and efficiency gains to society. But tomorrow’s robots will be smarter than ever before, and the potential for further gains is huge. A large part of this is down to artificial intelligence.

Broadly speaking, AI is defined as the ability of a machine to imitate intelligent human behaviour. Or, as Nils J Nilsson, one of the founding researchers of the field, put it:

 

“Artificial intelligence is that activity devoted to making machines intelligent, and intelligence is that quality that enables an entity to function appropriately and with foresight in its environment.”

Nils J Nilsson

 

Data is the main source of knowledge for AI. It’s the fuel that allows AI to make accurate and informed decisions. Models are typically trained on data drawn from a variety of sources, and it is this training that builds the AI’s knowledge.

Under the AI umbrella, a number of exciting applications are being made. These complementary tools are converging with robotics to result in robot functionalities that have never existed before.

Below are just a few examples.

Infographic (credit: Ioannis Oikonomou): Complementary AI applications are converging with robotics to result in robot functionalities that have never existed before

AI scanners that can predict disease

Scientists from Alphabet’s health science company, Verily, have created an AI algorithm that can predict heart disease by looking into a patient’s eye. By analysing eye scans, the software uses machine learning to deduce characteristics such as age and blood pressure. This data is then used to predict cardiovascular risk.

The algorithm was trained using a medical dataset of nearly 300,000 patients, which included general medical data as well as eye scans. Neural networks were then used to mine this information for patterns, with the algorithm learning to associate eye data with the metrics of heart disease.

AI that can clone voices

Baidu has recently released a whitepaper explaining its latest AI development: a program that can clone a voice within seconds of listening to it. Not only can it mimic the input, but it can also change the voice to reflect a different accent or gender.

The system makes use of deep neural networks to train the model to generate speech. Baidu’s development marks a big step in voice cloning – previous iterations of the technology required far more audio to train the model. For instance, in 2016 Adobe’s Voco needed 20 minutes of voice audio to generate a mimicked version.

AI that can listen to emotion

Music data specialist Gracenote has taught its computers to detect emotion in songs using machine learning. The team began by developing a taxonomy of moods, atmospheres and emotional qualities commonly associated with music. They then used 40,000 songs as examples for the mood categories and fed these into the training set. Once the system had been trained on the songs, it applied its learnings to millions of tracks.

This classification technology is now being used by music service operators, including Apple and Spotify.

Sensory intelligence has wider applications. By recognising emotion (be it through body language, vocal tone or facial expression), robots can better understand how to react to humans (and potentially even learn to express emotion themselves). This can help with jobs that involve empathy, such as customer service or healthcare roles. 
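As a hypothetical illustration of this kind of mood classification (the feature values, moods and thresholds below are invented, and Gracenote’s real system uses far richer features and models), a nearest-centroid classifier captures the basic idea: reduce each track to a feature vector, learn a centroid per mood from labelled examples, then assign new tracks to the closest mood.

```python
# Toy mood classifier in the spirit described above. Features here are
# made-up (tempo in BPM, energy on a 0-1 scale); real systems extract
# many audio features and train far larger models.
from math import dist

# Labelled examples: mood -> list of (tempo_bpm, energy) feature vectors.
TRAINING = {
    "happy":      [(128, 0.9), (120, 0.8)],
    "melancholy": [(70, 0.3), (65, 0.2)],
}

def centroid(points):
    """Mean feature vector of a list of 2D points."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

CENTROIDS = {mood: centroid(pts) for mood, pts in TRAINING.items()}

def classify(track_features):
    """Return the mood whose learned centroid is closest to the track."""
    return min(CENTROIDS, key=lambda m: dist(CENTROIDS[m], track_features))

print(classify((125, 0.85)))  # happy
print(classify((68, 0.25)))   # melancholy
```

Once the centroids are learned, classifying millions of tracks is just a cheap distance lookup per track, which is what makes this kind of system practical at catalogue scale.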

Internet of Things

The Internet of Things market is expected to grow from US$170.6 billion in 2017 to US$561 billion by 2022

(Source: MarketsandMarkets via ReportLinker)

 

Connected devices are changing the way our homes are run, how our offices function and even how our cities work. In fact, many commentators have been predicting the demise of the term ‘Internet of Things’, seeing as all tomorrow’s devices will be built to be connected.

One key reason for this IoT boom is the drop in prices for connectivity technology. Sensors that can capture information such as pressure, torque and temperature are getting cheaper and cheaper. And this is also feeding into an expanding robotics market.

 

In 2004, the average cost of a sensor was $1.30. In 2020, the price is expected to drop to $0.38.

(Source: The Atlas)

 

Likewise, prices for lidar and infrared sensors are dropping by more than 90%. These were previously the most expensive components of self-guiding robots. However, thanks to aggressive investment in the automated car market, production costs have gone down dramatically.
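As a quick sanity check on the figures above, the fall from $1.30 to $0.38 works out at roughly a 71% drop:

```python
# Percentage fall between two prices, applied to the sensor figures
# quoted above ($1.30 in 2004 down to $0.38 in 2020).
def percent_drop(old_price: float, new_price: float) -> float:
    return (old_price - new_price) / old_price * 100

print(round(percent_drop(1.30, 0.38)))  # 71
```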

Growth in edge computing will make deploying IoT solutions even easier. Think of it as a lightweight data centre sitting close to the devices it serves.

For example, whereas traditional machine vision requires separate camera modules and processing units, edge computing means the camera module can pre-process data and offload most of its processing burden to the edge of the network.

This is particularly helpful for robots that use AI (which depends on large volumes of data being processed in real time). Computation is sped up, which in turn improves efficiency and time-to-decision.
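A toy sketch of the split described above (illustrative only, not any specific product’s API): the camera module performs a cheap pre-processing pass – here, simple downsampling – so far less data has to travel onward to the edge node for the heavier analysis.

```python
# Toy camera-side pre-processing: shrink a frame before shipping it to
# an edge node, so most of the raw pixel data never leaves the device.

def downsample(frame, factor):
    """Keep every `factor`-th pixel in each dimension of a 2D frame."""
    return [row[::factor] for row in frame[::factor]]

# A fake 8x8 greyscale frame of incrementing pixel values.
frame = [[r * 8 + c for c in range(8)] for r in range(8)]

small = downsample(frame, 4)  # 2x2 summary sent onward
reduction = 1 - (len(small) * len(small[0])) / (len(frame) * len(frame[0]))
print(small)                     # [[0, 4], [32, 36]]
print(f"{reduction:.0%} less data leaves the camera")
```

Even this crude scheme cuts the transmitted data by about 94%; real deployments use smarter on-device steps (compression, region-of-interest cropping, running a small model locally), but the principle of moving cheap work to the device is the same.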

Infographic (credit: Ioannis Oikonomou): The IoT boom is being fuelled by the falling prices of connectivity technology and advances in computing infrastructure

Connectivity in manufacturing

Rethink Robotics has introduced a new feature to its manufacturing robots. Called Intera Insights, the new software platform gives manufacturers critical data insights in real time. Key performance indicators such as force, speed, part count and cycle time are displayed via a customisable dashboard on the robot’s display. This development marks the first time data collection has been made available from a collaborative robot.

Joined-up farming

Robots, drones and greater connectivity are changing the way farming is done. Engineers at the University of California, Davis, have developed a smart cultivator that uses UV lights and cameras to identify specially treated plants, and trim any weeds (which aren’t treated) away. They’re hoping the functionality behind these robots can be applied to other farm functions, such as the application of insecticides. What’s more, IoT connectivity to other autonomous machines, such as tractors and drones, can ease the pressure on demand for manual labour and make farming more self-sufficient.

The future of robots

Throughout history technology has prompted the success and decline of cities around the world. The rise of robots will no doubt add to this pattern.

In the last few years microprocessors have grown more powerful, AI has taught itself to become more intelligent, and robots have proved their potential as a valuable automation tool.

Tomorrow’s technology, including robotics, AI and the IoT, will converge to create smarter cities, streamline business processes and unlock faster innovation cycles than ever before.