The driving force behind this growth is the development of sensor technology. Investors are putting their money behind AI-enabled intelligent robotics, and as a result, the sensors that enable robots to understand their environment are becoming ever more sophisticated.
In the past, robots were considered unintuitive machines, following orders oblivious to their surroundings. This is no longer the case: robots are beginning to meet, and even exceed, human capabilities.
The senses that we as humans depend on to thrive are no longer unique to the natural world; artificial intelligence is beginning to catch up with us. Smart sensing technology is one of the leading factors in the development of high-performance smart robotics.
The Rise in Demand for Smart Robots
Robots have already completely reformed the world of manufacturing, and they are starting to move into every industry, becoming a significant part not only of businesses but of our homes, too. It is, however, the automotive, electrical, and electronics industries embracing Industry 4.0 that are driving the growth of smart robots: industries that rely on monitoring and evaluating real-time data in order to streamline processes.
The demand is so high that worldwide industrial robot sales are increasing by a considerable volume: the International Federation of Robotics reported 31% growth between 2016 and 2017.
The robots in question are capable of doing more and more with relative ease of use, and, with the help of smart sensors, they are able to move outside of the boundaries of the traditional workplace and to perform more and more roles.
Global Market Insights has reported that the global smart sensor market is predicted to grow at a CAGR of over 17% to reach 80 billion USD by 2024.
The Sophistication of Smart Sensors
We are headed for a world where robots, as well as the software that supports them, will be all around us. The incredible thing about smart sensors is that they can be embedded anywhere, turning everyday objects into robots by drawing on any of the tens of thousands of sensor types available.
These sensors are becoming smaller, cheaper, and easier to integrate, yet more powerful. Inbuilt artificial intelligence allows information to be processed in-sensor and acted on immediately, enabling high-achieving robots to do things and go places that their forebears could only dream of.
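As a rough illustration of what in-sensor processing means in practice, the sketch below filters raw readings on the "sensor" itself and emits an event only when a smoothed signal crosses a threshold. The class, thresholds, and event format are hypothetical, invented purely for illustration and not taken from any real device.

```python
from collections import deque


class SmartSensor:
    """Toy model of in-sensor processing: readings are smoothed and
    classified on the device, so only meaningful events leave it."""

    def __init__(self, threshold: float, window: int = 5):
        self.threshold = threshold
        self.window = deque(maxlen=window)  # small on-sensor buffer

    def read(self, raw_value: float):
        """Return an event only when the smoothed signal crosses the threshold."""
        self.window.append(raw_value)
        smoothed = sum(self.window) / len(self.window)  # simple moving average
        if smoothed > self.threshold:
            return {"event": "threshold_exceeded", "value": round(smoothed, 2)}
        return None  # nothing worth transmitting


sensor = SmartSensor(threshold=10.0)
events = [sensor.read(v) for v in [2, 3, 4, 30, 40]]
```

The payoff of this pattern is bandwidth: the host controller sees one event instead of a continuous stream of raw samples.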
The level of sophistication of sensors has propelled vision systems, tactile sensors and speech recognition technology to new heights.
Machine Vision is Becoming Increasingly Accurate
Perception is one of the fastest-growing areas of sensor development, with two consecutive years of double-digit growth and no signs of slowing down.
Perception is enabled by machine vision technology: sensors are fitted to allow robots to move to target positions, perform predetermined functions, spot defects with increased precision, and vastly improve quality control.
In the manufacturing industry, a primary user of robotics, machine vision reduces costs while increasing speed and repeatability.
Machine vision applies industrial image processing: mounted cameras capture images, which are interpreted and signalled individually within a control system. It enables robots to visually inspect products and react in real time without the need for human intervention.
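A toy version of that inspect-and-signal loop might look like the following sketch, where a grayscale frame is a list of pixel rows and a part fails inspection if too many pixels exceed a defect intensity. The function name and thresholds are hypothetical, chosen only to make the idea concrete.

```python
def inspect_frame(frame, defect_intensity=200, max_defect_pixels=3):
    """Count pixels at or above defect_intensity in a grayscale frame
    (rows of 0-255 values) and pass/fail the part accordingly."""
    defects = sum(1 for row in frame for px in row if px >= defect_intensity)
    return {"defect_pixels": defects, "pass": defects <= max_defect_pixels}


# A clean frame and one with four bright (defective) pixels:
good = [[10, 12], [11, 9]]
bad = [[10, 255], [250, 240], [230, 8]]
```

A real system would run a pipeline of filtering, segmentation, and classification on each frame, but the control decision at the end reduces to a signal much like this one.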
In the first quarter of 2018, the machine vision trade association, the AIA, reported that sales of machine vision components and systems in the US increased by 19% to $709 million. The latest generation of systems uses 3D vision technology to provide even more automation. 3D smart sensor technology helps to solve the following application challenges and to create high-speed, self-aware robots:
- Object recognition: vision-enabled smart sensors are able to guide robots to determined positions, detect and inspect discrete objects, and implement real-time control decisions from gathered data.
- Object reflectivity and low contrast: robots are able to identify and locate objects based on their shape thanks to 3D scanning. Objects can be reliably detected even with low contrast, poor light, or complex geometries.
- Overlapping items: sensors allow robots to adjust dynamically to size and location, and built-in measurement tools enable them to detect and manipulate objects; the result is smarter robots that can support a greater variety of tasks.
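The object-recognition challenge above can be sketched in miniature: given a depth map, find the centroid of all points closer than a range cutoff and treat it as a target position for robot guidance. This is a hypothetical simplification for illustration, not any real 3D sensor's API.

```python
def locate_object(depth_map, max_range=1.0):
    """Return the (row, col) centroid of depth pixels closer than
    max_range (metres), or None if nothing is within range."""
    hits = [(r, c) for r, row in enumerate(depth_map)
            for c, d in enumerate(row) if d < max_range]
    if not hits:
        return None
    rows = sum(r for r, _ in hits) / len(hits)
    cols = sum(c for _, c in hits) / len(hits)
    return (rows, cols)


# A 3x3 depth map with a near object occupying the top-right corner:
depth_map = [[2.0, 0.5, 0.5],
             [2.0, 0.5, 0.5],
             [2.0, 2.0, 2.0]]
target = locate_object(depth_map)
```

Real 3D sensors add shape matching and pose estimation on top of this, which is what lets them cope with low contrast and overlapping parts.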
Advances in machine vision are creating a revolution in manufacturing. Factories are no longer being built around systematic, repeatable functions. Today's manufacturing is centred around flexibility, adaptability, and high efficiency.
A recent case study by the School of Mechanical and Aerospace Engineering in Singapore has looked at how machine vision, coupled with adaptive reasoning, can improve industrial robots in dynamic environments.
The experimental results demonstrate that with innovative integration of emerging sensing techniques, along with the correct algorithms and designs, effective human-robot and robot-robot interactions can be realised.
Machine vision is no longer purely reactive; it is becoming a data collection tool that allows a deeper understanding of complex processes and collaborations across the whole supply chain in real time.
It's not just the manufacturing industry that is benefiting from machine vision-enabled smart robots: machine vision is becoming widely used within medical microscopy and life science applications, with many robotic systems using imaging technology during surgeries.
Remotely operated surgeon-controlled robots are able to perform extremely precise movements: movements that are impossible with the human hand.
The machine vision system provides a magnified, calibrated 3D view in high definition. The Bristol Robotics Lab has a number of studies underway in this field, such as SMARTsurg. The EU-funded project aims to enable complex minimally invasive surgical operations using a novel robotic platform to assist surgeons.
The Intelligence of Audio Recognition is Booming
The ability for robots to be truly interactive depends greatly on their ability to not only hear but understand us. In order for electromechanical robots to perform tasks, communication is fundamental.
To achieve this, detailed work is required in terms of deciphering content in relation to context; speech recognition alone is not enough. Highly sophisticated audio technology is needed to cope with noise, accents, and interpretation.
Today's researchers are working on complex grammar sentence instruction, speech and gesture recognition, and fusion of human-robot interaction within a natural language interface.
Researchers from the University of Hamburg have been working on a study to enhance robot speech recognition using biomimetic binaural sound source localisation. They designed a machine learning framework to process sound recorded by microphones in humanoid robotic heads. Speech recognition accuracy was shown to depend strongly on head position and on selecting the appropriate channel input.
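A drastically simplified stand-in for that channel selection is to compare the energy of the two "ear" channels and feed the louder one to the recogniser. The study itself uses a learned model over binaural cues, so the sketch below is illustrative only, with invented names and sample data.

```python
import math


def select_channel(left, right):
    """Pick the binaural channel with the higher RMS energy, a crude
    proxy for choosing the microphone nearer the speaker."""
    def rms(samples):
        return math.sqrt(sum(s * s for s in samples) / len(samples))
    return "left" if rms(left) >= rms(right) else "right"


# The speaker is to the robot's right, so the right channel is louder:
channel = select_channel([0.1, -0.1, 0.1], [0.5, -0.4, 0.5])
```

In the biomimetic setting, the interaural level and time differences that this heuristic crudely exploits are exactly the cues the learned localisation model extracts.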
The accuracy of audio and facial recognition technology is enabling robots to truly engage in conversations. It is also vital that voice recognition is coupled with the ability to recognise body language. As a result, robots are able to take over interactions once handled by humans, primarily within the customer service, security, and surveillance industries.
Thanks to astute audio recognition, a number of new robots are due to be released this year:
- The Smart Service Robot: New Era AI Robotic is due to unveil its new service robot. The customer-service-focused android robot can welcome customers and provide them with information. Its ability to interact through voice dialogues will make it ideal for high-traffic service areas such as hospitals and airports.
- The Home Security Robot: Robotelf Technologies is introducing its latest home security robot. The new Robelf will include surveillance features and voice, vision, and position sensors. This, coupled with its facial recognition technology, will allow it to detect strangers and provide real-time alerts to the homeowner.
- The Caregiving Robot: Kebbi, designed by Nuwa Robotics, uses astute voice recognition to play and chat with children. It can also be programmed to help children study and learn languages.
Tactile Sensors are Putting Robots in Touch with Their Surroundings
Our sense of touch is an invaluable sensory input, and replicating it is critical to enabling robots to manipulate objects. Force sensors have become so sophisticated that they allow robots to actually feel what they are touching.
What this means for the field of robotics is that robots are no longer dependent on everything being in a predictable location. Force sensors, coupled with vision sensors, allow them to adapt to their environment.
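The force-guided behaviour described here can be caricatured in a few lines: close a gripper through successive force readings and stop at the first reading that signals firm contact. The function, readings, and threshold are invented for illustration and do not correspond to any particular gripper API.

```python
def close_until_contact(force_readings, contact_force=2.0):
    """Step through force-sensor readings (newtons) taken as the gripper
    closes; return the step index where firm contact is detected, or
    None if the gripper never makes contact."""
    for step, force in enumerate(force_readings):
        if force >= contact_force:
            return step  # stop closing here
    return None


# Force ramps up as the fingers meet the object; contact at step 3:
stop_at = close_until_contact([0.0, 0.1, 0.5, 2.4, 6.0])
```

This is the essence of why force sensing removes the need for precise object placement: the robot closes until it feels the object rather than moving to a pre-taught position.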
The applications are truly astronomical: developments in six-axis force sensors by ATI Industrial Automation are supporting robotic explorers on Mars. The sensors use silicon strain gauges to provide a very high signal output even under the significant loads that have caused previous sensors to fail.
The Smart Sensor-Enabled Smart Robots of the Future
Sensors aren't about to stop at only three of our human senses: innovative developments to enable robots to smell and taste are emerging. The Origami Hierarchical Sensor Array (OHSA) from researchers at the Technion-Israel Institute of Technology is able to sense and detect physical and chemical stimuli. These advancements could drive significant robotic developments in the fields of medical diagnosis and biological hazard sensing.
Thanks to the sophistication of smart sensors, smart robots are no longer confined to traditional sectors. They are able to break free of their previous confines and exist alongside us in almost every industry, as well as in our homes. In the future, an increasing number of sensors will be built into the extremities of automated systems, allowing machines to seamlessly interact with real-world environments, and with us.