AI, Sensors and Robotics
Making sense from perception to machine and action
This article is a brief look at the relationship between artificial intelligence (AI), sensors and robotics. It is not meant to be comprehensive, rather it explores rudimentary concepts.
Many human-like operations may require some degree of artificial intelligence or machine learning to work as needed. Yet many robots are programmed only for a set task, with a few different eventualities accounted for.
Yet giving a machine an understanding of its spatial environment, whether for perception, speech recognition or other forms of learning, happens through sensory inputs.
I have previously covered cameras, yet these inputs can also be microphones, wireless signals and much more.
The machine has to recognise a set of human behavioural indications, or take information from the environment it is situated in, to determine actions. Is it hot or cold? Is that a human or a machine, a car or a person? Identifying objects is one aspect; another is understanding which set of events should trigger a given action.
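As a toy illustration (the labels, thresholds and action names here are my own, not from any particular robot), mapping sensor readings onto discrete actions can start as a simple rule set, before any machine learning is involved:

```python
def choose_action(temperature_c, detected_object):
    """Toy rule set mapping sensor inputs to an action.

    temperature_c: a reading from a temperature sensor
    detected_object: a label from an object detector, e.g. 'human' or 'car'
    """
    if detected_object == "human" and temperature_c > 30.0:
        return "offer_cooling"
    if detected_object == "car":
        return "yield_path"
    if temperature_c < 5.0:
        return "enable_heating"
    return "idle"

print(choose_action(32.0, "human"))  # → offer_cooling
```

In practice the object labels would come from a trained perception model rather than being handed in directly, but the structure is the same: sensory inputs in, an action out.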
The three, if divided, may serve different purposes: AI can be programmed, while robotics involves the wide range of aspects relating to the physical.
You can see the almost uncanny movements of the robot hand trained by the team at OpenAI.
OpenAI trained a neural network to solve the Rubik’s Cube in simulation through the use of reinforcement learning. Domain randomization enables networks trained solely in simulation to transfer to a real robot. They call this Automatic Domain Randomization (ADR): it endlessly generates progressively more difficult environments in simulation.
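A minimal sketch of the ADR idea, not OpenAI’s actual implementation: whenever the policy succeeds often enough at the current level of randomization, the range of simulated environment parameters is widened, making training progressively harder. The `train_and_evaluate` callback here is a hypothetical stand-in for a full training run.

```python
import random

def automatic_domain_randomization(train_and_evaluate, threshold=0.8,
                                   widen_by=0.1, rounds=5):
    """Toy ADR loop: widen the simulation randomization range
    each time the policy clears a success threshold."""
    low, high = 1.0, 1.0  # start with no randomization
    history = []
    for _ in range(rounds):
        # sample an environment parameter (e.g. cube friction) in range
        param = random.uniform(low, high)
        success_rate = train_and_evaluate(param)
        history.append((low, high, success_rate))
        if success_rate >= threshold:
            # performance is good: make the simulation harder / more varied
            low -= widen_by
            high += widen_by
    return history
```

With a stub evaluator that always succeeds, the range widens every round; with one that always fails, the range stays fixed. The real system tracks many parameters and widens each boundary independently, but the feedback loop is the same.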
They focus on challenging problems for ‘machines to master’.
- Dexterous manipulation.
In addition to this they mention the concept of meta-learning.
“We believe that meta-learning, or learning to learn, is an important prerequisite for building general-purpose systems, since it enables them to quickly adapt to changing conditions in their environments.”
Robotics and AI serve different purposes.
However, these fields move closer in contexts like this, where complex manoeuvring is necessary.
Robotics involves building robots, whereas AI deals with programming intelligence or making computers behave like humans.
“Robotics is an interdisciplinary branch of engineering and research area for information engineering, computer engineering, computer science, mechanical engineering, electronic engineering and others. Robotics involves design, construction, operation, and use of robots.”
To some extent a goal is to develop machines that can substitute for humans and replicate human actions.
However robots need motors and sensors in order to activate and perform basic operations.
Sensors have become incredibly cheap and far easier to integrate into different systems whether on an industrial scale or private.
As such, the increasing availability of cheap sensors, combined with possibilities for advanced software, has been giving interesting results. Another example is Boston Dynamics; their highlight reel from 2012–2019 shows the astonishing progress being made within this area.
Their Spot model was launched recently and has been popular.
It may seem like this form of sensory intelligence is enough, yet it has been said we need far more for these types of robots to operate in our homes.
Still we let vacuum cleaners into our homes.
There might be a possibility for other appliances as well, although it may be in the early stages.
Apparently the need for health robotics surged during the Coronavirus lockdown:
Demand for health gadgets soars amid lockdowns | The Japan Times
In addition to this, robots are increasingly used in telemedicine.
What America can learn from China’s use of robots and telemedicine to combat the coronavirus
To some extent, robots and drones are also used in this capacity to help in the strange battle against the spread of the Coronavirus.
Robots And Drones Are Now Used To Fight COVID-19
A blog called TerraHawk described it with an illustration that I found helpful, given what has been discussed above.
“Sensemaking or sense-making is the process by which people give meaning to their collective experiences. It has been defined as ‘the ongoing retrospective development of plausible images that rationalize what people are doing.’”
In this way machines are programmed to attempt making sense of the environment to take appropriate actions.
This is #500daysofAI and you are reading article 310. I am writing one new article about or related to artificial intelligence every day for 500 days. My focus for days 300–400 is AI, hardware and the climate crisis.
AI, Sensors and Robotics was originally published in Towards Data Science on Medium, where people are continuing the conversation by highlighting and responding to this story.