Scientists teaching AI to become better at second-guessing

Melbourne: Scientists are teaching artificial intelligence (AI) systems the subtleties of human behaviour so that they may be better placed to predict our intentions.
One of the holy grails in the development of AI is giving machines the ability to predict intent when interacting with humans, said researchers at the University of New South Wales (UNSW) in Australia.
Currently, AI may do a plausible job of detecting the intent of another person. It may even hold a list of predefined responses that a human is likely to give in a given situation, they said.
However, when an AI system or machine has only a few clues or partial observations to go on, its responses can sometimes be a little robotic, the researchers noted.
“What we’re doing in these early phases is to help machines learn to act like humans based on our daily interactions and the actions that are influenced by our own judgment and expectations — so that they can be better placed to predict our intentions,” said Lina Yao, a senior lecturer at UNSW.
“In turn, this may even lead to new actions and decisions of our own, so that we establish a cooperative relationship,” Yao said.
The researchers want to see awareness of less obvious examples of human behaviour integrated into AI systems to improve intent prediction.
However, doing so is a tall order, as humans themselves are not infallible when trying to predict the intention of another person, the researchers said.
“Sometimes people may take some actions that deviate from their own regular habits, which may have been triggered by the external environment or the influence of another person’s actions,” she said.
Yao and her team are developing a prototype human-machine interface system designed to capture the intent behind human movement.
“We can learn and predict what a human would like to do when they’re wearing an EEG (electroencephalogram) device,” Yao said.
“While wearing one of these devices, whenever the person makes a movement, their brainwaves are collected which we can then analyse,” she said.
Yao said recording this data has the potential to help people unable to move or communicate freely due to disability or illness.
Brainwaves recorded with an EEG device could be analysed and used to move machinery such as a wheelchair, or even to communicate a request for assistance.
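The pipeline described here — record brainwaves, analyse them, map them to an intent — can be sketched in miniature. Everything below is illustrative only: the sampling rate, the frequency bands, the intent labels, and the nearest-centroid classifier are assumptions made for the sketch, not the UNSW team's actual system, and the "EEG" signals are synthetic.

```python
import numpy as np

FS = 250  # assumed EEG sampling rate in Hz (hypothetical)

def band_power(window, fs, lo, hi):
    """Mean spectral power of one EEG window in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def features(window, fs=FS):
    # Two classic EEG bands often linked to movement: mu (8-12 Hz)
    # and beta (13-30 Hz). Band choice is an assumption for this sketch.
    return np.array([band_power(window, fs, 8, 12),
                     band_power(window, fs, 13, 30)])

class NearestCentroid:
    """Tiny stand-in classifier: one feature centroid per intent label."""
    def fit(self, X, y):
        y = np.array(y)
        self.labels = sorted(set(y))
        self.centroids = {c: X[y == c].mean(axis=0) for c in self.labels}
        return self

    def predict(self, x):
        return min(self.labels,
                   key=lambda c: np.linalg.norm(x - self.centroids[c]))

# Synthetic training data: 'rest' windows are low-amplitude noise,
# 'move' windows carry an added 10 Hz (mu-band) oscillation.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS

def make(label):
    noise = rng.normal(0.0, 1.0, FS)
    return noise + (3.0 * np.sin(2 * np.pi * 10 * t) if label == "move" else 0.0)

labels = ["rest"] * 20 + ["move"] * 20
X = np.array([features(make(l)) for l in labels])
clf = NearestCentroid().fit(X, labels)

# A fresh 'move' window is classified by its mu-band power.
print(clf.predict(features(make("move"))))  # prints: move
```

In a real system the features would come from a trained model over many channels and subjects; the point of the sketch is only the shape of the loop — window the signal, extract features, map to a discrete intent such as "move the wheelchair".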
“Someone in an intensive care unit may not have the ability to communicate, but if they were wearing an EEG device, the pattern in their brainwaves could be interpreted to say they were in pain or wanted to sit up, for example,” Yao explained.
“So an intent to move or act that was not physically possible, or not able to be expressed, could be understood by an observer thanks to this human-machine interaction.
“The technology is already there to achieve this, it’s more a matter of putting all the working parts together,” she said.
Yao noted the ultimate goal in developing AI systems and machines that assist humans is for them to be seen not merely as tools, but as partners.
“What we are doing is trying to develop some good algorithms that can be deployed in situations that require decision making,” she added. (AGENCIES)