My robotic co-worker

How UQ researchers are helping the Australian Defence Force design autonomous machines that will one day work alongside us.

Self-flying drones, next-generation artificial intelligence and robots that can communicate using sign language.

These are just some of the technologies UQ researchers are working on in collaboration with Trusted Autonomous Systems (TAS), a defence cooperative research centre funded by the Australian Government's Next Generation Technologies Fund and by the Queensland Government.

Since 2017, TAS has been linking Australian researchers and defence industry partners to assist the development of technology that allows humans and advanced machines to work together more effectively to defend our nation.

Here, three UQ Advance Queensland Research Fellows share what they’re working on in collaboration with TAS, and in doing so provide insight into some of the weird and wonderful technologies of our future.

You might think self-flying drones that can flawlessly deliver shopping straight to our doors are ready to go. But drones won't be foolproof any time soon unless we can get past a few big barriers, says UQ’s Associate Professor Pauline Pounds.

The first problem is the weight of the sensors drones need to "see" the world and navigate around obstacles.

“Today, unmanned drones typically navigate with cameras or a laser-based system called Lidar,” Dr Pounds said.

“They're great, but Lidar is very heavy and both systems require a lot of computer processing power, which dramatically slows reaction times.”

In her previous work, Dr Pounds got around this problem by attaching thin filament whiskers to her drones, allowing them to sense their environment and self-navigate by touch.

“The whiskers were ultra-light and crazy sensitive,” she said.

“They can measure forces equivalent to the weight of a flea. That’s right at the threshold of human perception.”

But we don’t always want a drone to have to run into an object to know it is there. Thankfully, Dr Pounds found a solution for that too.


“We can measure forces so low, the wave of air created by moving your hand towards the drone should deflect the sensors just enough for us to measure it. This means we could use it as a pre-contact sensor.”
Dr Pauline Pounds

Her new work with TAS will look at whether she and her team can develop the same kind of air sensing for navigation, but using the drone’s own rotors as sensors to detect air movement as it flies.

Her work with TAS will also aim to make autonomous navigation faster, as even microseconds can make the difference between a drone clipping an obstacle and cleanly rounding a tight corner.

“I want to try building intelligent systems that aren’t big brains thinking deep thoughts about how to respond to their environment,” Dr Pounds said.

“That takes time and energy, and brains are fragile.

“If we make a drone’s response to the environment more like a reaction, like when you tap your knee and get an instantaneous jerk response, it would be fast and cheap in terms of processing.

“This should let us take a drone, throw it into a room and have it bounce off the walls like a ping pong ball, without ever actually touching.”
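
As a rough illustration of that reflex idea, the deflection measured by a whisker or airflow sensor could be wired almost directly to a steering command, with no planning step in between. The sketch below is a toy example: the sensor reading, threshold and gain are invented placeholders rather than values from Dr Pounds's work.

```python
# A minimal sketch of a reflex-style avoidance loop: the sensed deflection is
# mapped straight to a steering command, with no world model or planning.
# The threshold and gain are hypothetical placeholders.

def reflex_response(deflection, threshold=0.02, gain=5.0):
    """Return a steering command from a signed airflow/whisker deflection.

    A positive deflection means displaced air is arriving from the right.
    """
    if abs(deflection) < threshold:
        return 0.0               # nothing sensed: keep flying straight
    return -gain * deflection    # push away from the disturbance, knee-jerk style

# Example: a small wave of air from a wall approaching on the right
print(reflex_response(0.1))      # -> -0.5, i.e. steer left immediately
```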

Working with TAS, Dr Pounds has been considering how this technology could, for example, help Australia’s submarine systems measure surrounding water flow to detect objects nearby and navigate autonomously.

“I can see this being very useful in a defence context,” she said.


Dr Pauline Pounds holding a drone

As humans, the way we feel lets us know many things about the world. We know when a piece of cake tastes delicious, for example. Or when a piece of art is stunning.

While artificial intelligence (AI) can do a lot, the social tasks we do easily but cannot always clearly define or quantify still present significant difficulties, Dr Andrew Back says.

“The challenge is to discover ways for AI to encapsulate meaning, nuance, implication, intent, interpretation, feeling and understanding, allowing it to solve these problems,” Dr Back said.

Dr Back has been working to tackle this by developing a new kind of AI that learns the 'language' of the natural world, much as humans use language to make sense of theirs.

This AI works by looking at the probability of small, distinct events – that is, how often each one occurs.

Dr Back calls this system 'synthetic language AI' because these small probabilistic events form the letters and words in a new type of language, but one that is not necessarily human language as we know it.

His work extends the concept of entropy, which helps us make sense of a world that is random at the quantum scale and yet meaningful at a human scale.

This method is allowing Dr Back and his team to do things far outside what is possible with traditional AI.

“In some of our work, we looked at the different pause lengths buried in speech and treated this as a very simple synthetic language with only 10 words," Dr Back said.

“In another case, we formed a simple synthetic language based on very small changes in frequencies.”
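
As a toy illustration of that pause-length experiment, continuous measurements can be binned into a small alphabet of symbols, after which the AI works only with how often each symbol occurs. The pause values, the 10 bins and the entropy calculation below are illustrative assumptions, not Dr Back's actual pipeline.

```python
# A toy version of a synthetic language: pause lengths (in seconds) are
# quantised into a 10-symbol alphabet, and the system then reasons only
# about the probabilities of those symbols. All values are invented.
from collections import Counter
import math

pauses = [0.12, 0.31, 0.05, 0.44, 0.12, 0.90, 0.31, 0.12, 0.05, 0.61]

def to_symbol(pause, num_symbols=10, max_pause=1.0):
    """Map a pause length to one of `num_symbols` synthetic 'words'."""
    bin_index = min(int(pause / max_pause * num_symbols), num_symbols - 1)
    return f"w{bin_index}"

symbols = [to_symbol(p) for p in pauses]
counts = Counter(symbols)
probabilities = {sym: n / len(symbols) for sym, n in counts.items()}

# Shannon entropy of the symbol stream - the concept Dr Back's work builds on
entropy = -sum(p * math.log2(p) for p in probabilities.values())

print(symbols)                  # e.g. ['w1', 'w3', 'w0', ...]
print(probabilities)            # how often each synthetic 'word' occurs
print(round(entropy, 3))        # how unpredictable the stream is
```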

This new type of AI has been applied to a variety of problems with promising results, including determining a speaker's nationality, recognising particular authors and detecting dementia from speech signals.


“Using these types of approaches, we found that it may be possible for our AI approach to overcome some of the challenges in understanding the meaning of what is behind how people speak.”
Dr Andrew Back

“A key advantage of this approach is that it seems to be possible to develop it without requiring the usual large volumes of data or prior training.”

Dr Back quickly put his AI to work decoding meaning from all kinds of data, including the movement of meerkats, captured by motion sensors.

A graph of meerkat movements over time, tracked using sensors attached to the animals. Image: Dr Andrew Back

As with other data, Dr Back and his team looked at the probability of the different types of meerkat movement to try to understand their meaning. Then they took it one step further.

By first converting the meerkat movements into a synthetic language, the team could then translate that language into an English equivalent, giving deeper insight into what the movements might mean.

“We wanted to make sense of these meerkat movements in a way we can understand in English," Dr Back said.

"You get the sense of what the meerkat is possibly expressing."

A transcript of the translated meerkat movements, intertranslated using Dr Back's synthetic language AI system – for example, "Look. Me hey thing. Jo big look". Image: Dr Andrew Back

Dr Back thinks it may be possible to use synthetic language to pull meaning out of data found in the real world, such as geopolitical events, which has implications for national security.

He is currently developing the mathematical theory and algorithms to do this.

“We need AI technologies that can quickly solve complex social problems that humans are good at," Dr Back said.

"We hope to provide the groundwork to meet this challenge.”


Hands gesturing. Image credit: Digpro

It may not be long until we have droids in our homes making us our morning granola and robotic co-workers helping us put together a new pitch.

If we want to communicate with machines properly, however, Dr Jessica Korte says we need them to understand more than just our words.

“Spoken communication is multimodal,” Dr Korte said.

“People gesture, they have facial expressions and body language. This is why Zoom calls are so hard; we are missing a lot of information.”

In collaboration with TAS, Dr Korte is developing a machine that can understand gestures and facial expressions, so we can have naturalistic, realistic human-machine interactions in the future.

Dr Korte believes the best way to do this is to train a machine to understand Auslan, otherwise known as Australian Sign Language.


“Every problem we solve getting a machine to recognise, produce and process Auslan helps us understand how humans and robots can communicate with gesture more generally.”
Dr Jessica Korte

Her machine will use existing computer vision technology to see Auslan signs.

“It will capture five channels of information in Auslan: hand shape, orientation, spatial location, movement, but also non-hand expressions and body language,” Dr Korte said.

Using the same technology as Dr Back, Dr Korte will analyse the five Auslan channels the machine sees, and compare them to understand the underlying meaning.

Once the machine has understood what is signed, it will sign back a reply.

“For this, we might use animation, video segments, or generative adversarial networks to create new videos on the fly,” she said.

This flexible AI will help Dr Korte’s system decode signs in situations of stress and missing information – something very important in a defence context.

“If four of the five channels suggest one sign, for example, the AI can make a best guess,” Dr Korte said.
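
One way to picture that best guess is as a simple vote across the five channels, as in the sketch below. The channel names follow the article; the candidate sign, the missing channel and the voting rule itself are illustrative assumptions rather than Dr Korte's actual method.

```python
# A minimal sketch of the best-guess idea: each Auslan channel proposes a
# candidate sign, and the system picks the sign most channels agree on,
# even when one channel is missing or disagrees.
from collections import Counter

def best_guess(channel_votes):
    """Return the sign suggested by the most channels, ignoring missing ones."""
    votes = [sign for sign in channel_votes.values() if sign is not None]
    if not votes:
        return None              # nothing recognisable on any channel
    return Counter(votes).most_common(1)[0][0]

observation = {
    "hand shape": "HELLO",
    "orientation": "HELLO",
    "spatial location": "HELLO",
    "movement": "HELLO",
    "non-hand expression": None,  # occluded or ambiguous channel
}
print(best_guess(observation))    # -> "HELLO": four of the five channels agree
```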

This flexibility will also be useful in dealing with the fact that different people vary slightly in how they create signs.

“This could also have security advantages. If we can tell the difference between individuals’ signs, we can ensure a person is authorised to sign commands to the robot,” Dr Korte said.

Outside defence, Dr Korte hopes to give something back to the community of signers who are helping to develop her system.

“The technologies we develop will spin off into industry translation that I hope can be useful to the Deaf community. That’s what I am really excited about.”


Hands performing sign language