Insights: Blog
Teaching computers to be humble
by Insights Team

Nick Jennings, Vice Provost (Research & Enterprise) at Imperial College, has been an AI researcher for more than 30 years and has seen the peaks and troughs in the field. He has lived through years when there was a huge amount of funding and interest, and others when researchers were calling AI a “failed technology” that was never going to work.

The field has moved on some way from the latter opinion. Now the challenge has shifted away from getting computers to learn things, to getting them to recognise what they don’t know.

“A huge challenge in the field is getting computers to know when they are at the extremes of what they know about and actually being able to give the answer: ‘I’m sorry, I don’t know enough about that’,” he says.

“In general, people are quite good at it, but machines are incredibly poor.”
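
To make that concrete, here is a minimal sketch of a system that abstains rather than guess, written in Python with an invented confidence threshold and made-up model scores (nothing here comes from Nick’s own systems):

from typing import Optional

def predict_or_abstain(scores: dict, threshold: float = 0.8) -> Optional[str]:
    # Return the most likely label only if the model is confident enough;
    # otherwise return None, the system's way of saying "I don't know".
    label, confidence = max(scores.items(), key=lambda item: item[1])
    return label if confidence >= threshold else None

# Hypothetical model outputs: one well inside familiar territory, one at the edge of it.
confident = {"cat": 0.95, "dog": 0.04, "fox": 0.01}
uncertain = {"cat": 0.40, "dog": 0.35, "fox": 0.25}

print(predict_or_abstain(confident))   # -> cat
print(predict_or_abstain(uncertain))   # -> None, i.e. "I'm sorry, I don't know enough about that"

Picking that threshold, and making the underlying confidence scores trustworthy at the edges of what the model has seen, is exactly where the research challenge lies.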

Another challenge Nick identifies is the move away from systems that are good at specific tasks towards a broader intelligence. “If a machine can play Go, it can play no other games. It has been built to solve that particular one, whereas humans are more general problem solvers,” he says.

In other words, we want computers to become jacks of many trades, while being self-aware enough to understand the limits of their own knowledge. The goal is to build intelligent systems in our own human likeness, with the sensitivity and general understanding that we have.

These two strands are critical in Nick’s current research into multi-agent systems. If successful, his work could revolutionise the way we respond to disasters. “I’m looking at how unmanned autonomous systems, drones, can help first responders get a better picture of what’s going on. To do this you have a team of drones working together so they don’t all go to the same place, but if there are areas that are particularly important, you need more than one,” Nick says.

“That dynamic coordination is an important and challenging problem.”

For this to be successful, these systems need a broader understanding of problem solving than simply knowing where to fly. For example, they may need to understand when and where to drop supplies. They also need to know when they cannot complete a task by themselves, and to ask for help from either another drone or a human.

The system falls apart if each drone in the team is programmed to tackle its job on its own. Each drone would then try to complete the entire task by itself, with a poor understanding of the whole situation. It’s here we can see the real-world impact of using computers that know their limits.
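
As an illustration only, the coordination Nick describes can be sketched as a simple allocation loop: drones are matched to survey areas, important areas ask for more than one drone, and anything the team cannot cover becomes an explicit request for help. The scene, priorities and greedy strategy below are hypothetical, not his actual algorithms.

from itertools import product

# Hypothetical disaster scene: each area names how many drones it needs,
# with the important flooded school asking for more than one.
areas_needed = {"flooded_school": 2, "bridge": 1, "warehouse": 1}
area_position = {"flooded_school": (1, 1), "bridge": (6, 4), "warehouse": (8, 8)}
drone_position = {"d1": (0, 0), "d2": (5, 5), "d3": (9, 1)}

def distance(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

# Greedily pair the nearest free drone with an area that still needs coverage,
# so the team spreads out instead of all flying to the same place.
needed = dict(areas_needed)
free = dict(drone_position)
assignments = []
while free and any(needed.values()):
    drone, area = min(
        ((d, a) for d, a in product(free, needed) if needed[a] > 0),
        key=lambda pair: distance(free[pair[0]], area_position[pair[1]]),
    )
    assignments.append((drone, area))
    needed[area] -= 1
    del free[drone]

print("assignments:", assignments)
shortfall = {a: n for a, n in needed.items() if n > 0}
if shortfall:
    # The team knows its limits: escalate to another drone or a human operator.
    print("asking for help with:", shortfall)

A real multi-agent system would coordinate far more cleverly than this greedy loop, but the behaviour that matters is the same: the team works out what it cannot cover and says so, rather than pretending it can do everything.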

“One of the most interesting challenges is to get humans and computers to interact in a more seamless way. Many of us are familiar with interacting with computer systems and it’s often not very easy,” says Nick.

“The next generation of systems will have a much smoother style of interaction, where the software will be seen as a partner, not a tool, and suggest things to us.”

This collaborative approach is the key to bringing AI to everyday life. We can’t expect computers to be omniscient when humans aren’t, so striving for humble AI is more effective and more achievable.

 

Hear more from Professor Nick Jennings in our Insider's Guide to Artificial Intelligence video interview.
