Are connected cars good listeners? Audio Analytic aims to make them so

Published on November 6th, 2018

Connected cars may be smart but they don’t always have great context. They hear things, but they don’t listen. That’s also a common human failing, says freelance technology writer Nick Booth, but it’s the default setting for a so-called ‘smart car’.

A smart car knows that a tomato is a fruit, but it wouldn't know enough to exclude it from a fruit salad. That's a contextual error which comes from lack of experience. There are some decisions that the connected car will get wrong, despite having all the data to hand.

Such as: should we push through the traffic lights to get out of the way when we hear a siren approaching? Your heart says yes, but your head says no. On the one hand you want to help the emergency services. But set against that is your suspicion that a CCTV camera will capture this and the authorities will make sure your good deed doesn't go unpunished, issuing you a fine.

Intelligent audio

Soon the smart car may be able to give us more context on decisions, thanks to the pioneering work of Audio Analytic (AA). This Cambridge-based Artificial Intelligence (AI) developer is striving to create human levels of interpretation of noises. It began by creating apps for Centrica's Hive services, so that smart homes could detect audio anomalies such as the sound of a window breaking or a dog barking. A smart home could process these inputs and conclude that a break-in is taking place.

Chris Mitchell, CEO & founder of Audio Analytic

Audio Analytic’s initial aim is to create good listeners out of machines in four main areas of public life: health and well-being, entertainment, communication and safety.

Now AA is striving to help human drivers make better decisions with the aid of really useful intelligence. For example, it may tell you whether that siren is from an ambulance or a police car. You can make your own mind up which one of them you may risk a fine for. The audio analysis might tell you how far away the emergency vehicle is, so you can decide at the very last minute whether to get out of the way – by which time the lights might have changed.

The majority of the time, the AA apps will keep you safe by dealing with the daily distractions that put the driver at risk. The noise of a baby crying, or kids arguing or the dog going nuts in the back of the car will all slow your wits and extend your reaction time. Your smart car can then automate an appropriate response.

The barking dog could be automatically pacified by dispensing a dog chew – although it's dangerous rewarding that sort of behaviour. At the sound of a baby crying the car could be automatically slowed and the entertainment system turned down. If the kids start arguing in the back perhaps they should be punished by some sort of automatic signal jammer that renders their handsets and tablets inert. That might shock them into silence.
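The pattern described above – a recognised sound triggering an appropriate in-car response – can be sketched as a simple lookup from sound labels to actions. This is purely illustrative: the labels, action names and `respond_to_sound` function are hypothetical and do not represent Audio Analytic's actual API.

```python
# Minimal sketch: mapping detected sound labels to in-car responses.
# All labels and action names are illustrative assumptions, not a real API.

RESPONSES = {
    "baby_crying": ["reduce_speed", "lower_entertainment_volume"],
    "dog_barking": ["dispense_dog_chew"],
    "children_arguing": ["notify_driver"],
    "siren": ["alert_driver", "suggest_pull_over"],
}

def respond_to_sound(label):
    """Return the list of actions for a detected sound label.

    Unknown labels map to no action, so the car does nothing
    rather than reacting to a sound it cannot contextualise.
    """
    return RESPONSES.get(label, [])

print(respond_to_sound("baby_crying"))
# ['reduce_speed', 'lower_entertainment_volume']
```

In a real system the label would come from an on-device sound-recognition model rather than being passed in directly, but the lookup step – context first, action second – is the part the article describes.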

Nick Booth

The world around sensors

Audio Analytic currently has seven sounds that its AI apps can contextualise (including smoke alarms and human speech) but is in the process of developing ten more. “Humans have a very good sense of the world around them and we are trying to give machines the same perception,” says Audio Analytic founder and CEO, Chris Mitchell.

AA is helping machines not just to monitor but to understand the environment. If only it could create an AI system for helping motorists to appeal against unjust fines.

This blog is by freelance technology writer, Nick Booth.

Comment on this article below or via Twitter: @IoTNow_ or @jcIoTnow