The connected car in the embedded world

Published on March 6th, 2019

What’s the hardest job in car design? Here’s a clue, says Nick Booth: it’s one of four major disciplines. The curvature? The car shape? The wedges? No, it’s none of those. The toughest job for car designers, experts say, is packaging.

Fitting all the components into one powerful metal box is the trickiest part of conceiving a new car, even for electric vehicles, which have a fraction of the moving parts of a combustion-engined car.

There are parallels with the embedding process in IT, in which all the moving parts, such as the data, the software and the processors, are stripped to the bare metal and shrink-packed as economically as possible into silicon embedded in the hardware and mechanical parts.

So it’s fitting that the Embedded World event in Nuremberg is becoming the forum for ‘bare metalling’ the connected car.

Long journey

Embedded technology’s compactness also lowers power consumption, size and manufacturing cost for every unit, while hardening each one. Embedded technology is necessary for the long journey towards complete automation because it’s a good idea to travel light. The most famous embedded system was the Apollo Guidance Computer, developed by Charles Stark Draper’s team at the MIT Instrumentation Laboratory. It was one small footprint for a computer, but a giant leap for transport technology.

Engineering simulation company ANSYS, one of the exhibitors at the show, is working on a number of ‘packaging’ projects. One of them helps manufacturers develop high-fidelity sensor models to simulate an entire autonomous vehicle before driving prototypes hit the road.

Sandeep Sovani, ANSYS’s global director for the automotive industry, says embedded technology will be crucial in key aspects of the connected car, such as the advanced driver assistance system (ADAS) and the human-machine interface (HMI).


For example, there are complexities caused by the fact that humans see things differently from the cameras in an ADAS system. ANSYS can simulate the human’s perspective. It doesn’t develop the embedded software, but helps car makers embed it more efficiently into their connected cars.

Its SCADE Suite creates a design environment that packages system and software engineering, HMI design, multiphysics simulation, application testing and lifecycle management into one tight unit. That means it can pack in the entire suite of software functions, from engine control systems to emergency braking systems.

Code is generated from SCADE and, in the case of autonomous vehicles, placed into the vehicle to control the system. Every system, from suspension to satnav, can then be simulated and tested in a variety of imaginary environments.
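That generate-then-simulate loop can be sketched in miniature. The hypothetical emergency-braking controller below stands in for auto-generated embedded code, and the list of scenarios stands in for the "imaginary environments"; every name and threshold here is an illustrative assumption, not an ANSYS or SCADE API.

```python
# Illustrative sketch: exercising a controller across simulated scenarios.
# The controller stands in for code generated from a tool like SCADE;
# the scenarios stand in for simulated driving environments.

def emergency_brake(distance_m: float, speed_mps: float) -> bool:
    """Hypothetical controller: brake if stopping distance exceeds the gap.

    Assumes a fixed deceleration of 8 m/s^2 (an illustrative figure).
    """
    stopping_distance = speed_mps ** 2 / (2 * 8.0)
    return stopping_distance >= distance_m

# Imaginary test environments for the simulated vehicle.
scenarios = [
    {"name": "clear motorway",  "distance_m": 120.0, "speed_mps": 30.0},
    {"name": "sudden obstacle", "distance_m": 20.0,  "speed_mps": 25.0},
    {"name": "slow traffic",    "distance_m": 15.0,  "speed_mps": 5.0},
]

for s in scenarios:
    decision = emergency_brake(s["distance_m"], s["speed_mps"])
    print(f'{s["name"]}: brake={decision}')
```

In a real toolchain the controller would be generated C running on target hardware and the scenarios would be full multiphysics simulations; the shape of the loop, though, is the same.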

Embedded intelligence

The HMI, on the other hand, is already a form of embedded intelligence. It’s essentially software embedded in a connected car that exchanges information with humans, monitoring and alerting the occupants, especially the driver.
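A minimal sketch of that monitor-and-alert role might look like the function below. The signal names, default values and thresholds are all invented for illustration; a production HMI would read these from the vehicle bus rather than a dictionary.

```python
# Minimal sketch of an HMI-style monitor: inspect vehicle signals and
# return the alerts that should be surfaced to the driver.
# Signal names and thresholds are illustrative assumptions.

def hmi_alerts(signals: dict) -> list:
    alerts = []
    if signals.get("fuel_level_pct", 100) < 10:
        alerts.append("Low fuel")
    if signals.get("tyre_pressure_bar", 2.4) < 1.8:
        alerts.append("Check tyre pressure")
    if signals.get("driver_eyes_closed_s", 0) > 2:
        alerts.append("Driver attention warning")
    return alerts

print(hmi_alerts({"fuel_level_pct": 8, "driver_eyes_closed_s": 3}))
# prints ['Low fuel', 'Driver attention warning']
```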

Many vehicles already have driver assistance, but the complexity multiplier will go into overdrive once connected cars aspire to autonomy. The software being developed for collision avoidance, lighting automation or blind-spot flagging will need so much intellectual capital, human resource and space that it will make MWC 2019 in Barcelona look like a village fete.

All that code has to talk to and share information across systems as diverse as radar, lidar, GPS, ultrasonic and camera sensors. Each has its own hideously complicated format and modus operandi. Editing them all down into a digestible, processable and embeddable form will be one of the most difficult packaging tasks in the history of car design.
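That "editing down" step is, at heart, a translation problem: each sensor reports in its own dialect, and the embedded planner needs one common record. The sketch below shows the idea with invented raw formats; real radar, lidar and ultrasonic interfaces are far richer than this.

```python
# Sketch of normalising heterogeneous sensor readings into one common
# (range, bearing) record an embedded planner could consume.
# The raw formats shown here are invented for illustration.

import math

def normalise(sensor: str, raw: dict) -> dict:
    """Map a sensor-specific reading to a common range/bearing record."""
    if sensor == "radar":
        # Radar (in this sketch) already reports range (m) and bearing (deg).
        return {"range_m": raw["range_m"], "bearing_deg": raw["bearing_deg"]}
    if sensor == "lidar":
        # Lidar gives a Cartesian point in the vehicle frame; convert to polar.
        x, y = raw["x_m"], raw["y_m"]
        return {"range_m": math.hypot(x, y),
                "bearing_deg": math.degrees(math.atan2(y, x))}
    if sensor == "ultrasonic":
        # Ultrasonic gives range only; bearing comes from the mounting angle.
        return {"range_m": raw["range_m"], "bearing_deg": raw["mount_deg"]}
    raise ValueError(f"unknown sensor: {sensor}")

readings = [
    ("radar",      {"range_m": 42.0, "bearing_deg": 5.0}),
    ("lidar",      {"x_m": 3.0, "y_m": 4.0}),
    ("ultrasonic", {"range_m": 0.6, "mount_deg": -90.0}),
]
for sensor, raw in readings:
    print(sensor, normalise(sensor, raw))
```

The hard part in practice is not the arithmetic but agreeing on the common record, synchronising timestamps and coordinate frames, and doing it all within an embedded power and silicon budget.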

The author is freelance technology writer Nick Booth.

Comment on this article below or via Twitter: @IoTNow_ or @jcIoTnow