The past has a habit of making promises that the future struggles to keep. By 2001, we were supposed to be encountering black monoliths around the orbit of Saturn, but in reality we’re only just getting close to returning to the moon after a 50-year absence. Where we were going, we weren’t supposed to need roads, but the only flying cars we have are the ones that took a wrong turn off a cliff because the driver was slavishly following the sat nav instead of looking out of the window. The 1980s, of course, gave us KITT, the four-wheeled, driverless, talking Knight Rider car, and with a rumoured film remake in the works, this is something that still fascinates us. Of course, while we can talk at our cars, they don’t yet talk back – but the driverless thing? Well, that is certainly on the cards.
Tesla’s Elon Musk has been selling the dream for several years now with some success, even getting customers to pay upfront for its “Autopilot and Full Self-Driving Capability”. However, while Autopilot is an industry-leading, Level 3-capable advanced driver assistance system (ADAS), it’s still a long way from offering “full self-driving”.
Another complication is that a handful of crashes, some fatal, have provided ammunition for those who wish to put the brakes on the rollout of autonomous cars. While these incidents are troubling, on a macro scale it’s clear that self-driving cars are needed to save lives. Humans, as a rule, are not great drivers: 1.35 million people are killed on the world’s roads each year. Aside from the tremendous human loss, dealing with the fallout costs $518 billion worldwide. It makes a great deal of sense to solve this problem, and at Imagination we’re working hard to develop much of the fundamental technology that will make this happen.
While self-driving cars are not yet here, many new cars on the road today feature ADAS capabilities which are very useful. There’s adaptive cruise control, which helps you follow the car ahead at a safe distance; automatic emergency braking, which reacts to an impending impact faster than a human can; blind-spot detection to help avoid things you can’t see; speed camera alerts in countries where they are legal; and lane departure warnings to stop you drifting. Level 2 cars can also advance and brake in slow-moving traffic, change lanes and take motorway exits, all without driver intervention.
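To give a feel for what sits behind a feature like adaptive cruise control, here is a deliberately simplified sketch of the core control loop: maintain a time gap to the car ahead, and command acceleration or braking proportional to the error. All gains, limits and the time-gap target here are illustrative assumptions, not values from any production system.

```python
# Toy proportional controller for adaptive cruise control (ACC).
# Gains, comfort limits and the 2-second time gap are made-up
# illustrative values, not taken from any real vehicle.

def acc_acceleration(own_speed: float, lead_distance: float,
                     lead_speed: float, time_gap: float = 2.0,
                     k_dist: float = 0.2, k_speed: float = 0.5) -> float:
    """Return a commanded acceleration in m/s^2, clamped to comfortable limits."""
    desired_gap = own_speed * time_gap          # safe following distance (m)
    gap_error = lead_distance - desired_gap     # positive => we are too far back
    speed_error = lead_speed - own_speed        # positive => lead is pulling away
    accel = k_dist * gap_error + k_speed * speed_error
    return max(-3.0, min(1.5, accel))           # limit braking and acceleration

# Example: travelling at 25 m/s, 40 m behind a lead car doing 23 m/s,
# the controller commands firm (clamped) braking to open the gap.
print(acc_acceleration(25.0, 40.0, 23.0))
```

Real ACC stacks layer radar/camera fusion, predictive models and far more careful control on top of this idea, but the gap-plus-relative-speed feedback structure is the essence.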
However, driver monitoring systems turn their attention from the road to the weak point inside the car – the driver. As David Harold, Imagination’s Chief Marketing Officer observes, “[with driver monitoring] the car isn’t looking at the outside world – it’s looking at you. How are you driving? Are you tired? Soon this will progress to detecting mood – do you look angry? Are you at risk of developing road rage?” Of course, how the car will deal with the latter is up for debate. Will it spray some cold water on you? Play some James Blunt? And would that make things worse?
But ADAS requires that the driver is always alert and observant, with their hands on the wheel, ready to take over. (I rode in a “self-driving” car from Lyft at CES in January 2020, and the driver was doing exactly this.)
However, every Tesla now leaves the factory with its latest chip, which Tesla says is capable, at least in terms of hardware, of full self-driving, where the driver does not have to be involved and can trust the car to drive for the majority of the journey. It was this tech that was highlighted in a teardown by Nikkei Business Publications, which quoted an engineer at a Japanese car company saying that it meant Tesla was six years ahead of the rest of the industry and, as such, they could not compete. Jamie Broome, Imagination’s director of automotive, does not quite agree. “What Tesla has done better than anyone else is focus,” he says. “They did this and delivered something quickly. Where they are ahead is with their willingness to take on risk – they were prepared to make the move to adoption.”
Andrew Grant, Imagination’s director of AI, agrees that the technology is in place to make self-driving cars happen on a wider scale – namely running the advanced neural networks that will power the self-driving algorithms. “We [Imagination] are uniquely positioned at the confluence of where this is going right now, with GPU and the grunt from multiple neural network accelerators (NNA).”
Imagination’s single NNA cores are capable of delivering up to 10 tera operations per second (TOPS), but, reflecting the lack of vision in some parts of the industry, Grant notes that when talking to some companies, they struggled to see why there was a need for this level of neural network performance.
“People couldn’t really see why they would need more than four TOPS, yet now we have the ability to do multiple clusters of 10 TOPS cores – so hundreds of TOPS.” Grant notes Audi has potentially already matched Tesla with the autonomous capabilities in the A8 but chose not to shout quite as loudly about it as the Californian company.
The key difference is of course that the A8 is a showcase car, whereas Tesla has put its self-driving chip in every car it ships. As Musk has said publicly, “If you buy a car and it doesn’t have the hardware for full self-driving, it’s like buying a horse”.
Grant agrees that this is something every car manufacturer should be doing, simply because the secret sauce for autonomous driving isn’t in the hardware – it’s in the software. While there is an initial cost, it will come down with economies of scale as production ramps up.
But the reality is that the hardware is just the way to access what’s of real value – data – specifically, the real-world data that will train the neural network models that will make self-driving cars possible. Grant gives the example of a car that believed it had encountered a “flying bike”. When the data was analysed, it turned out that the car had seen bikes hanging off a bike rack – but as it hadn’t encountered this before, it believed it was seeing a “flying bike”. Once identified, the example was added to the training model so every car in the fleet could understand what it was.
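The “flying bike” story describes a fleet-learning loop: cars flag observations the model can’t confidently explain, humans label them, and the corrected example feeds the next model release. A minimal sketch of that loop, where the class names, confidence threshold and data structures are all hypothetical stand-ins:

```python
# Toy sketch of a fleet-learning loop. The class names, threshold
# and in-memory "training set" are hypothetical illustrations only.

known_classes = {"car", "pedestrian", "cyclist"}
training_data: list[tuple[str, str]] = []  # (sensor frame id, label)

def flag_for_review(frame: str, predicted: str, confidence: float) -> bool:
    """A fleet car uploads frames the model cannot confidently explain."""
    return predicted not in known_classes or confidence < 0.5

def label_and_retrain(frame: str, human_label: str) -> None:
    """Back at base, a human labels the oddity and it joins the training set."""
    training_data.append((frame, human_label))
    known_classes.add(human_label)  # the next model release recognises it

# The car sees bikes on a rack and guesses "flying bike" with low confidence.
if flag_for_review("frame_0042", "flying bike", 0.31):
    label_and_retrain("frame_0042", "bike rack")

print("bike rack" in known_classes)  # the whole fleet now knows the class
```

In a real deployment the “retraining” step is a full model training run and over-the-air update rather than a set insertion, but the data flow – flag, label, retrain, redeploy fleet-wide – is the point Grant is making.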
Broome picks this up: “The real trick to autonomous driving is the collection of data. This is where you are going to win. The more cars you have on the market and the more data you collect about scenarios and situations, the better chance that you will win.” He also gives an example of how the algorithms learn from experience to improve the driving experience for everyone: Teslas in Autopilot mode used to unnerve passengers by repeatedly applying light braking while driving. The algorithms were detecting cars in other lanes drifting slightly and, like a nervous new driver, over-applying the brakes – a behaviour that improved as the system learned from the data.
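One common way to stop a system over-reacting to momentary detections like this is to smooth the raw signal over time before acting on it. Tesla’s actual fix isn’t public; as an illustrative sketch only, here is a simple exponential moving average with a threshold, using made-up parameters:

```python
# Toy illustration of suppressing one-frame false positives.
# A naive system brakes on every raw "car drifting into lane" signal;
# smoothing the signal first (exponential moving average + threshold)
# ignores momentary blips. Alpha and threshold are illustrative guesses.

def should_brake(drift_signals: list[float], alpha: float = 0.3,
                 threshold: float = 0.6) -> list[bool]:
    """Return a per-frame braking decision from noisy drift scores in [0, 1]."""
    smoothed, decisions = 0.0, []
    for s in drift_signals:
        smoothed = alpha * s + (1 - alpha) * smoothed  # EMA filter
        decisions.append(smoothed > threshold)
    return decisions

# A single-frame false positive: the filtered system never brakes,
# whereas thresholding the raw signal would have tapped the brakes.
print(should_brake([0.0, 1.0, 0.0, 0.0]))
```

A sustained drift signal still pushes the average over the threshold after a few frames, so genuine hazards are acted on – at the cost of a short reaction delay, which is exactly the trade-off such filtering tunes.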
Another issue regarding data is who owns it. An autonomous car will absorb huge amounts of data, much of it evidence of where the car has been and how fast it has been travelling. But does this data belong to the autonomous system provider, the car manufacturer or the car owner? And what even defines a car owner? Autonomous cars will usher in ride-sharing services, such as those being developed by innovative start-ups like Cruise, that will have a profound effect on the nature of ownership.
So, will the industry be able to get it together and compete with Tesla? David Higham, Principal Functional Safety Engineer, is confident it will, and gives a five-year timeframe for true Level 4 cars on the road from other manufacturers. He also says it will take this long for consumers to gain trust in the systems. “I wouldn’t place blind faith in a driverless car – as a consumer, I’d want some evidence that these systems are safe.” This ties into the move from designing electronics to be functionally safe – ensuring they won’t catastrophically fail in the event of a random, unexpected error (conforming to ISO 26262) – to being actually fit for purpose, as defined by the emerging Safety of the Intended Functionality (SOTIF) standard, which, as the name implies, is a more general standard designed to make sure that autonomous cars are not only mechanically reliable but safe to use for passengers and pedestrians.
It’s clear that while we are tantalisingly close to autonomous cars, more advances will have to come together to make them possible. Carmakers will need to be braver in the speed with which they create hardware, so they can bring that processing power to the roads and start to accumulate that data – or risk leaving our Knight Rider dreams entirely in the hands of one company.