
Toyota's Killer Firmware: The Computer Inside Your "Smarter" Car Has Already Caused a Death

While everyone is looking forward to an era of self-driving smart cars, a recent jury verdict against Toyota, holding it responsible for a crash in which a passenger was killed and the driver injured due to faulty firmware, raises serious questions about software bugs causing fatal accidents.

Recently, Google revealed that it uses a customized version of Ubuntu Linux in its smart cars. After this verdict, the big question that needs to be answered is: isn't the embedded smart brain of these future cars, like any other software platform, susceptible to hardware and software defects, and also vulnerable to the car hacking and hijacking that has already been demonstrated?

Confused Smart Cars


"Smart" cars and "self-driving" cars are not the same. The meatbag sitting between the wheel and the seat is still the biggest security issue, smart or not. Also, there is a lot more QC involved in self-driving cars than in the "smart" ones.

But don't you think that, like every other software system out there, serious security issues will always remain a concern (for example, the Stuxnet malware in the Iranian nuclear plant)? This does not mean we should not move toward the new smart/self-driving car era, but it needs to be done with strict QC and safety regulations to avoid cases like this one from Toyota.

Well... Let's get a few things straight:
1. This vehicle was a 2005 model, from before many of the 'smart-car' features became mainstream.
2. The fault was in an electronic control system, not in a 'smart-car' compensation mechanism.
3. Electronic control systems have been the standard for over ten years now, something that I personally don't like.

Now, none of this makes the court case invalid; on the contrary, it's why the case is valid... However, this article takes the whole thing out of context. So-called 'smart cars' have, as this article mentions, full-blown software operating systems which are vulnerable to malware and the like.
Now, regarding the self-driving cars question: I think Google said it well in their video when they addressed the need to work out a system for seamlessly returning control to the driver in an emergency.

Only time will tell, I suppose; as I always say: stay cautious, my friends.

Well, here is the "contextual relationship" I intended to convey: if an ECU with a comparatively "very tiny" amount of code (much of it low-level assembly) can have issues like this, what could happen with a full-blown operating system? Considering that these "smart cars" are already being exploited, shouldn't governments learn a lesson from this verdict and implement stricter safety norms wherever human life is at stake?
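To make the point about tiny, low-level ECU code concrete, here is a minimal C sketch of one defensive pattern that safety audits look for in such firmware: storing a critical variable together with its bitwise complement so that memory corruption (from EMI, a stack overrun, or a wild pointer) can be detected before the value is acted on. All names and types here are hypothetical, not code from any actual ECU.

```c
#include <stdint.h>

/* Hypothetical sketch of a "mirrored variable" guard for a
 * safety-critical value in ECU firmware. The complement field is
 * kept equal to ~value; if a bit flips in either field, the pair
 * no longer matches and the corruption can be detected. */
typedef struct {
    uint16_t value;
    uint16_t complement;   /* always ~value while the pair is intact */
} guarded_u16;

static void guarded_set(guarded_u16 *g, uint16_t v)
{
    g->value = v;
    g->complement = (uint16_t)~v;
}

/* Returns 1 if the stored pair is still consistent, 0 if corrupted. */
static int guarded_check(const guarded_u16 *g)
{
    return g->value == (uint16_t)~g->complement;
}
```

A controller using this pattern would call `guarded_check` before trusting the value, and fall back to a safe default if the check fails.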

As you have said, progress in this technology should be "cautious" and strictly regulated. You would not want a terrorist or a government exploiting a zero-day in your car, or a software glitch causing an accident, the way the recent high-frequency trading incident saw Knight Capital lose over $460 million in 45 minutes and set off panic selling at stock exchanges worldwide. Ironically, that industry employs some of the most talented and highest-paid coders precisely to prevent such failures (How and Why Wall Street Programmers Earn Top Salaries).

One more example: RAF Pilots Blinded at 1,000 mph by Helmet Technical Glitch.

A number of manufacturers are introducing "drive by wire," where linkages, rods, cables, etc. are replaced by modules that send and receive information to operate various components. Cruise control is the easiest example (and we've heard plenty of stories about that). Next up could be brakes and steering, and in those cases there is not a lot of room (none, actually) for ANY type of error. Can you imagine a message popping up: "The steering module is not responding, please wait before making any turns"? The savings in manufacturing, along with significant weight reduction, make this very appealing to the bean counters.

DARPA (the US defense research agency, not a company) recently published a demonstration of a problem with "smart cars" (search YouTube for DARPA), showing a car being "driven" by a smartphone while the "operator" stood outside the vehicle.

I think this could be a major problem with autonomous cars too, and a major stumbling block in getting them released.

Nice and insightful blog. It is really surprising that such a dangerous accident occurred because of software bugs. After hearing this, questions will be raised about Google's driverless car, and I hope new technology will be developed to eliminate the danger. Thanks for the post.

