
Self-driving cars: What could possibly go wrong?

12 October 2020

Self-driving cars are the latest trend, and they have the potential to make life a lot easier, particularly for anyone who regularly takes long journeys and wants their car to take on most of the work. However, they can also cause problems for drivers.


For drivers who fancy chilling out behind the wheel, self-driving cars could be perfect. You might need to pay a little extra to get one, both for the car itself and for your car insurance cover, but the results will be worth it if you’re interested in driving a cutting-edge vehicle.

This technology might save time and energy, but it can come with comical consequences. As it can’t think for itself, it can make mistakes that leave owners squealing with laughter – and that can affect the cost of your insurance cover too.

Anyone who’s ever relied on a Sat-Nav system, only for it to direct you into a lake or oncoming traffic, knows all too well how confused these supposedly ‘intelligent’ solutions can become. All it takes is one missed update or one unexpected change, and the system starts hollering at you to drive your car into water or the wrong way down a one-way street.

So, how much worse does it become when your car is driving itself? Quotezone.co.uk, which specialises in helping drivers get the best deals on their car insurance while getting the level of cover they need, has put together a list of some of the weirdest ways that self-driving cars have gone wrong in the few short years since the industry introduced them.

Self-driving cars stop for a snack

One of the funniest recent developments in the self-driving car market is that some cars mistake Burger King’s illuminated logo for a red light. The chain’s signature red background leaves the cars confused, meaning that they stop and politely wait for the ‘light’ to change or for their owners to override them!

So, next time you’re going through the drive-thru and someone stops suddenly, make sure you control your urge to honk your horn. The driver might be greedily tucking into their burger, or it could just be their car getting confused.

Police pull over driverless car for driving too slowly

When companies first started developing self-driving cars, many aimed to create vehicles that needed no human intervention at all. In 2015, Google sent out a prototype driverless car, which was pulled over by police in Mountain View, California. The charge? Driving too slowly.

That’s right – you’ve heard of speeding, but this autonomous vehicle was going too slowly for the cops. If a vehicle isn’t going fast enough on certain roads, it can cause traffic jams and crashes, so the police were eager to apprehend the culprit – shame there wasn’t one. Once they realised that it was a prototype, they spoke with the passenger and decided against giving it a ticket.

Google self-driving car should’ve checked bus times

As well as run-ins with the police, Google’s self-driving car has also caused a bus crash. The car pulled out in front of a bus whose driver didn’t notice it, and the two vehicles collided close to Google’s California headquarters. Thankfully, no one was injured, but it does make you wonder why the car didn’t search for bus times!

After all, with the world’s biggest search engine behind it, you’d think it could check the timetable before setting off and avoid a crash. It was the first crash caused by one of these vehicles – previously, other cars had hit the self-driving cars, but they’d never caused an incident themselves. When they finally did, they went big!

Another test car refused to start


While refusing to stop is an issue, so too is not starting at all. An early self-driving car test went wrong when the car refused to do anything. The testing team were left scrambling around trying to fix it before they resigned themselves to their fate and declared that the car was being lazy.

Faraday Future’s FF 91 was supposed to drive into the centre of the stage at its launch event and use its auto-park function to find the best space. However, the car simply stood its ground and refused to move. Even insulting it didn’t inspire it, so the team members were forced to say a few words to cover their embarrassment.

Why did the pigeon cross the road? To delay the bus

It sounds like something out of an episode of Black Mirror, but self-driving buses are already in use in Spain. The first of its kind, affectionately named Èrica, launched in 2018. It’s a unique way to travel, but passengers have noticed a surprising problem that makes their journeys a lot longer than usual.

The bus has built-in sensors to detect blockages in the road, so it doesn’t hit pedestrians, other vehicles or unexpected obstacles. However, this does have an unexpected consequence – the bus stops for pigeons! When a pigeon walks in front of the bus – and in Spanish cities, that happens quite often – the autonomous vehicle stops for the bird, causing delays to its journey. The bus also comes to a halt in heavy rain, which is a shame for anyone hoping to get out of a wet walk.

Slow pedestrians also cause confusion for self-driving cars

It’s not just our feathery friends who cause issues for self-driving cars; early users of Google’s self-driving cars noticed that they stopped for slow-moving pedestrians. The car was unsure whether the person would suddenly step out into the road, so it chose to hang back and wait. While this sounds like a smart move, it could add hours to your journey if you drive through a neighbourhood full of older people and slow walkers.

A driver can assess whether a person looks like they’re about to cross the road, but the car only knows that someone was walking slowly beside the road. Its technology isn’t able to predict their next move, so it chooses to wait, potentially adding ages to your journey. The car probably got a lot of blank stares from passers-by too!

The feeling is mutual: Self-driving cars freak pedestrians out

Self-driving cars might be cautious around pedestrians, but the feeling is clearly mutual. Self-driving cars often attract stares and confused looks from passers-by. The issue is so widespread that BMW made an advert about it to promote its own self-driving technology.

The ad shows the BMW winding down creepy roads, only to be stopped by a gruesome woman. As she opens the door, preparing to terrify the driver, she notices that the driver’s seat is empty. She runs away in horror, leaving viewers to laugh at the preposterousness of being scared of a vehicle without a driver. It doesn’t stop us from staring, though!

Street art fools AI driving tech

Companies often call self-driving cars ‘intelligent’, but they aren’t always as bright as you’d think. Studies show that many driving AI systems misread stop signs if they’ve got simple graffiti on them. So, next time you’re driving around using autopilot, you might find that it doesn’t stop at graffitied or damaged signs and that you need to override it yourself.


Considering that many of these cars drive into cities, you’d think they’d be able to tell the difference between ‘stop’ and a drawing of the finger! The study used stickers to deface the signs, and the AI wasn’t able to identify them correctly. Here’s hoping that the technology has evolved and isn’t still bamboozled by something you’d find in a child’s craft set. If it’s confused by stickers, how will the technology fare when it faces colourful graffiti tags?

Why bridges make self-driving cars cross

Many drivers and automation experts believe that bridges can cause self-driving cars to lose control, as they don’t look the same as a regular road. As a result, self-driving cars can struggle to understand them, which could lead to traffic collisions or hold-ups.

It’s not just bridges that self-driving cars struggle to handle; they also dislike hills. Clearly, self-driving vehicles don’t like heights! The technology struggles with depth perception, and considering the dangers of falling off a bridge or down a hill, drivers will need to be careful. If your car does crash off a bridge or hill, you could find yourself with more to worry about than rising car insurance premiums.

Tesla on autopilot crashes in worst possible place

An early Tesla with the autopilot function crashed into the worst thing possible – a police car! The autonomous vehicle hit the parked patrol car, which was being used by a state trooper who was ticketing someone else at the time. No one was seriously hurt, but the crash did cause a pileup.

The owner of the Tesla blamed the autopilot function, but the police argued that he should’ve been keeping an eye on the car. So, if you’re thinking of picking up a car with autopilot, remember that it’s not an excuse to relax – you still need to pay attention to avoid getting into an accident. Kicking back and watching the world go by could be disastrous. If you do, you could find yourself paying hefty car insurance premiums, and even facing police charges. Your car insurance might not cover the accident if you were supposed to be supervising the autopilot, so check what cover your policy offers for self-driving vehicles.

Conclusion: Are self-driving cars too unpredictable to insure?

The early hiccups that some self-driving cars have experienced are understandable, given how new and innovative autonomous vehicle technologies are, but it does raise the question – will these driverless cars always be too unpredictable for insurance providers?

In theory, self-driving cars could actually increase driver safety and reduce the risk of an insurance claim, because the AI behind the wheel is always likely to obey speed limits and comply with the other rules of the road.

However, self-driving cars and autopilot functions obviously run the risk of malfunctioning, and insurance underwriters won’t know how significant that risk might be until these vehicles have been tested more extensively.

In the beginning, that means the cost of insuring a self-driving car is likely to be prohibitively expensive – if insurers are even willing to insure it at all.

One factor that might come into play in the future and affect how much you have to pay for your car insurance cover is the reliability of the self-driving software that your car uses. A car insurance policy for a car that regularly makes daft errors could cost more than cover for a more reliable vehicle. As a result, all of these seemingly silly incidents could become a crucial factor in how much you pay for your car insurance policy.

Today’s drivers don’t need to worry too much; it’s likely to be years before you have to consider which insurer you should use in order to find the best deal on your self-driving car insurance. In the meantime, Quotezone.co.uk can help you find the right policy at the right price for your own, run-of-the-mill car – with yourself rather than AI behind the wheel, of course!

