Why Google is Making its Cars More Aggressive — And Why Fixies, Bad Drivers Can Stop Them Dead in Their Tracks

When it comes to the world of autonomous vehicles, Californian software giant Google is the undisputed king of self-driving cars. With more years of research and development into self-driving vehicles than the majority of automakers, Google’s fleet of self-driving cars is amassing real-world driving experience at an astonishing rate, collectively covering more miles in a week than many drivers do in a lifetime.

But as Autoblog reports, despite millions of miles of driving experience, Google’s fleet of fully-autonomous Lexus RX450h SUVs and fully-electric low-speed pod-like cars are about to get a software upgrade to make them a little more aggressive to cope with life in the real world. Without it, Google’s cars are simply too nice and law-abiding to cope with the average U.S. city, making them prime targets for queue-cutters, tailgaters, and more unusual road users.

Google is making its cars more aggressive (again).


Ultimately, the problem lies in the difference between how you’d drive when taking a driving test and how you’d drive on your everyday commute. Google’s cars are driving like conscientious teens who desperately want to prove to the examiner that they’re safe enough to be given a license.

Humans, on the other hand, take risks, get sloppy — and generally screw up.

Think about your morning commute, for example. Did you sneak a lane change in between two cars that would have resulted in a fail on a driving test? Perhaps you took a chance to overtake that one guy who was driving at the posted limit on the Interstate when everyone else was going ten over?

Google’s cars won’t of course be programmed to break laws. They’ll still stick to the limit and obey all traffic signs. But when it comes to four-way stops, stop lights and heavy traffic, Google plans to program its fleet to be a little less perfect and a whole lot more human. With any luck, this should stop Google’s cars being constantly cut up by impatient drivers trying to make up time after getting up late for work, improve traffic flow and ultimately make Google’s autonomous vehicle fleet a little more real-world ready.

We should point out here that the upgrade doesn’t mean Google’s cars will be any less safe. As before, Google’s autonomous vehicles will do everything possible to anticipate and avoid collisions, continuing their record of never having caused an accident while in fully-autonomous mode.

Google’s autonomous cars are getting confused by fixed-wheel bicycles.


In addition to the upgrade to make its cars more assertive and defensive on the road, Google says it is continuing to evolve its autonomous vehicle algorithms to ensure its fleet can cope with more and more real-world situations. In most cases, that involves a vehicle encountering a new situation — such as a grandmother chasing ducks in her motorized wheelchair — analyzing the data from that encounter, and with help from Google’s autonomous driving software team, developing the correct response for future encounters.

In the case of the crazy woman chasing ducks, Google says the car behaved as it should, identifying and treating the unpredictable behavior from the woman as a road hazard and acting accordingly.

But, it admitted, there’s one thing which is still causing its fleet of autonomous cars a bit of a headache: fixies — or fixed-wheel bicycles.

For those who are unfamiliar with every hipster’s favorite mode of transport, fixed-wheel bicycles have just a simple chain connecting the front and rear sprockets. With no way to change gear, they also do away with freewheeling hubs, meaning that any movement of the pedals is reflected by a movement of the rear wheel, and vice-versa.

As a consequence, experienced fixed-wheel riders often perform what’s known as a ‘track stand’ when waiting at stop lights and intersections. Rather than put their feet on the tarmac, riders doing track stands gently rock their bicycles backwards and forwards, allowing them to wait at a stop light without having to unclip their feet from the pedals.

To a human driver, this gentle motion back and forth is easily identified and dealt with — but Google’s self-driving cars interpret the swaying motion of a track-standing fixie rider as a danger that should be avoided at all costs. The result? Upon encountering such a cyclist, the self-driving vehicles stop dead and wait for something to happen — even if they have right of way.
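The fix, presumably, involves looking at net displacement rather than raw motion: a track-standing rider oscillates but goes nowhere. Here’s a purely illustrative sketch of that idea — the function name, thresholds, and one-dimensional position model are all hypothetical and have nothing to do with Google’s actual perception stack:

```python
def classify_motion(positions, window=10, net_threshold=0.3):
    """Toy classifier distinguishing a track-standing cyclist from a moving one.

    positions: recent samples of the cyclist's position along the road axis,
    in metres. A rocking rider shows oscillation but near-zero net displacement
    over the window; a genuinely moving rider does not. All values here are
    illustrative assumptions, not real autonomous-vehicle logic.
    """
    recent = positions[-window:]
    net_displacement = abs(recent[-1] - recent[0])  # how far they actually got
    if net_displacement < net_threshold:
        return "track_stand"   # rocking in place: treat as stationary
    return "moving"            # real motion: yield as to any cyclist

# A rider rocking back and forth about 20 cm around a fixed point:
rocking = [0.0, 0.2] * 5
# A rider actually rolling forward at a steady clip:
rolling = [i * 0.5 for i in range(10)]
```

The point of the sketch is simply that integrating motion over a short window separates “jittering in place” from “approaching,” which is roughly the judgment a human driver makes at a glance.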

With Google already working on teaching its cars to identify and more appropriately react to these hardcore cyclists, it won’t be long before fixie riders pose no threat to Google’s self-driving fleet.

In the meantime, we’d like to suggest Google steers clear of such fixed-wheel havens as downtown San Francisco, Austin and, of course, Portland.

Bikers’ rights, it seems, have the upper hand for now.


Want to keep up with the latest news in evolving transport? Don’t forget to follow Transport Evolved on Twitter, like us on Facebook and G+, and subscribe to our YouTube channel.

You can also support us directly as a monthly supporting member by visiting Patreon.com.



  • Tim Martin

    I wonder how long they will remain in the lead as far as driving data goes. With Tesla Autopilot going active, they will have thousands of cars collecting real-time data across the world. With that kind of data collection it will not be long. I am sure that even when people are not using Autopilot, the cars will still be collecting data and sending it to Tesla’s servers.

  • Chris O

    “Upon encountering such a cyclist the self-driving vehicles stop dead and wait for something to happen — even if they have right of way.”

    I’m afraid autonomous automobiles will show that sort of behaviour in any number of hard-to-analyse situations, making the experience frustrating both for the occupants and for other traffic.

    Unless that’s something that can really be solved within real world (budgetary) parameters the autonomous car owner will pretty soon be the automotive equivalent of the glasshole I’m afraid.

    Sounds to me it’s great for highways but not so much for complex urban environments.

    • LogicDesigner

      Fortunately with software, once the problem is solved once, it is always solved. These unique scenarios will pop up, whether it is the fixie doing track stands or the duck chasing grandma, and they will be solved, one-by-one. It won’t be long before Google has a robust system that can handle virtually everything you can throw at it. What is left unaccounted for will be so exceedingly rare that the extraordinary safety benefits of autonomous vehicles will outweigh it.

      • Chris O

        My experience with software is rather more nuanced than that. Also, there is only so much hardware (especially the sensors needed to feed the software with data) that can fit within a given budget and the space available in a car.

        I wonder if I will see the day that a Googlemobile will drive up to a crosswalk, notice the person standing near it is talking on his phone rather than looking to cross the street, and just keep on driving rather than coming to a full stop. Because let me tell you, the second I find the car wasting my time in situations like that, I will hit the off switch even if there are no impatient drivers behind me honking their horns.

        I think there are all sorts of nuances a human driver will pick up on that are hard to solve for a machine, considering that getting it wrong really isn’t an option for a machine.

        • LogicDesigner

          Well I think the subject of this article shows that they are making progress on this very issue. I think that the amount of time wasted over the course of a week will be far offset by the amount of time freed up to focus on other things while you are being chauffeured to your destination.

          Also, I stick by my notion that it is a problem that can be solved with more software, not more hardware. It is not for lack of detection that Google cars are erring on the safe side. They are fully able to see what is going on—far, far more than any human. The issue is how they interpret that information and then how they act on it.

          • Chris O

            Ah, you reckon you will be chauffeured while working on your mobile device or taking a nap. I think you will be legally required to monitor (babysit, really) your vehicle in case all that sophisticated hardware and software made by the lowest bidder fails and tells you to take over. Other people honking about your autobot wasting their time by stopping for no good reason might also interfere with the experience, even if the wasted time is acceptable to the vehicle’s occupants. I wonder to what extent these automatons will be able to reasonably match human manoeuvring speed in complex real-world conditions without conflicting with their mission to put safety first.

            It’s my understanding that a lack of proper road surface markings can confuse the software through a lack of proper data inputs from the hardware, so I wonder to what extent current hardware is up to the job.

          • LogicDesigner

            I think you will be legally required to monitor (babysit really..) your vehicle…

            Initially, yes. That is the way legal matters go. Over time that will change. Below is a video of Google’s self-driving car chauffeuring blind people around.

            …made by the lowest bidder…

            That argument usually applies to government work. Is there any reason why you think a Google vehicle would include hardware and software made by the lowest bidder? I would think the opposite would be the case. I certainly don’t think of Google as making lowest-bidder-quality software.

            Other people honking about your autobot…

            Google has put over a million miles on their autobots so far, equivalent to more than a human lifetime of driving. There are several very interesting videos of their testing on the same YouTube channel where you find the video below (link: https://www.youtube.com/channel/UCCLyNDhxwpqNe3UeEmGHl8g ). They don’t seem to be having problems with people honking.

            …road surface markings can confuse the software by lack of proper data inputs by the hardware…

            Maybe I missed it, but I haven’t heard that the road marking issue was a hardware issue. I would be interested in reading about it though if you could point me in the right direction. (Thankfully we can post links on this site without going through moderation!)


          • Chris O

            Google is unlikely to stress the problems in its promotion videos, I suppose, but this infographic gives some insight into the obstacles autonomous driving still has to overcome, including, interestingly, “adding sensors to smart cities” to solve the limitations of the onboard sensors, especially in bad weather.

            Also, the cost is expected to still be a hefty $6,700–$10,400 in 2025 according to IHS Automotive, despite the efforts of the “lowest bidder” ;) This is supposed to be solved by 2035, so it looks like most of us won’t be taking that nap while our autonomous vehicles get us where we want to go for quite some time to come.


            It’s great that sharing of knowledge through links isn’t sabotaged on this forum BTW;)

          • Joe Viocoe

            ” I think you will be legally required to monitor (babysit really..) your vehicle…”

            Just like we are legally required to drive no faster than the posted limit. It will be ignored the second the driver feels comfortable enough.