WoA guest blog by Luc Van Gool, professor of computer vision at ETH Zurich & KU Leuven
Granted, self-driving cars still need some tweaking and could use some extra data to perform optimally, as some recent incidents have shown. Even though the accident rate per kilometer travelled could become significantly lower than the rate for cars with a human driver, the ultimate goal should obviously still be zero accidents.
I consider myself fortunate to have teams in my research labs that contribute to the improvement of self-driving cars. More specifically, we work in the context of the TRACE network of high-caliber research labs in Europe focusing on computer vision technology for automated cars. We work closely with industry giant Toyota and we can rely on talent from renowned institutes such as KU Leuven, the University of Cambridge, the Czech Technical University in Prague, the Max Planck Institute in Saarbrücken, and ETH Zurich.
Without going into too much detail, we can assure you that the computer vision algorithms we develop are quite complex. They need to process intricate traffic scenes and draw the right conclusions from them, whatever the weather conditions or the time of day. And these computer vision functionalities are but one type of algorithm, to be combined with other technologies such as lidar and radar data, map information, voice commands, etc. Needless to say, the analysis of the image data also needs to happen in real time.
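To give a feel for what combining sensor data means, here is a deliberately simplified sketch: each sensor reports a distance estimate to the same obstacle along with a confidence, and the estimates are merged with a confidence-weighted average. The sensor names, confidence values, and the `fuse_distance` function are illustrative assumptions, not our actual pipeline (real systems use probabilistic filters such as Kalman filters):

```python
# Toy multi-sensor fusion: merge distance estimates from several sensors
# into one value, weighting each reading by its confidence.
# Illustrative only; not a production fusion algorithm.

def fuse_distance(readings):
    """readings: list of (distance_in_m, confidence) tuples."""
    total_conf = sum(conf for _, conf in readings)
    if total_conf == 0:
        raise ValueError("no confident readings")
    return sum(d * conf for d, conf in readings) / total_conf

# Example: the camera is unsure (say, in fog), lidar and radar less so.
readings = [(42.0, 0.2),   # camera
            (40.0, 0.9),   # lidar
            (41.0, 0.7)]   # radar
print(round(fuse_distance(readings), 2))  # -> 40.61
```

The point of the sketch is only that no single sensor is trusted on its own: the fused estimate leans toward the sensors that are most reliable in the current conditions.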
We are continuously striving for new levels of image processing and recognition. Besides improving the interpretation of the different parts of a scene (road, vegetation, sky, etc.), the detection of traffic agents such as cars and pedestrians, and the detection of lanes, we have also embarked on projects such as detecting parking spot availability and processing images in demanding circumstances, such as extreme fog or at night.
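The scene interpretation mentioned above assigns a class label to every pixel of an image. A toy sketch of what such an output looks like, and how it can be summarized, is below; the tiny hand-written label map and the four class names are illustrative assumptions (real systems produce such maps per video frame with deep neural networks):

```python
# Toy semantic-segmentation output: a small label map where each cell
# holds a class id, summarized into per-class pixel fractions.
from collections import Counter

CLASSES = {0: "road", 1: "vegetation", 2: "sky", 3: "pedestrian"}

label_map = [
    [2, 2, 2, 2],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 3, 0],
]

counts = Counter(c for row in label_map for c in row)
total = sum(counts.values())
for cls_id, name in CLASSES.items():
    print(f"{name}: {counts[cls_id] / total:.0%}")
```

Downstream components reason over exactly this kind of labeled map: a "pedestrian" region directly ahead matters very differently from a "vegetation" region at the roadside.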
Major challenge: communicating with the passenger(s)
But perhaps the most challenging part is ensuring an intuitive interaction with the passenger(s).
Indeed, for people, a very natural way of interacting with the car would be to tell it verbally what they want. Even if multiple passengers are talking to each other, or the radio is on, it is important that the car understands when it is being spoken to, and reacts accordingly. Rather than just typing in a destination at the start of the journey, it should be possible to add intermediate goals while already underway, like asking the car to pass by a bakery, or telling it to park in the shade of a tree.
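One simple way to picture the "is the car being spoken to?" problem is a wake-word filter: only utterances that open with a wake phrase are treated as commands, and everything else in the cabin is ignored. The sketch below is a toy string-matching version with a made-up wake phrase; real systems use acoustic wake-word models on the audio signal, not text matching:

```python
# Toy filter that decides whether an utterance is addressed to the car.
# WAKE_WORD is a hypothetical wake phrase, purely for illustration.

WAKE_WORD = "hey car"

def extract_command(utterance: str):
    """Return the command if the car is addressed, else None."""
    text = utterance.lower().strip()
    if text.startswith(WAKE_WORD):
        return text[len(WAKE_WORD):].strip(" ,")
    return None  # cabin chatter or radio: ignore

print(extract_command("Hey car, pass by a bakery"))      # -> pass by a bakery
print(extract_command("Did you hear that on the radio?"))  # -> None
```

The hard part in practice is of course not the filtering logic but doing this robustly over noisy audio, with several voices and the radio in the background.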
Just like the first cars looked like carriages, but with the horse substituted by a motor, it is intriguing to think about what autonomous cars will look like in the future. It is fun to imagine very unlikely scenarios, such as our beds becoming driverless vehicles that transport us while we finish our sleep cycle. Ridiculous? Perhaps. It is difficult for people to extrapolate beyond what they know. Henry Ford dismissed the idea of asking users how products should evolve: he is said to have joked that, had he asked his customers before cars entered the market, they would have asked for faster horses.
Defining the future of cities
Last but not least, the question is not only how cars – or transportation devices – will evolve, but also how these changes will impact the structure and organization of our cities. The number of cars needed might decrease dramatically if the concept of car ownership is replaced by car sharing, a logical choice considering that our cars sit unused about 90% of the time on average. And traffic density may decrease further with automatic parking space arbitration: according to a recent study, 30% of today's urban car traffic consists of cars driving around in search of a parking spot. Imagine being able to reduce this percentage to just a fraction of what it is today.
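The arithmetic behind these figures can be made explicit. In the sketch below, the 90% idle fraction and the 30% parking-search share come from the text, while the shared-car utilization (50%) and the search-traffic reduction (80%) are illustrative assumptions of mine, not results from the cited study:

```python
# Back-of-the-envelope arithmetic for the figures in the text.
idle_fraction = 0.90          # from the text: cars unused 90% of the time
parking_search_share = 0.30   # from the text: 30% of urban car traffic

# If a shared car were in use 50% of the time instead of 10%,
# one shared car could cover the mileage of several private ones.
private_utilization = 1 - idle_fraction   # 0.10
shared_utilization = 0.50                 # assumption
cars_replaced_per_shared_car = shared_utilization / private_utilization
print(round(cars_replaced_per_shared_car, 1))  # -> 5.0

# If parking arbitration cut search traffic by 80% (assumption):
search_reduction = 0.80
remaining_traffic = 1 - parking_search_share * search_reduction
print(f"{remaining_traffic:.0%} of today's urban traffic")  # -> 76%
```

Even under such rough assumptions, the combined effect on fleet size and traffic density would be substantial, which is exactly why urban planners follow these developments so closely.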
I am glad and proud to have been able to share our insights with the audience during this exceptional event.