April 19, 2018
In today’s digest, we look at a series of AV developments in China, a couple of interesting US-based AV startups, and more.
Will China Shock the World AV Market?
It could happen. While US-based OEMs and tech companies have made a lot more noise, Chinese companies — and recently, the Chinese government — have been quietly laying the groundwork to rake in the biggest share of global AV profits.
That’s according to a recent McKinsey report projecting that by 2030, China will top the world’s AV market, with Chinese companies pulling in expected revenue of over $500 billion.
That’s not just because of brilliant strategy on the part of Baidu and Tencent, though. Chinese consumers are far more enthusiastic about autonomous vehicles than their counterparts in most other developed countries, and they’re willing to pay a premium for the technology. We turn again to McKinsey for the stats: “Ninety-eight percent of Chinese interviewed in a recent survey showed a preference for autonomous driving, higher than Germany’s 69 percent or the 70 percent for the US, McKinsey said. Chinese consumers were also willing to pay an average of a US$4,600 premium for the technology, the highest among the three.”
That dovetails with a recent major global poll of attitudes toward autonomous vehicles conducted by Ipsos — China has the second-highest user acceptance, after India. The United States, on the other hand, is near the bottom of the list.
The Chinese government has, up until recently, turned a blind eye to public-road testing conducted by Baidu and Tencent — it was technically against the law, but those companies were willing to work within that grey area. Well, it’s grey no longer, as the Chinese government has just legalized testing on public roads with three simple preconditions:
“1. All cars must first be tested in non-public zones.
2. Road tests can only be on designated streets.
3. A qualified driver must always be in the driver’s seat, ready to take over control.”
Xin Guobin, the vice minister of the Ministry of Industry and Information Technology, took a swipe at American tech companies while explaining the reasons for the safety precautions:
“To ensure the safety of road tests, we will not only require that road tests take place on prescribed streets, but also that the test driver sits in the driver position throughout, monitoring the car and the surrounding environment, and ready to take control of the car at any time. This is a lesson that we have learned from the accidents faced by Uber and Tesla.”
And, perhaps most importantly, China’s vaunted BAT (Baidu, Alibaba, and Tencent) trio of tech giants is now officially all-in on the AV space, as Alibaba recently confirmed that it is developing autonomous vehicles.
Alibaba had already been deeply involved with geospatial mapping and data processing for smart city purposes:
“In June 2016, the company launched an AI-powered “city brain” system in Hangzhou, where it’s headquartered, to crunch data from mapping apps and increase traffic efficiency. Simon Hu, the president of Alibaba Cloud, says the firm’s ultimate goal is to produce the kind of autonomous driving that uses data like that so transportation is fully integrated into urban infrastructure.”
Quartz gives us a nice little overview of BAT’s autonomous vehicle push in a recent piece, “Every Big Tech Firm in China is Becoming an AV Company”:
- Alibaba plans to hire over 50 engineers to boost its autonomous driving development team, which is led by Wang Gang, a former artificial intelligence professor at Singapore’s Nanyang Technological University who joined Alibaba last March. Earlier, the e-commerce firm partnered with SAIC Motor, one of the country’s largest car manufacturers, to build an internet-connected car.
- Baidu, which ventured into autonomous driving long before Tencent and Alibaba, has been developing its self-driving system Apollo since 2015. It hopes to mass produce partially autonomous cars running Apollo by 2019. It has secured partnerships with over 90 companies to provide support for Apollo, including Microsoft and German carmaker Daimler.
- Tencent, which reportedly already had its self-developed autonomous driving system in place in November, also put cars on the road in Beijing in early April for testing. Footage suggests that the driver kept his hands on the wheel the entire time. Didi Chuxing, the ride-hailing giant backed by Tencent, has also been investing in self-driving and could roll out a related service in 2022.
Meanwhile, in Silicon Valley: More Startups!
With global production continuing to ramp up, innovative startups continue to surface in Silicon Valley. We found the work being done by some new arrivals to be particularly compelling.
First, there’s Metamoto, a Silicon Valley startup flush with $2 million in Series A funding. The company recently announced that it is collaborating with “leading automotive players” on an early engagement program for its simulation-as-a-service offering.
Per Neowin: “the program is aimed at gathering data and feedback from multiple parties playing in the autonomous driving arena, for training, testing, debugging and validating workflows of systems that are used in self-driving vehicles, but within the confines and safety of virtual computer simulation. The company adds that the participants that form the testing group were chosen to ensure that perspectives are represented from across the transportation industry including OEMs, Tier-1 suppliers, transportation network companies (TNCs) and stack, sensor and other technology providers.”
Metamoto says their simulation products “can scale and deliver precise simulation of a variety of sensors, including LiDAR, camera, radar, GPS, IMU, and others.” Crucially, if their service works as described, companies will be able to run simulations on their sensor stacks that “mirror unique cases and learn from captured data to identify isolated outcomes, performance boundaries, and system tolerances.”
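To make the idea of simulated sensors concrete, here is a minimal illustrative sketch of the kind of thing a sensor simulator does at its core: casting 2D lidar rays from a sensor pose against geometric obstacles and recording range returns. All names and the scene setup here are hypothetical, not Metamoto’s; a production simulator would model noise, materials, and full 3D geometry.

```python
import math

def ray_circle_range(ox, oy, angle, cx, cy, r):
    """Distance along a unit ray from (ox, oy) to a circle, or None if it misses."""
    dx, dy = math.cos(angle), math.sin(angle)
    fx, fy = ox - cx, oy - cy
    # Quadratic for |origin + t*dir - center|^2 = r^2, with a = 1.
    b = 2 * (fx * dx + fy * dy)
    c = fx * fx + fy * fy - r * r
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2  # nearest intersection
    return t if t >= 0 else None

def simulate_lidar(pose, obstacles, n_beams=8, max_range=50.0):
    """Sweep n_beams rays around the sensor; return one range per beam."""
    ox, oy = pose
    scan = []
    for i in range(n_beams):
        angle = 2 * math.pi * i / n_beams
        hits = [ray_circle_range(ox, oy, angle, cx, cy, r)
                for cx, cy, r in obstacles]
        hits = [h for h in hits if h is not None and h <= max_range]
        scan.append(min(hits) if hits else max_range)
    return scan

# One circular obstacle of radius 2 centered 10 m down the +x axis.
scan = simulate_lidar((0.0, 0.0), [(10.0, 0.0, 2.0)])
# Beam 0 (pointing along +x) returns 8.0; the other beams max out at 50.0.
```

The appeal of this approach, scaled up, is that the same perception stack can be exercised against rare or dangerous scenarios without ever leaving the data center.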
And then there’s an alliance between startups Renovo and Affectiva that takes “attention tracking” (video-monitoring tools to ensure safety drivers/passengers are alert enough to take control when needed) to a new level: reading the emotions of passengers in AVs.
The new technology works as “both a driver monitoring tool, making sure safety drivers keep their eyes on the road even as the self-driving software is driving the vehicle, and an emotional tracker to ensure robot taxi passengers feel safe and secure during their autonomous trips.”
It’s been fed by the emotional cues of people all over the world: “Using deep learning algorithms, Affectiva trained its software to read emotional reactions by studying a wide range of people from all ages and ethnic backgrounds. Renovo then integrates Affectiva’s application into its operating system, which allows it to access any of the cameras inside or outside the car.”
How in the world does it work? According to Affectiva, “computer vision algorithms identify key landmarks on the face – for example, the corners of your eyebrows, the tip of your nose, the corners of your mouth. Deep learning algorithms then analyze pixels in those regions to classify facial expressions. Combinations of these facial expressions are then mapped to emotions.” It’s able to detect seven “emotion metrics”: anger, contempt, disgust, fear, joy, sadness, and surprise. The intrepid reporter from The Verge couldn’t resist a bit of editorializing here: “there’s no word, though, on how effective it would be with someone who may be sociopathic.”
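The final stage of that pipeline, mapping combinations of detected expressions to the seven emotion metrics, can be sketched in a few lines. The expression names and rule sets below are hypothetical stand-ins, not Affectiva’s actual model, which uses learned classifiers rather than a lookup table.

```python
# Hypothetical mapping from facial-expression combinations (already
# detected by upstream computer-vision stages) to emotion metrics.
EMOTION_RULES = {
    "joy":      {"lip_corner_pull", "cheek_raise"},
    "surprise": {"brow_raise", "jaw_drop"},
    "anger":    {"brow_furrow", "lid_tighten"},
    "sadness":  {"lip_corner_depress", "inner_brow_raise"},
    "fear":     {"brow_raise", "lid_tighten", "lip_stretch"},
    "disgust":  {"nose_wrinkle", "upper_lip_raise"},
    "contempt": {"unilateral_lip_corner_pull"},
}

def emotion_metrics(detected):
    """Score each emotion by the fraction of its expression set observed."""
    return {
        emotion: len(exprs & detected) / len(exprs)
        for emotion, exprs in EMOTION_RULES.items()
    }

scores = emotion_metrics({"brow_raise", "jaw_drop"})
# "surprise" scores 1.0 here; "fear" gets a partial match; the rest score 0.
```

In a real system each metric would be a continuous confidence emitted by a neural network, but the shape of the output, a per-emotion score derived from expression evidence, is the same.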
You may be thinking, “well, that’s cool, I guess, but isn’t it a stretch to think that data on a passenger’s emotions will be all that useful?” It’s a fair question, but Renovo and Affectiva are betting the appeal of the features will be irresistible: “a vehicle that can detect whether a passenger is scared can slow down the speed or dim the lights if it senses annoyance or frustration.” And beyond the more commonly touted use of monitoring a safety driver for attentiveness, emotional state could help the car determine whether handing over control in an emergency is likely to have a positive outcome.
- Here’s an in-depth report from the Associated Press covering some recent back-and-forth between national advocates for the 1.3 million legally blind people in the United States and the AV industry, a debate that is “fueling new research outside the auto industry to develop data and software meant to help ensure the needs of the blind are met when AVs become commonplace.” The report covers some of the more interesting aspects of that research.
- This is a must-see for all geospatial mapping enthusiasts: a new Google initiative aims to preserve walkable copies of Earth’s historical sites on the web, using drones (equipped for photogrammetry and lidar scanning) to create dense 3D models of the sites. Best of all: you’ll be able to tour them with VR goggles on. If you have a fast enough computer, and especially if you have VR, you can tour Bagan here.
- Finally, IEEE Spectrum reports on a new underwater GPS system inspired by the eyes of mantis shrimp. Building on recent research that produced a special camera able to see the polarization patterns of light waves, as mantis shrimp eyes apparently do (the pattern resembles a rope being waved up and down, and it changes once light enters the water and gets scattered or deflected), researchers have figured out how to use those patterns to determine the sun’s position, and from there, the location of the underwater camera itself. Technology never ceases to amaze.
What do you think — about any of these stories or the other ongoing developments in the realms of next-gen transportation or smart cities? Contact us and let us know. If you write something really great, we might even quote you next time, so don’t be shy: join the conversation!