Text-n-Drive Safely! 8 Innovations to Hasten Your Robo-Car Future
Much of what’s on display in the little booths that line the automotive halls at the 2020 CES techstravaganza is nitty-gritty hardware developed by devoted engineers toiling in the trenches to realize the autonomous-vehicle future that we’re promised (or is it “threatened”?) is just around the corner. Most of this stuff bores us to somnolence, but here are a few crucial pieces of that automotive puzzle that you, as a serious car lover, should probably know about.
Bosch Long-Range Lidar
Lidar has long been the expected “third leg” of the tech stool required to support high-level autonomy, compensating for weaknesses in the other two legs—radar (which might miss a narrow motorcycle with plastic fairings and such) and cameras (which can be dazzled by sun or obscured by rain, fog, and snow). To date, the cost of high-reliability lidar units has proven prohibitive. Bosch says it is preparing for mass production of what it claims will be the first long-range lidar sensor suitable for automotive use (meaning it is fully ruggedized for vehicular conditions of temperature, vibration, and the like). For now the company is not disclosing many specifics about the technology (range, resolution, field of view, even whether it’s a spinner or solid-state), except to say that it is a “time-of-flight” system that measures how long an emitted pulse of light takes to reflect back to the sensor. Bosch also claims that its new unit’s superlative range, field of view, and resolution set it apart from the scores of other lidars on display at CES. Stay tuned for further specifics.
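The time-of-flight principle Bosch describes boils down to simple arithmetic: light travels out to the target and back, so distance is the speed of light times the round-trip time, divided by two. Here’s a minimal sketch of that math (the numbers are illustrative only—Bosch hasn’t published its sensor’s specs):

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.
# Illustrative values only -- Bosch has not disclosed its sensor's specs.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_seconds: float) -> float:
    """Convert a measured round-trip pulse time into target distance in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after about 1.33 microseconds came from roughly 200 m away.
print(round(tof_distance_m(1.334e-6), 1))  # -> 200.0
```

The tight timing is the hard part: resolving distance to a few centimeters means resolving the echo to fractions of a nanosecond.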
RoboSense Smart Solid-State Lidar
China’s RoboSense demonstrated what it claims is “the world’s first smart solid-state Lidar,” which uses MEMS (Micro-Electro-Mechanical System) micro-mirrors to directionally control light emission and focus. The smarts come from an artificial-intelligence perception algorithm that handles data analysis and comprehension inside the unit, outputting information an autonomous vehicle’s driving computer can use directly for decision making. RoboSense says its field of view, 120 degrees wide by 25 degrees tall, is the MEMS industry’s largest, and that its 905nm light provides 150 meters (almost 500 feet) of range. The hardware is now available to autonomous upfitters for $1,898, with smart variants following after Q1 2020.
Yandex Lidar/Camera Combo
Yandex is the Google of Russia, and it’s getting into the autonomous vehicle space, first with a briefcase-sized six-wheeled food-delivery robot, and also with a sensor suite that combines a high-dynamic-range camera with two lidar units—a solid-state one that boasts a 120-degree field of view and a 200-meter (656-foot) range, and a more traditional roof-mounted one that provides a 360-degree view. Yandex claims that by doing all the image processing on the chip, these sensors can work smarter together: if the camera detects a potential hazard, the solid-state lidar can briefly focus more attention on that area to develop a higher-resolution image. Yandex claims its current prototypes already cost half as much as existing devices and that the savings could reach 75 percent at full production. The company already operates a robotaxi service in Europe that has logged 5,000-plus autonomous rides, and it is teaming with Hyundai Mobis to build and operate a fleet of 10 self-driving robotaxis in Detroit during this June’s North American International Auto Show.
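The camera-cues-lidar hand-off Yandex describes can be caricatured as a scan planner: sweep the whole field of view at a baseline density, and re-scan any camera-flagged sector more densely. This sketch is purely illustrative—every name in it is hypothetical, not Yandex’s API:

```python
# Hypothetical sketch of a camera-cues-lidar hand-off: when a camera flags a
# region of interest, a steerable solid-state lidar re-scans that region at
# higher density. All names and numbers here are illustrative, not Yandex's.

from dataclasses import dataclass

@dataclass
class Region:
    azimuth_deg: float   # center bearing of the camera-flagged hazard
    width_deg: float     # angular width to re-scan

def plan_scan(hazards: list[Region], default_density: int = 1,
              focus_density: int = 4) -> dict[int, int]:
    """Return points-per-degree for each 1-degree sector of a 120-degree FOV."""
    plan = {deg: default_density for deg in range(-60, 60)}
    for h in hazards:
        lo = int(h.azimuth_deg - h.width_deg / 2)
        hi = int(h.azimuth_deg + h.width_deg / 2)
        for deg in range(max(lo, -60), min(hi, 60)):
            plan[deg] = focus_density  # denser sampling near the hazard
    return plan

plan = plan_scan([Region(azimuth_deg=10, width_deg=6)])
print(plan[10], plan[-30])  # -> 4 1 (focused sector vs. background)
```

The payoff of this kind of scheme is spending a fixed point-per-second budget where it matters instead of uniformly everywhere.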
Velodyne Alpha Prime Lidar
While Bosch, Yandex, and others on the floor at CES are striving to make lidar affordable for mass-production use, Velodyne soldiers on, developing and selling ever-better versions of the pricey spinning lidar units that proto-autonomous vehicles from Waymo, Cruise Automation, et al. are using. Velodyne claims the Alpha Prime delivers the industry’s highest performance specs—360-degree surround perception with a 40-degree vertical field of view, 4.8 million points per second of resolution, eye-safe 903nm light, and 220 meters of range—along with a new level of power efficiency.
TriEye Raven SWIR Camera
Israeli startup TriEye has teamed with no lesser automotive manufacturer than Porsche to develop this short-wave infrared (SWIR) camera, which can “see through” weather better than a typical visible-light camera, as rendered in the image at the top of this page. The Raven is based on a fully developed, cheap-and-cheerful CMOS (Complementary Metal-Oxide Semiconductor) camera chip, and TriEye claims that short-wave infrared light penetrates rain, fog, haze, and dust far better than the visible light objects reflect or emit, giving Raven an advantage over typical cameras. This product is still in development.
Ambarella CV2 Camera Sensors
Security cameras are Ambarella’s bread-n-butter business these days, but the company is trying to bust into the autonomous vehicle sector with stereo-vision cameras and new convolutional-neural-network (CNN) image processing that it says is 20 times more effective at determining what the camera is perceiving at distances of up to 180 meters (almost 600 feet). At CES the company outfitted a Lincoln MKZ with its system and demonstrated vision-only autonomous operation at night, and way back in 2010 its camera technology was used to complete a 15,000-km autonomous drive from Parma, Italy, to Shanghai, China. Don’t hold your breath for camera-only robocars, but a sensor that’s capable of operating on its own is a much better team player than one that always relies on collaboration from other sensors.
Adasky Thermal-Imaging Camera
At last year’s CES we were impressed by the image quality of Adasky’s thermal-imaging camera, which presents what looks like a brighter, clearer version of the night-vision images some high-end cars offer (but Adasky’s camera is way cheaper). Because people and animals give off heat, they’re vastly easier to see with thermal imaging at night and through dense fog, snow, etc., making this a safer camera alternative in any ADAS system—even the level 2 systems out there today. The Adasky news for 2020 CES: Company founder Yakov Shaharabani proposes equipping traffic-light intersections with Adasky cameras that can sense a potential red-light runner or a distracted, texting pedestrian about to enter the intersection and hold a red light or call for a yellow to prevent catastrophe. The company is in talks with several OEs and has reportedly landed at least one production contract for level 2 autonomy support.
Valeo Move Predict.ai
Much of the technology you’ve read about so far has to do with gathering information, but how can that information best be put to use? That’s the job of Valeo’s self-descriptively titled Move Predict.ai, which predicts the movements of pedestrians, scooters, and other slow-moving traffic around the vehicle. The system looks not only at people but also at their surroundings—crosswalks, intersections, and the like—to predict where those people might go. It also looks for indications of low attention, such as a pedestrian with a phone to their ear or two people walking together. We sat in the back seat watching a screen as engineers orbited the test car, talking on cell phones, rapidly changing direction, or darting in front of the car, and Valeo’s system did a decent job of tracking them, assessing their distraction levels, and predicting which ones would jump in front of the car and which would keep going.
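The distraction cues Valeo’s demo weighed (phone to the ear, eyes on a screen, chatting with a companion) could be caricatured as a simple additive score. This toy heuristic is our own illustration, not Valeo’s actual model, which presumably learns such weights from data:

```python
# A toy caricature of the observable distraction cues a system like
# Valeo's Move Predict.ai might weigh. Purely illustrative -- the cue
# names and weights are our own invention, not Valeo's algorithm.

def distraction_score(phone_to_ear: bool, looking_at_screen: bool,
                      walking_with_companion: bool) -> float:
    """Crude 0-to-1 distraction estimate from observable pedestrian cues."""
    score = 0.0
    if phone_to_ear:
        score += 0.4   # talking: attention partly elsewhere
    if looking_at_screen:
        score += 0.5   # texting: eyes off the road entirely
    if walking_with_companion:
        score += 0.2   # conversation: mild attention drain
    return min(score, 1.0)  # cap at fully distracted

# A texting pedestrian chatting with a friend rates far more
# distracted than an attentive one.
print(distraction_score(False, True, True))    # -> 0.7
print(distraction_score(False, False, False))  # -> 0.0
```

A production system would feed an estimate like this, along with position and heading, into a trajectory predictor rather than using it directly.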
The post Text-n-Drive Safely! 8 Innovations to Hasten Your Robo-Car Future appeared first on MotorTrend.