An Overview of Autonomous Car Tech Platforms: EMEA, Part II
Plenty of automakers have announced or shown off self-driving vehicles. But the majority of these companies are not creating their own self-driving hardware and software systems in-house; they’ve either acquired specialized firms to create these systems for them or are outsourcing the work to third parties. Some of the most respected leaders in this growing market segment are in the Europe, Middle East, and Africa (EMEA) region.
There’s lots of talk about autonomous vehicles (AVs) these days, and a fair number of prototype cars and trucks are currently on public roads. There are also fully functioning autonomous robo-taxi services ferrying passengers around various global locales, occasionally without anyone in the driver’s seat.
But for all the talk and test vehicles in the marketplace, the number of actual “platforms”—the intelligent technology at the heart of AVs—remains limited. The reason for this is simple—creating these platforms is a major endeavor requiring significant capitalization, copious systems engineering, and rigorous testing. For now, this work is better suited to technology companies (ones with experience in computer hardware and software), rather than automakers. (For the purposes of these articles, automakers may be referred to as original equipment manufacturers, or OEMs.)
This explains why some of the biggest car companies in the world have either outsourced their tech platforms or have purchased companies specializing in creating them.
A new awareness of the limitations of the technology has led a number of vendors to concentrate their efforts on delivering a fleet of robo-taxi AVs rather than production vehicles in order to gain valuable real-world testing experience. Another reason for this focus on fleets and taxi services is that in the future, individual mobility may decrease in favor of multi-person or group mobility.
Some companies have chosen to collaborate or share data with their former competitors in an effort to lower the costs of developing their own AV platforms. This has created an often confusing web of alliances, licensing agreements, and investment partnerships in the automotive industry.
This article is the fourth in a series comparing and contrasting the dominant industry players and their technology platforms, often referred to in the industry as “stacks.” This article further explores companies based in the Europe/Middle East/Africa (EMEA) region, while additional articles in this series examine those in North America and in Asia. Specifically, this article covers Fusion Processing, Infineon, Navya, NXP, Oxbotica, TTTech, Veoneer, Wayve, Yandex.Taxi, and ZF/2GetThere.
If you've not read the first article in the series yet, you can find it here.
EMEA AV Tech Platform Players, Part II
The following companies have developed a broad range of AV platform technology for disparate OEM purposes. For some firms, production cars for consumers are the primary target application. For others, the main focus may be autonomous trucking, suburban ridesharing, middle-mile delivery, or low-speed shuttles.
Note that not every company listed below produces or incorporates a complete standalone platform. But each produces critical hardware and software that enable AV platform functionality.
Founded in Bristol in the UK in 2012, Fusion Processing makes sensors, ADAS, mirror replacement systems, and an SAE Level 4 and 5 AV platform called CAVstar (CAV stands for Connected and Autonomous Vehicle) for cars and buses.
CAVstar provides 360-degree situational awareness, detects obstacles, and controls braking, acceleration, and steering in a variety of weather conditions. Fusion Processing has driven 1.6 million kilometers with CAVstar-equipped vehicles in Bristol, London, and Manchester in the UK. Fusion’s systems have also been deployed in Sweden, Canada, and Australia.
CAVstar is at the heart of SAE Level 4 Enviro200 driverless single-decker buses operated by British transport company Stagecoach in partnership with bus manufacturer Alexander Dennis Limited (ADL).
Stagecoach’s autonomous CAVstar-equipped buses have repeatedly and successfully navigated two enclosed depots in Manchester and Cambridge, England. Stagecoach, ADL and Fusion Processing hope to test-operate Enviro200 AV buses in the near future on a scheduled service along a public 30-mile route between Fife and Edinburgh in Scotland, crossing the Forth Road Bridge. The goal is for the buses to carry at least 10,000 passengers per week. For these public tests, a backup driver will be aboard the vehicles but will have no driving duties.
Starting in July 2019, CAVstar was used for a series of tests of “platooning” of driverless heavy goods vehicle (HGV) trucks as part of the HelmUK Advanced Platooning Trial in Bristol. Platooning is a process whereby several trucks follow a lead truck’s braking, acceleration, and steering while maintaining a safe distance between each vehicle. This results in increased fuel efficiency, heightened safety, and lower carbon emissions. Partners in the platooning trial included TRL (formerly Transport Research Laboratory), UK heavy truck market leader DAF Trucks, logistics company DHL, the UK’s Department for Transport (DfT), and Highways England.
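The gap-keeping behavior at the heart of platooning is commonly implemented as a constant-time-gap spacing policy. The sketch below is a minimal illustration of that generic policy, with invented gains and limits; it is not Fusion Processing’s actual CAVstar control law.

```python
# Illustrative constant-time-gap follower controller for truck platooning.
# Gains, limits, and the standstill gap are invented placeholder values.

def follower_accel(gap_m, ego_speed_mps, lead_speed_mps,
                   time_gap_s=1.0, standstill_m=5.0,
                   k_gap=0.23, k_speed=0.07):
    """Return an acceleration command (m/s^2) that holds a safe gap.

    The follower targets `standstill_m + time_gap_s * ego_speed`, so the
    desired gap grows with speed, as in production adaptive cruise control.
    """
    desired_gap = standstill_m + time_gap_s * ego_speed_mps
    gap_error = gap_m - desired_gap            # positive -> too far back
    speed_error = lead_speed_mps - ego_speed_mps
    accel = k_gap * gap_error + k_speed * speed_error
    return max(-3.0, min(1.5, accel))          # clamp to comfort limits

# Follower is faster than the leader and too close: brake.
print(follower_accel(gap_m=20.0, ego_speed_mps=25.0, lead_speed_mps=24.0))
```

Because the desired gap scales with speed, a platoon that accelerates automatically spreads out, and one that slows bunches back up, which is where the fuel savings from drafting come from.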
German semiconductor manufacturer Infineon was spun off from industrial manufacturing firm Siemens in 1999.
Infineon makes components such as chips that handle radar, LiDAR, camera, and image sensing data for ADAS applications, all designed to comply with ISO 26262 safety methodology. The company also provides components for driver monitoring, emergency calling, controller area network (CAN) transceivers, electronic control units (ECUs), vehicle-to-everything (V2X) communication, and car data security systems. In March 2015, Infineon made an undisclosed investment in German AV platform company TTTech (see below).
High-performance, ASIL-D-capable Infineon AURIX TriCore lock-step multicore microcontrollers are part of the zFAS central driver assistance systems made by TTTech for carmaker Audi, including the unit in the 2018 Audi A8, the world’s first SAE Level 3-enabled production car.
In 2017, Infineon announced that its AURIX microcontrollers were components of NVIDIA’s (see the second North America article in this series) AI-enhanced DRIVE PX 2 AV platform.
In the DRIVE platform, Infineon’s microcontrollers act as decision-makers (or “voters”) for ASIL-D-compliant tasks and as in-vehicle interfaces for Ethernet, CAN, LIN, and FlexRay network communications.
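The “voter” role described above is the classic triple-modular-redundancy pattern used in safety controllers: redundant compute channels produce results independently, and a simple, highly assured component accepts a result only when a majority agrees. The following is a conceptual sketch of that generic pattern, not Infineon’s AURIX firmware.

```python
# Illustrative 2-out-of-3 (triple-modular-redundancy) majority voter,
# the generic pattern behind "voting" safety microcontrollers.

from collections import Counter

def vote_2oo3(a, b, c):
    """Accept a result if at least 2 of 3 redundant channels agree.

    Returns (value, healthy). If all three channels disagree, the vote
    fails and the system should fall back to a safe state.
    """
    value, count = Counter([a, b, c]).most_common(1)[0]
    if count >= 2:
        return value, True
    return None, False       # no majority: trigger safe-state handling

print(vote_2oo3("brake", "brake", "steer"))   # ('brake', True)
print(vote_2oo3("a", "b", "c"))               # (None, False)
```

A real ASIL-D voter also has to run in lock-step hardware and bound its own failure rate, but the decision logic reduces to this kind of majority check.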
The DRIVE platform allows users to access AURIX via an AUTOSAR-compliant software stack. This permits higher-level application layers to be easily adapted, reducing development time by as much as 20 to 40 percent.
To support the collaboration with NVIDIA, Infineon established its own Silicon Valley Automotive Innovation Center (SVAIC) to work with the chipmaker and other technology partners.
In 2018, Infineon announced that its AURIX microcontrollers would be used in NVIDIA’s scalable and efficient SAE Level 5-capable DRIVE Pegasus AV platform. Network channels the AURIX interfaces with include FlexRay, Gigabit Ethernet and CAN FD.
In 2018, at CES Las Vegas, Infineon announced it would be joining the vendor ecosystem of the Apollo AV open platform developed by Chinese search-engine company Baidu (see the first Asia article in this series).
In May 2018, Infineon declared that it would be leading an automotive industry consortium of 15 companies and organizations focused on AV data security. The consortium is called the Security for Connected, Autonomous Cars (SecForCARs). Partners in the consortium include Audi, Volkswagen, and Bosch (see the first EMEA article in this series).
At CES 2020 in Las Vegas, microprocessor manufacturer Qualcomm (see the second North America article in this series) announced it would be integrating AURIX safety controllers into its Snapdragon Ride AV platform.
Navya is a French firm that was born out of the ashes of Induct, a nascent French robotics venture that had developed an electric self-driving van called the Navia. The Navia was introduced and demonstrated at the 2014 CES and was further tested at the site of the Civaux Nuclear Power Plant in Civaux (Vienne), France. The investment fund Robolution Capital took over the assets of Induct and changed the company’s name to Navya in 2014.
In 2015, the company began trials of a driverless, manual-control-free, SAE Level 5 15-passenger electric AV shuttle called the Autonom. Initial tests took place in Civaux and Saône in France and Greenwich in the UK. In 2015, the CarPostal/Postbus subsidiary of the national Swiss Post postal service ordered two Autonom AV shuttle vehicles that were installed in Sion, Switzerland.
Featuring LiDAR and camera sensors, GPS, an inertial measurement unit (IMU), and deep-learning algorithms, the Autonom can operate in mixed traffic at a maximum of 25 kph. Navya allows customers to customize their shuttles’ exterior colors and logo placement.
In 2016, French AV technology vendor Valeo and global transport company Keolis invested in Navya with the aim of contributing to and realizing value from Navya’s AVs.
Later in 2016, Keolis and Navya jointly launched a 1.3-kilometer, publicly accessible AV shuttle service called Navly, which operates five days per week in the Confluence district of Lyon, France. Keolis and Navya also arranged trials of Navya shuttle vehicles at CES in Las Vegas and at the Christchurch International Airport in New Zealand.
In 2017, Keolis, Navya and IDF Mobilités agreed to cooperate on AV shuttles for three different tracks of the La Défense district of Paris, which hosts 500,000 visitors daily. A further agreement was signed with France’s NEoT Capital for a turnkey service solution for Autonom shuttle vehicles.
At the beginning of 2017, Navya sold two Autonom shuttle vehicles to SB Drive, an AV subsidiary of Japan’s SoftBank, which has smart mobility partnerships with four Japanese municipalities. In May 2017, Navya signed an agreement with Western Australia’s Royal Automobile Club (RAC) to deploy Autonom shuttles in Australia, New Zealand, and other parts of the Asia-Pacific region.
In 2017, Navya delivered two driverless AV shuttle vehicles to the University of Michigan Mobility Transformation Center (MTC) to carry passengers around the university campus. Later in 2017, Navya opened the first of its production facilities in Vénissieux, France. A second facility was opened in Michigan in the U.S. shortly thereafter.
The Autonom Cab and Tract
At the end of 2017, Keolis, Valeo, and Navya launched an electric, app-ordered, driverless, manual-control-free, infotainment-equipped, six-passenger AV robo-taxi called the Navya Autonom Cab.
Designed for first- or last-mile private or shared operation, the Autonom Cab can, Navya claims, reach speeds of up to 90 kph and is engineered for sustained 50-kph driving. Like the Autonom shuttle, the Autonom Cab can be customized with customers’ exterior colors and logo placement. So far, the Autonom Cab has undergone tests in France and Australia.
As of June 2019, Navya had sold and installed more than 130 units of its Autonom shuttle. Navya has also introduced its LEAD remote supervision platform for managing fleets of Autonom and Cab AVs. The company’s third AV, an electric freight tractor called the Autonom Tract, designed for the movement of goods, was recently introduced for use in applied settings such as warehouses and airports.
Netherlands-based semiconductor firm NXP was spun off from Philips Electronics in 2006 and merged with Freescale Semiconductor in 2015. American AV platform developer Qualcomm (see the second North America article in this series) tried to buy NXP in 2016 but was unsuccessful.
NXP manufactures ISO 26262-compliant radar, camera sensors, components for ADAS, instrument and infotainment systems, driver and occupant monitoring systems, and secure automotive networking and connectivity systems (including V2X, gateways and intelligent roadside units).
It also produces powertrain, vehicle dynamics, safety and security, and adaptive vehicle components.
NXP’s own machine learning- and AI-enhanced AV platform is called BlueBox. Like competitor ZF (see below) with its “See-Think-Act” credo, NXP has an AV strategy it calls “Sense-Think-Act.” The AV platform in each case serves as the “Think” stage.
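The “Sense-Think-Act” architecture NXP describes (and ZF’s near-identical “See-Think-Act”) can be pictured as a three-stage loop. The sketch below is purely schematic; every function body is an invented stand-in for proprietary sensor fusion, planning, and actuation code.

```python
# Schematic "Sense-Think-Act" loop. All stage implementations are
# hypothetical stubs; only the three-stage structure reflects the
# architecture described in the article.

def sense(world):
    """Fuse raw sensor feeds into a world model (stub)."""
    return {"obstacle_ahead_m": world["obstacle_m"],
            "speed_mps": world["speed"]}

def think(model, safe_distance_m=30.0):
    """The AV platform's 'Think' stage: choose a maneuver."""
    if model["obstacle_ahead_m"] < safe_distance_m:
        return "brake"
    return "cruise"

def act(decision):
    """Map the decision to an actuator command (stub)."""
    return {"brake": -2.0, "cruise": 0.0}[decision]

world = {"obstacle_m": 18.0, "speed": 14.0}
print(act(think(sense(world))))   # -2.0 : obstacle inside safe distance
```

Platforms such as BlueBox occupy the middle “Think” stage, which is why vendors can sell it separately from sensors and actuators.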
The high-performance, low-power BlueBox is more of an open development platform than an out-of-the-box “solution” for AVs. The latest iteration of the BlueBox, the BLBX2 family, integrates NXP’s S32V234 vision and sensor fusion processor, the company’s LS2084A multicore embedded processor, and its S32R27 radar microcontroller.
The BlueBox BLBX2-DB features CSE and ARM TrustZone-secured ASIL-B computing with automotive interfaces and vision acceleration, as well as an ASIL-D subsystem with dedicated interfaces. It’s capable of mapping, object detection, localization, classification, and situational assessment, and it offers global path, behavior, and motion planning decision-making. Development is easily customized via an open, Linux-based system with Robot Operating System (ROS) support and programmability in C, making the platform suitable for mass-market production AVs.
Automated Drive Kit
For those looking for a more complete solution, the BlueBox BLBX2-DB sits at the center of an NXP Automated Drive Kit (ADK) available from Hexagon/AutonomouStuff, which also includes a NovAtel IMU-IGM-S1 Inertial Measurement Unit (IMU), a NovAtel GPS Enclosure, a Velodyne Puck LiDAR sensor, Delphi ESR radar, Neusoft ADAS Vision Software, and Robot Operating System (ROS) Middleware Software Service Installation. The ADK is compatible with Baidu’s Apollo platform and LiDAR object processing (LOP) software.
At CES 2019, NXP announced it would co-develop a safety-focused, open commercial AV platform based on the BlueBox with semiconductor partner Kalray. Kalray’s manycore MPPA processors enable perception machine learning that can help push the platform from SAE Levels 2 and 3 to SAE Levels 4 and 5. Software that can run on the platform includes Baidu’s Apollo (see the first Asia article in this series).
At CES 2020, NXP announced an end-to-end SAE Levels 2 to 4 AV simulation platform with software and hardware-in-the-loop systems to help carmaker OEMs develop and validate processes for safety.
For this platform, the BlueBox is complemented by a Kalray Coolidge Massively Parallel Processor Array (MPPA) perception accelerator with AI software for object detection and classification application support.
Also present is a dSPACE ASM Traffic simulation environment, complete with traffic, sensor simulation, a full vehicle dynamics model, and a battery-electric (BEV) powertrain.
And finally, Embotech’s Forces Pro and ProCruiser real-time optimal control software and highway planner enable Lane Change Assist (LCA), Adaptive Cruise Control (ACC), Traffic Jam Assist (TJA), Emergency Braking (EB), Double-Lane Change (DLC) maneuvers, and Emergency Safe Stop (ESS), among other functionality.
Founded in 2014, Oxbotica is a self-driving AV software platform provider based in Oxford, England. The firm’s modular, hardware-agnostic, cloud-integrated, low-compute-power, GPS- and maps-optional Universal Autonomy AV software uses best-in-class LiDAR and radar for localization.
The flexible and licensable Universal Autonomy suite includes a full set of installation and operating tools, including simulation, mapping, risk management, audit, and insurance. It’s designed for a wide range of OEM carmakers, fleet operators, delivery firms and/or transport companies for on- and off-road utility vehicles, robo-taxis, shuttles, and more.
Oxbotica has tested its Universal Autonomy/Selenium suite in all types of weather and light conditions on roads in Europe, North America, and Asia over the last five years.
London Testing with Addison Lee
After successfully testing a number of modified AVs in Oxford, Oxbotica will soon begin more rigorous testing of its AV platform in London with taxi firm Addison Lee (which leads the DRIVEN consortium as well as the UK’s MERGE Greenwich AV consortium). Also participating are Internet cybersecurity and domain-registrar firm Nominet and insurer AXA, both members of the UK’s DRIVEN AV consortium.
“Being autonomous before [the end of 2019] showcased the huge amount of work Oxbotica’s expert team of engineers has completed since the DRIVEN consortium was established,” said Graeme Smith, CEO of Oxbotica. “These trials [in Oxford] further demonstrated to the wider UK public that connected and autonomous vehicles will play an important role in the future of transport. This milestone showed the advanced state of our capabilities and firmly keeps us on the road to providing the technology needed to revolutionize road travel.”
TTTech is an Austrian safety control and real-time network company that was spun off from the Vienna University of Technology in 1998. In 2013, TTTech agreed to collaborate with NXP (see above) on automotive Ethernet switches.
At the 2014 Las Vegas CES, TTTech showed a prototype of its zFAS electronic control unit (ECU) with Audi, continuing collaborative efforts that began in 2001. (Audi is a minority investor in TTTech.)
In March 2015, semiconductor maker Infineon (see above) made an undisclosed investment in TTTech.
In 2016, TTTech announced an agreement with Japan’s Renesas Electronics (see the second Asia article in this series) to collaborate on a new ADAS platform.
In 2017, TTTech announced a strategic partnership with Samsung/Harman to cooperate on AV and safety technology.
Later that year, Audi launched the world’s first SAE Level 3 production car, the 2018 A8. It included TTTech’s SAE Level 5-capable, scalable, modular MotionWise software framework and zFAS ADAS platform, the latter of which integrated ASIL-D-capable microcontrollers from Infineon.
In 2018, TTTech signed a cooperative agreement with BMW to develop SAE Level 3 and 4 AV functionality for highway driving.
Later in 2018, TTTech established the “Technomous” joint venture with SAIC Group, China’s largest automaker, with the aim of mass-producing consumer AVs in China. At the end of 2018, as the first fruit of this venture, SAIC introduced the MARVEL X, the first mass-production Chinese car with SAE Level 3 AV capabilities.
In June 2018, TTTech organized a new, specialized automotive-focused subsidiary, TTTech Auto, together with investment partners Infineon, GE Ventures, Samsung/Harman (see the second Asia article in this series) and Audi.
In September 2019, TTTech Auto co-founder Ricky Hudi spoke at an AV industry event in Vienna as chairman of “The Autonomous,” an industry group of AV companies and carmakers dedicated to safe mobility. The group counts companies such as Aptiv, Audi, BMW, Continental, Daimler, Five.ai, Infineon, Intel, NVIDIA, NXP, Renesas, SAIC, Samsung, and Volkswagen among its members.
Sweden’s Veoneer is a spinoff of auto safety systems company Autoliv.
Besides making ADAS software, localization and positioning systems, braking systems, radar, LiDAR and camera systems, Veoneer has its own SAE Level 4-capable AV platform it calls Zeus.
Zeus uses DRIVE AGX Xavier chips from NVIDIA (see the second North America article in this series) as well as software from NVIDIA and Zenuity, a company co-founded by Veoneer and automaker OEM brand Volvo—itself owned by China’s Geely Group.
At CES 2019, Veoneer announced an SAE Level 2+ AV platform that would be available starting as early as 2020. It also unveiled autonomous valet parking for its Learning Intelligent Vehicle (LIV) prototype cars (see below).
In March 2019, Veoneer announced that its AV components, as well as software from Zenuity, would be integrated into the Polestar 2, an electric vehicle (EV) from Geely.
In June 2019 at the American Center for Mobility in Ypsilanti, Michigan, Veoneer demonstrated third-generation, deep-learning-enabled LIV 3.0 prototype cars equipped with its AV platforms. The company showed off multiple features, including adaptive cruise control (ACC) with stop and go, traffic jam assist, autonomous emergency braking, and lane-keeping assist as well as driver monitoring and cloud-based mapping.
In late 2019, Veoneer joined the Autonomous Vehicle Computing Consortium (AVCC), joining automaker OEMs such as General Motors and Toyota and other AV platform companies such as Bosch, Continental, NXP (see above), NVIDIA (see the second North America article in this series), Renesas (see the second Asia article in this series), and components companies ARM and Denso.
At CES 2020 in Las Vegas, Veoneer demonstrated a prototype SAE Level 4-equipped AV on public roads.
Founded in 2017 in Cambridge, England, software AV platform company Wayve is now based near King’s Cross Station in London.
Wayve has been testing its platform using a fleet of Renault Twizy microcars and Jaguar I-PACE electric SUVs on public roads since 2018.
A Radically Different Approach
As opposed to many heavily funded autonomous tech vendors and carmaker OEMs, Wayve has taken a radically different approach to developing its much-talked-about platform, which is optimized for complex European urban driving environments.
Instead of employing a huge cache of HD maps or a raft of ultra-costly radar and LiDAR sensors, its platform has been trained by observing real-world human driving with cameras and a basic sat-nav and by utilizing reinforced, simulation-based machine learning.
In summary, Wayve’s end-to-end conditional imitation-learning, AI-driven platform is the first to demonstrate learned full-vehicle control (both lateral and longitudinal) following a user-prescribed route in urban environments and interacting with other traffic.
For this feat, Wayve was inspired by other simulation learning, by learned approaches to parts of typical AV platforms such as motion planning, and by chipmaker NVIDIA’s demonstrations of steering-only control and interpretability on novel roads without traffic.
The company’s platform iterates learning through three basic processes: exploration, optimization, and evaluation. Its network architecture uses less than 10,000 parameters—as opposed to the tens of millions employed by its competitors.
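To make the under-10,000-parameter claim concrete, the sketch below counts the parameters of a toy control network. The layer sizes here are invented for illustration; Wayve has not published its actual architecture.

```python
# Toy illustration of how small a sub-10,000-parameter driving policy is.
# The input features, layer sizes, and outputs are all invented.

import numpy as np

rng = np.random.default_rng(0)

def dense(n_in, n_out):
    """One fully connected layer: a weight matrix plus a bias vector."""
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

# compressed image features + route command -> hidden -> [steer, speed]
layers = [dense(64, 48), dense(48, 32), dense(32, 2)]

def policy(x):
    for w, b in layers:
        x = np.tanh(x @ w + b)
    return x            # lateral and longitudinal control, in [-1, 1]

n_params = sum(w.size + b.size for w, b in layers)
print(n_params)         # 4754 -- comfortably under 10,000
```

Even this three-layer network stays under half of Wayve’s stated budget, a striking contrast with the tens of millions of parameters in conventional perception stacks.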
After achieving its momentous success, Wayve came to three conclusions: that AV data distribution is highly important, that robust computer vision representations matter, and that multiple learning signals for data efficiency must be used.
Wayve makes the point that the AV industry’s collective $50 billion expended, 10 years spent, and 20+ million miles driven have not resulted in commercial SAE Level 4+ cars on public roads. For Wayve, end-to-end machine learning is the only logical alternative—without maps, rules, expensive sensor-and-compute suites, or already-known roads.
Low Power, Less Cost
Wayve propagates uncertainty throughout its models, making computation highly efficient; the whole system runs on the equivalent of a laptop computer. The system learns from patterns rather than from explicit instructions. Compared with other platforms, Wayve’s uses 10 percent of the power and costs 90 percent less.
On its website, Wayve has videos of its AVs navigating streets they’ve never previously driven on, in a variety of weather conditions, using only simple cameras and a sat-nav map route from a consumer-grade GPS. Typical obstacles include vehicular traffic and cyclists.
Publications such as Forbes and the Washington Post have praised the company’s unorthodox approach. Wayve says that its platform learned to follow driving lanes from scratch in 20 minutes.
The company boasts that its platform algorithm “dreams about driving, getting better with every dream.” Wayve claims it will be the first company to deploy AVs in 100 cities globally.
Yandex is a Russian Internet company whose search engine is the fifth-most-popular in the world.
Like Google with its Waymo sister company, Yandex has been developing its own AVs since at least 2016. Like Waymo, Yandex’s AVs have accumulated an impressive number of real-world driving miles—two million as of February 2020. (AVs from only four other companies—Waymo, Baidu, Cruise, and Uber—have driven a million miles or more.)
Presently, Yandex’s 110-unit fleet of AVs is licensed to drive on public roads in three countries — Russia, the United States, and Israel.
Yandex’s first AV was a modified Toyota Prius that used sensors from LiDAR maker Velodyne and chips from Intel and NVIDIA (see the second North America article in this series). Today, modified Priuses make up half of Yandex’s AV fleet.
SAE Level 4 Operation
In 2018, Yandex began operating an SAE Level 4 robo-taxi service using its vehicles in the Russian town of Innopolis. The Yandex AV group is now known as Yandex.Taxi.
At CES in 2019 and 2020, Yandex.Taxi demonstrated its AVs on public roads in Las Vegas. In both Innopolis and Las Vegas, there were no drivers at the steering wheel during trips; a Yandex.Taxi representative sat in the front passenger seat for safety purposes.
Like Waymo, Yandex.Taxi has developed its own proprietary LiDAR sensors, vastly reducing the cost of components for its AVs.
Partnership with Hyundai
In 2019, Yandex.Taxi signed a cooperation agreement with South Korea’s Hyundai to jointly develop SAE Levels 4 and 5 cars. The first production model to incorporate Yandex technology was the SAE Level 2-capable 2020 Hyundai Sonata.
Industry observers have noted that while Yandex discloses the number of miles/kilometers its AVs have driven, it does not disclose the number of “disengagements” (instances in which a human must take over driving) its AVs have experienced while in service.
German automotive part vendor ZF Friedrichshafen has been producing components for advanced driver assistance systems (ADAS) such as actuators, electronic control units (ECUs) and camera, LiDAR and radar sensors for years.
The Beginning of ZF AVs
In the mid-2010s, ZF realized that real AV smart systems came from combining sensor data to enable truly intelligent autonomous decision-making instead of mere single-source-driven reactions. The company developed a “See-Think-Act” credo for the evolution of its future AV technology.
In 2015, ZF acquired TRW Automotive, a U.S. company specializing in car safety technology. In 2016, ZF formed the subsidiary Zukunft Ventures GmbH to pool investment funds for AV startups it wished to invest in.
At the 2017 Consumer Electronics Show (CES) in Las Vegas, ZF unveiled its cloud-updateable, deep-learning, low-power, high-performance ProAI AV central electronic control unit (ECU). ZF called it “the world’s most powerful automotive supercomputer.”
Currently, ProAI uses the Xavier artificial intelligence (AI) chip and Turing GPU from the U.S.’s NVIDIA (see the second North America article in this series). ProAI is designed for automated highway driving and enables functions such as Evasive Maneuver Assist, Highway Driving Assist, Traffic Jam Assist, and SafeRange.
In 2018, ZF rolled out its cubiX software, which takes control of a vehicle’s chassis systems (steering, shock absorbers, driveline, and brakes) to ensure a comfortable ride for all passengers. cubiX interfaces in a plug-and-play manner between the vehicle’s ADAS functions (which calculate speeds and trajectories) and its actuators.
cubiX is designed to work with advanced drivetrain components (all of which ZF manufactures), such as electronic braking systems (EBS), electric power steering (EPS), electrically controlled shock absorbers, electric-drive modular rear axles, and rear-wheel steering.
It can also network with vehicular active dynamic driving systems such as active roll stabilization, torque vectoring, and electromechanical roll control (ERC) to improve comfort, performance, and safety.
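The plug-and-play role that cubiX plays can be pictured as a coordinator that fans one ADAS motion request out to whatever actuators a given vehicle actually has. The following is a hypothetical sketch of that pattern; the class, actuator names, and command formats are invented, not ZF’s API.

```python
# Hypothetical plug-and-play chassis coordinator in the spirit of cubiX:
# one layer translates an ADAS-planned motion request into commands for
# whichever actuators are registered. All names here are invented.

class ChassisCoordinator:
    def __init__(self):
        self.actuators = {}          # name -> callable(request) -> command

    def register(self, name, handler):
        """Plug in an actuator (EPS, EBS, dampers, rear steer, ...)."""
        self.actuators[name] = handler

    def apply(self, request):
        """Fan a single motion request out to every registered actuator."""
        return {name: fn(request) for name, fn in self.actuators.items()}

coord = ChassisCoordinator()
coord.register("eps", lambda r: f"steer {r['curvature']:+.3f} 1/m")
coord.register("ebs", lambda r: f"decel {r['accel']:+.1f} m/s^2")

print(coord.apply({"curvature": 0.012, "accel": -1.5}))
```

The design point is that the planning layer never needs to know which actuators exist: adding rear-wheel steering or active dampers is just another `register` call.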
Thus, ZF now makes all three critical components of AVs: sensors, the AV platform “brains,” and active chassis parts of vehicles. For carmaker OEM customers of ZF, this streamlines sourcing of AV components and turns ZF into a “one-stop shop” for all their AV needs.
ZF Demonstration Vehicles
ZF presently has an array of prototype vehicles outfitted with ZF sensors, ProAI, and cubiX that the company uses to develop and demonstrate its AV functionality.
One of these vehicles is the SAE Level 4-capable Vision Zero EV passenger car, which aims at the company’s goal of enabling vehicles to eventually have both zero emissions and zero accidents. The Vision Zero car incorporates functions such as Driver Distraction Assist, Wrong-Way Inhibit, Comfort Steering, Assisted Driving, Automated Driving, and Automated Overtaking.
The ZF Innovation Truck is a full-size truck cab that can drive itself within a defined geofenced area and detach from and attach to fully loaded freight trailers without a driver, maneuvering that is extremely taxing for a human driver when done manually.
The ZF Terminal Yard Tractor is a substitute truck cab that autonomously connects with freight trailers parked in static positions and guides them to nearby loading docks for loading and unloading. When this is finished, it takes the trailers back to where truck cabs can connect to them.
The ZF Innovation Van is a self-parking electric SAE Level 4 delivery van. It can intelligently calculate the most efficient route between delivery locations and navigate through urban and rural environments, even when road signs, traffic lights, and lane markings are absent.
The ZF Innovation Tractor is a traditional, low-speed agricultural-style tractor with electric traction that can intelligently connect to and detach from various pulled or hitched cargo carriers. The Tractor employs ZF’s SafeRange system, which can automatically detect pedestrians and cargo hitches.
In 2017, ZF announced it would be working with Chinese search-engine company Baidu to enhance Baidu’s Apollo digital mapping and AV technology with ZF’s ProAI platform. The companies are testing out this new technology in Beijing with Chinese car-sharing operator Pand Auto. The Chinese government has supported the construction of a 130,000-square-meter driving test site to enable these efforts. In 2018, ZF signed a cooperation agreement with Chinese OEM Chery to collaborate on SAE Level 3 AVs for Chinese production.
In January 2019, ZF said it would be bringing an optionally driverless, 15-person “people mover” electric AV to market in conjunction with European mobility firms e.Go and Transdev.
For this effort, ZF will supply the AV platform and steering, braking, and EV drive components, and production is slated to be in the tens of thousands of units.
ZF CEO Wolf-Henning Scheider explained that the people movers will initially “drive at a very low speed into the city center[s]. These are use cases that will take us to automatic driving functions step by step.”
ZF clearly foresees that as shared mobility cuts down on the need for people to own individual cars, “people movers” will likely become much more predominant, particularly in cities. This will necessitate major changes in vehicle manufacturing, supply chains, and parts.
For these electric people mover-type vehicles, ZF has developed a universal Intelligent Dynamic Driving Chassis (IDDC), which comprises EasyTurn front axles and modular semi-trailing arm rear suspension (mSTARS) with built-in Active Kinematics Control (AKC) rear-axle steering. The IDDC can be used modularly as a versatile platform for different mobility uses—cargo or passengers.
In March 2019, ZF acquired a 60-percent stake in Dutch mobility specialist 2getthere, which has been providing AV solutions since at least 1984, when the company used wires in the ground to guide its early AVs along fixed paths. Later, the company patented an embedded magnetic, grid-based system that worked with sensors mounted on its vehicles.
At first, 2getthere’s vehicles carried only cargo, but in 1997, the firm began making “last-mile” shuttles for people, installing them in locations such as Amsterdam’s Schiphol Airport. These vehicles also worked with a separate-lane grid system embedded in the road. Altogether, 2getthere’s dedicated AVs have carried 14 million passengers more than 100 million kilometers. Further installations included business parks in Rotterdam and Abu Dhabi. Future sites for 2getthere include Zaventem Airport in Brussels, where its vehicles will mix with regular traffic for the first time.
Flying Carpet 2.0
In July 2019, ZF announced its Flying Carpet 2.0 predictive chassis, which uses cubiX (see above) and predictively activates ZF sMOTION shock absorbers based on upcoming road hazards such as bumps or potholes. It takes trajectory and speed information from the vehicle’s ADAS and makes on-the-fly continuous damping control (CDC) adjustments to chassis actuators as necessary.
This results in a smoother and safer ride, particularly for today’s drivers, who in future AVs may become passengers. For the surprisingly large population of people prone to kinetosis (motion sickness) in vehicles, this smoother, more comfortable ride is critical to AV acceptance. Greater confidence in and acceptance of AVs will, in turn, encourage the use of robo-taxis and ride-hailing services in the coming years.
At CES in 2020, ZF previewed three new AV platforms for future production vehicles:
coASSIST is an affordable (sub-$1000), Euro NCAP 2024-compliant SAE Level 2+ AV platform that concentrates mainly on safety functionality, with some comfort features. It incorporates radar and a front camera, but no LiDAR, and requires no changes to a vehicle’s architecture.
coASSIST includes traffic jam support with full-speed-range adaptive cruise control, driver-initiated lane changes, lane centering, and automatic emergency braking. It makes use of EyeQ camera technology from AV supplier Mobileye, a subsidiary of Intel (see the second North America article in this series). According to ZF, an unnamed Asian OEM has committed to launching a new production car with coASSIST by the end of 2020.
For higher-end car models, ZF announced the Euro NCAP 2024-compliant coDRIVE platform. coDRIVE integrates 360-degree perception of pedestrians and cyclists, driver monitoring, tracking of other vehicles’ trajectories and driving paths, and more sophisticated ADAS functionality for traffic jams and highways. This includes hands-free and feet-free automated highway entry and exiting, AI-enhanced lane changes, and overtaking. There’s also localization and redundant mapping of roads and routes. Like coASSIST, coDRIVE makes use of Mobileye’s EyeQ.
Finally, ZF’s SAE Level 2+ to 4 coPILOT open platform will allow carmaker OEMs to add ZF’s cognitive ProAI technology to their vehicles while making room for additional custom/third-party AV functionality. ZF has demonstrated coPILOT’s features, such as pedestrian and cyclist detection, hands-free/feet-free highway driving with active steering, automated lane changing, and lane-keeping assist. There’s also automated valet and garage parking.
With coPILOT, a driver will be able to use voice commands to tell the system to change lanes, overtake another vehicle, or merge into moving traffic. coPILOT has intelligent navigation with real-time environment visualization, localization, and mapping that learns repeated routes. It also features driver monitoring, so if a driver appears to be distracted or falling asleep, the system can signal an alert. Like coDRIVE, coPILOT is expected to enter the marketplace in 2021 or 2022.
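The driver-monitoring behavior described above reduces to a simple rule: alert when the driver’s eyes have been closed, or their gaze has been off the road, for longer than some limit. The sketch below illustrates that logic; the signal names and time limits are hypothetical assumptions, not part of ZF’s coPILOT interface.

```python
# Hypothetical driver-monitoring check; signal names and limits are
# illustrative assumptions, not ZF's actual coPILOT interface.
def should_alert(eyes_closed_s: float, gaze_off_road_s: float,
                 eyes_limit_s: float = 1.0,
                 gaze_limit_s: float = 2.0) -> bool:
    """Alert if the eyes have been closed longer than eyes_limit_s,
    or the gaze has been off the road longer than gaze_limit_s."""
    return eyes_closed_s > eyes_limit_s or gaze_off_road_s > gaze_limit_s
```

In practice the input durations would come from an in-cabin camera’s gaze- and eyelid-tracking pipeline, with the limits tuned per jurisdiction and speed.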
For ZF, different types of cities will require different types of mobility. As ZF CEO Scheider says, “‘Next-Generation Mobility’ is not about the question of whether individual transport has a future or which type of mobility will prevail. If people and goods are to be mobile on a sustainable basis, existing conditions need to be built on to find an individually suitable solution. The right ‘modal split’ and the networking of the various modes of transport will be the signatures of a new urban mobility. The routes to achieve this are manifold. ZF’s solutions and technologies are geared to this diversity.”