Voyage's vehicles are equipped with the best sensors available, giving us a 360-degree picture of the world hundreds of meters from the car.
(Source: Voyage)

An Overview of Autonomous Car Tech Platforms—North America, Part II

Author / Editor: Seth Lambert / Erika Granath

Plenty of automakers have announced or shown off self-driving vehicles. But the majority of these companies are not creating their own self-driving hardware and software systems in-house; they’ve either acquired specialized firms to create these systems for them or are outsourcing the work to third parties. Some of the most respected leaders in this growing market segment are in the United States and Canada.

There’s lots of talk about autonomous vehicles (AVs) these days, and a fair number of prototype cars and trucks are currently on public roads. There are also fully functioning autonomous robo-taxi services ferrying passengers around various global locales, occasionally without anyone in the driver’s seat.

But for all the talk and test vehicles in the marketplace, the number of actual “platforms”—the intelligent technology at the heart of AVs—remains limited. The reason for this is simple—creating these platforms is a major endeavor requiring significant capitalization, copious systems engineering, and rigorous testing. For now, this work is better suited to technology companies (ones with experience in computer hardware and software), rather than automakers. (For the purposes of these articles, automakers may be referred to as original equipment manufacturers, or OEMs.)

This explains why some of the biggest car companies in the world have either outsourced their tech platforms or have purchased companies specializing in creating them.

A new awareness of the technology's limitations has led a number of vendors to focus their efforts on delivering fleets of robo-taxi AVs rather than production vehicles, in order to gain valuable real-world testing experience. Another reason for this focus on fleets and taxi services is that individual mobility may decrease in the future in favor of multi-person or group mobility.

Some companies have chosen to collaborate or share data with their former competitors in an effort to lower the costs of developing their own AV platforms. This has created an often confusing web of alliances, licensing agreements, and investment partnerships in the automotive industry.

This article is the second in a series comparing and contrasting the dominant industry players and their technology platforms, often referred to in the industry as “stacks.” This article further explores companies based in North America, while additional articles in this series examine those in the Europe/Middle East/Africa (EMEA) region, and in Asia. Specifically, this article covers MobilEye/Intel, NVIDIA, Qualcomm, Tesla, Uber ATG, Visteon, Voyage, and Waymo.

If you've not read the first article in the series yet, you can find it here.

North American AV tech platform players—M-Z

The following companies have developed a broad range of AV platform technology for disparate OEM purposes. For some firms, production cars for consumers are the primary target application. For others, the main focus may be autonomous trucking, suburban ridesharing, middle-mile delivery, or low-speed shuttles.

Note that not every company listed below produces or incorporates a complete standalone platform. But each produces critical hardware and software that enable AV platform functionality.

MobilEye/Intel

MobilEye's Responsibility-Sensitive Safety (RSS) is an open, transparent, and non-proprietary proposal for the entire AV industry.
(Source: MobilEye)

Venerable microprocessor maker Intel has been involved in the auto industry for some time via technology supplied for advanced driver assistance systems (ADAS) for dozens of carmaker OEMs. By 2019, Intel had ADAS components installed in more than 34 million vehicles worldwide, enabling up to Society of Automotive Engineers (SAE) Autonomy Level 2 (“hands-free driving”).

In 2017, microprocessor firm Intel purchased Israeli AV tech firm MobilEye for USD$15.3 billion.

Prior to this purchase, MobilEye had been working with Germany’s BMW on AV hardware, with 40 BMW 7-series cars being tested on public roads. Other companies such as General Motors and Volvo were purchasers of MobilEye’s EyeQ system-on-a-chip (SoC) image processing components.

MobilEye already had cooperative agreements with Japanese parts supplier Denso, Canadian parts company Magna (see the first article in this series), American AV platform maker Visteon (see below), Irish parts firm Delphi (later Aptiv, see the first EMEA article in this series), and German parts maker Continental (see the first EMEA article in this series). MobilEye had supplied components and chips for ADAS systems to multiple automotive OEMs for over a decade by the time Intel purchased the company.

MobilEye Aftermarket Products

MobilEye’s aftermarket ADAS products included its Forward Collision Warning (FCW) system (compliant with U.S. National Highway Traffic Safety Administration regulations as of 2011) and Advanced Warning Systems (AWS-4000 and AWS-1000). MobilEye’s aftermarket C2-270 Collision Prevention System warned drivers about other vehicles, pedestrians, bicycles and motorcycles.

In 2015, Tesla (see below) agreed to build MobilEye hardware into the company’s popular Model S vehicle. However, a year later, a fatal crash of a Model S led to the end of this relationship. Nonetheless, as of 2020, EyeQ had been integrated into more than 54 million vehicles.

In July 2016, Intel, BMW, and MobilEye announced they were forming a consortium to develop an open AV platform (Delphi/Aptiv, Magna, and Fiat Chrysler later joined this consortium). Since then, however, BMW has signed separate cooperation agreements with Germany’s Daimler and Bosch. For its part, Fiat Chrysler has also signed agreements with Waymo (see below) and Aurora (see the first article in this series).

A second AV consortium, called the Automotive Edge Computing Consortium, was formed in August 2017, led by Intel, Toyota, Denso, and communications firms NTT DoCoMo and Ericsson.

Intel/MobilEye Platform

At the 2017 Las Vegas Consumer Electronics Show (CES), Intel showed off its low-power, high-bandwidth, 5G-compatible GO AV development platform.

Intel used test AVs at its vehicle labs in San Jose, California and Chandler, Arizona to conduct deep learning for the platform’s algorithms. In addition, Intel introduced an associated GO software development kit (SDK) for carmaker OEMs. Intel also announced an agreement to co-develop stereo vision for AVs with Denso.

In mid-2017, Intel stated it would build a further 100 SAE Level 4 AVs for testing in Europe, the United States, and Israel. Later that year, Audi put Intel’s Programmable Solutions Group (PSG) processor and VxWorks OS at the heart of the 2018 A8, the world’s first production car to feature SAE Level 3 automation features. Waymo (see below) also began to integrate Intel components into its Waymo One AVs.

Intel and MobilEye have purposely designed their platform to employ separate end-to-end camera, radar, and LiDAR systems. Each system is independently capable of providing reliable and sufficient data prior to this data being fused.
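As a rough illustration of this redundancy principle, the sketch below shows each sensing modality producing its own complete object list before anything is merged. The class names, the simple distance-gating rule, and all values are hypothetical, not Intel/MobilEye code.

```python
# Minimal sketch of "independently sufficient" sensing: every modality runs its own
# end-to-end pipeline, and fusion only reconciles the resulting object lists.
# All names and the distance-gating rule below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # longitudinal distance ahead, meters
    y: float      # lateral offset, meters
    source: str   # which modality produced the detection

def fuse(camera, radar, lidar, gate=1.5):
    """Merge detections that agree to within `gate` meters; keep the rest,
    so an object missed by one modality can still be covered by another."""
    merged = []
    for det in camera + radar + lidar:
        duplicate = any(abs(m.x - det.x) < gate and abs(m.y - det.y) < gate
                        for m in merged)
        if not duplicate:
            merged.append(det)
    return merged

# Each pipeline's output is stubbed here; in practice each would be a full
# perception stack operating on its own sensor data.
camera = [Detection(42.0, 0.2, "camera")]
radar  = [Detection(41.6, 0.1, "radar")]
lidar  = [Detection(41.8, 0.3, "lidar"), Detection(12.5, -3.0, "lidar")]

print(fuse(camera, radar, lidar))  # one merged lead vehicle plus one LiDAR-only object
```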

REM, RoadBook, and RSS

MobilEye’s retrofittable 8 Connect and Road Experience Management (REM)-enabled Global RoadBook are used for crowdsourced environmental data collection and mapping.

Intel and MobilEye have made much of their Responsibility-Sensitive Safety (RSS) framework, a set of common-sense, mathematically formalized operating rules intended to keep their AV platforms from ever causing a road accident, and they argue this approach sets their platforms apart from competitors’. Intel has since made RSS an open standard for the AV industry. Semantic artificial intelligence (AI) is used for the companies’ ASIL-D-level AV platform decision-making.
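At its core, RSS reduces safe driving to closed-form margins that can be checked at runtime. The sketch below implements the published RSS minimum safe following distance between an ego vehicle and the car ahead; the parameter values chosen here are illustrative assumptions, not the settings Intel/MobilEye actually ship.

```python
# Sketch of the RSS minimum safe longitudinal (following) distance.
# Speeds in m/s, accelerations in m/s^2, rho = worst-case response time in seconds.
# Default parameter values below are illustrative, not production settings.
def rss_min_following_distance(v_rear, v_front, rho=0.5,
                               a_max_accel=3.5, a_min_brake=4.0, a_max_brake=8.0):
    """Assume the rear (ego) car may accelerate for rho seconds before braking at
    only its minimum rate, while the lead car brakes at its maximum rate."""
    v_rear_worst = v_rear + rho * a_max_accel
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_rear_worst ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)

# Both cars traveling at 25 m/s (~90 km/h): about 63 m of required gap
# with these example parameters.
print(round(rss_min_following_distance(25.0, 25.0), 1))
```

If the actual gap falls below this value, an RSS-style supervisor would require the ego vehicle to brake until the margin is restored.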

In 2018, MobilEye created something of a stir in the AV industry after ridesharing firm Uber (see below) experienced a fatal crash. MobilEye used Uber’s in-car footage of the accident to demonstrate how MobilEye’s platform could have prevented it. Since this incident, MobilEye and Intel have become known for their intense research into AV safety.

One of the things that Intel learned via its extensive testing is that exactly 37 driving scenarios account for more than 99 percent of all light-vehicle crashes in the United States. By teaching an AV platform how to approach each one of these scenarios, Intel could vastly improve safety practices. In Arizona, Intel is a founding member of the public-private Arizona Institute for Automated Mobility (AIM) consortium focused on AV safety, science, and policy.


2018 MobilEye/Intel Partnerships

At the 2018 Las Vegas Consumer Electronics Show (CES), Intel announced that BMW, Nissan, and Volkswagen would be adopting MobilEye REM/Roadbook technology for upcoming AVs. Further partnerships were announced with China’s SAIC Motor (for AVs) and Italy’s Ferrari (for deep learning).

In 2018, Intel/MobilEye, Volkswagen, and Israel’s Champion Motors (a VW auto distributor) signed an agreement to develop Israel’s first ride-hailing service using electric vehicles from Volkswagen as part of a mobility-as-a-service (MaaS) solution. The venture will be known as New Mobility in Israel, but is nicknamed “Pinta.” The MobilEye hardware and software platform will be at the heart of the service’s SAE Level 4 AVs. Full commercial operation with hundreds of vehicles is expected to begin in 2022.

In July 2018, an additional partnership was announced with China’s Baidu (see the first Asia article in this series). Baidu employed MobilEye’s RSS in its Apollo Pilot and Project Apollo AV platforms for Chinese carmaker OEMs.

2019 MobilEye/Intel Partnerships

At the 2019 Las Vegas CES, China’s ITS regulatory body agreed to use RSS as a basis for its AV safety standards. European auto tech supplier Valeo also agreed to contribute to RSS and cooperate with Intel on standards development. American standards body the Institute of Electrical and Electronics Engineers (IEEE) later approved a proposal to use RSS as a starting point for future AV safety standards.

Intel announced a tentative cooperative agreement with China’s Great Wall Motors to integrate MobilEye AV technology into Great Wall’s vehicles to enable SAE Level 3 and above autonomy. Going forward, Intel made it clear that SAE Level 4 and 5 autonomy would be reserved for car fleets such as ride-hailing or robo-taxi services—versus production cars—for the near future. The company had previously cited cost, lack of regulations, and geographic scale as factors for this reasoning.

Intel indicated it wanted to use RSS to augment emergency braking, with what it calls automatic preventative braking (APB). APB is just one system that’s designed to help AVs conform to a worldwide road-safety movement known as Vision Zero. Originally begun in Sweden, Vision Zero prescribes precise speed limits for specific stretches of road to reduce vehicular road accidents to an absolute minimum.

In another announcement at CES 2019, Intel/MobilEye also formalized a MaaS partnership with China’s Beijing Public Transport Company (BPTC) and Beijing Beytai that will use Intel technology for SAE Level 4 driverless public transport services.

In late 2019, Intel/MobilEye announced a cooperative agreement with China’s NIO to supply SAE Level 4 AV technology for future NIO electric vehicles. These vehicles would be both consumer production cars and MaaS robo-taxi fleet cars operated (initially) by MobilEye. Future partners for this latter application include Paris’ Régie Autonome des Transports Parisiens (RATP Group) and Daegu Metropolitan City in South Korea.

For now, MobilEye expects both robo-taxi services and data monetization to fund its future AV innovation and research.

NVIDIA

NVIDIA DRIVE AGX is a scalable, open autonomous vehicle computing platform that serves as the brain for autonomous vehicles.
(Source: NVIDIA)

Computer hardware specialty manufacturer NVIDIA has been working with AV and carmaker OEM firms for some time. As of 2020, Reuters reported that the company had more than 370 separate business customers for its AV platform products. Just a few of these customers include Audi, Daimler, ZF, and Volkswagen (see the EMEA region article in this series), Aurora (see the first article in this series), Uber ATG (see below), and Baidu, Volvo, and Toyota (see the Asia articles in this series).

NVIDIA Autopilot

In 2019, NVIDIA showed what it claimed was the first commercially available SAE Level 2+ AV open platform, called AutoPilot, at that year’s Las Vegas CES. The “+” denotes that the platform enables features that require deep learning, such as driver monitoring and surround perception.

As Rob Csongor, NVIDIA vice president of autonomous machines, states, “A full-featured, Level 2+ system requires significantly more computational horsepower and sophisticated software than what’s on the road today. NVIDIA AutoPilot provides these, making it possible for carmakers to quickly deploy advanced autonomous solutions by 2020 and to scale this solution to higher levels of autonomy faster.”

The AutoPilot platform integrates the NVIDIA Xavier high-speed, low-power system-on-a-chip (SoC) parallel processor and DRIVE AI software. These enable the platform to process deep neural networks (DNNs) for perception from camera sensors mounted on and in a vehicle. These DNNs include SignNet, DriveNet, OpenRoadNet, LaneNet, and WaitNet.

These DNNs help DRIVE to understand where other vehicles are, read lane markings, detect pedestrians and cyclists, distinguish between different types of lights and colors, recognize traffic signs, and understand complex driving environments.

In turn, a vehicle is then capable of AV functions such as highway merging, lane changing, lane splitting, and custom personal mapping. Also included is the DRIVE IX software for driver monitoring and alerting, AI copiloting, and in-cabin visualization of the platform’s computer vision.
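Purely to illustrate how such per-task networks might feed a single downstream planner, the sketch below aggregates their outputs into one world model. The network names come from the list above, but the data structures and wiring are hypothetical, not NVIDIA’s API.

```python
# Hypothetical wiring of per-task DNN outputs into one world model that planning
# functions (lane change, merge, etc.) can query. Not NVIDIA code; the network
# names are taken from the article, everything else is assumed for illustration.
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    vehicles: list = field(default_factory=list)         # e.g. DriveNet-style detections
    lanes: list = field(default_factory=list)            # e.g. LaneNet-style lane edges
    signs: list = field(default_factory=list)            # e.g. SignNet-style sign readings
    drivable_space: list = field(default_factory=list)   # e.g. OpenRoadNet-style output
    wait_conditions: list = field(default_factory=list)  # e.g. WaitNet-style intersections

def build_world_model(dnn_outputs: dict) -> WorldModel:
    """Aggregate each network's output so planners see one consistent picture."""
    return WorldModel(
        vehicles=dnn_outputs.get("DriveNet", []),
        lanes=dnn_outputs.get("LaneNet", []),
        signs=dnn_outputs.get("SignNet", []),
        drivable_space=dnn_outputs.get("OpenRoadNet", []),
        wait_conditions=dnn_outputs.get("WaitNet", []),
    )

def can_change_lane(world: WorldModel, target_lane: int, window_m: float = 30.0) -> bool:
    """Toy gate: allow a lane change only if no detected vehicle occupies the
    target lane within a fixed longitudinal window around the ego car."""
    return all(v["lane"] != target_lane or abs(v["x"]) > window_m
               for v in world.vehicles)
```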

DRIVE Constellation and AGX Pegasus

In addition, NVIDIA offers the DRIVE Constellation platform for data centers. A simulator server uses NVIDIA graphics processing units (GPUs) running Drive Sim software. A second vehicle server contains the Drive AGX computer, which processes simulated sensor data. Driving decisions from the vehicle server are sent to the simulator to realize timing-accurate, bit-accurate hardware-in-the-loop testing.
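The pattern behind that two-server setup is classic hardware-in-the-loop testing, illustrated very loosely below: one process synthesizes sensor data, the other returns driving commands, and both advance in fixed lockstep. All names and numbers here are assumptions for illustration, not the Constellation interfaces themselves.

```python
# Loose hardware-in-the-loop sketch: a simulator process and a "vehicle" process
# exchange sensor data and commands once per fixed time step. Illustrative only.
SIM_STEP_S = 0.01  # fixed step keeps both sides in lockstep (timing-accurate)

def simulator_server(world, commands):
    """Advance the simulated world one step and return synthetic sensor data."""
    world["t"] += SIM_STEP_S
    world["ego_speed"] += commands.get("accel", 0.0) * SIM_STEP_S
    return {"t": world["t"], "ego_speed": world["ego_speed"]}

def vehicle_server(sensors):
    """Stand-in for the in-vehicle computer: in a real rig this runs the same
    software as on the road, just fed simulated sensors instead of real ones."""
    target_speed = 20.0  # m/s
    return {"accel": 2.5 if sensors["ego_speed"] < target_speed else 0.0}

world, commands = {"t": 0.0, "ego_speed": 0.0}, {}
for _ in range(1000):                    # 10 simulated seconds
    sensors = simulator_server(world, commands)
    commands = vehicle_server(sensors)

print(round(world["ego_speed"], 1))      # ~20.0 m/s: the target speed was reached and held
```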

For robo-taxis, NVIDIA’s much faster DRIVE AGX Pegasus platform can be utilized to enable up to SAE Level 5 AV functionality.

NVIDIA’s DRIVE is the “brain” for AVs from Optimus Ride, a Boston-based firm founded by veterans of MIT’s electric vehicle and self-driving technology programs. Optimus Ride has a group of geofenced AV shuttle fleets operating in small areas, such as residential communities, industrial parks, school campuses, airports, mixed-use developments, and ports. The company currently operates in five American locales: Boston, MA’s Seaport District; South Weymouth, MA’s Union Point; Brooklyn, NY’s Navy Yard; Reston, VA’s Brookfield Halley Rise; and Fairfield, CA’s Paradise Valley Estates.

Other NVIDIA Partnerships

Some companies, such as Sweden’s Volvo (owned by China’s Geely), deploy all of NVIDIA’s AV tech components. Others, such as Germany’s Bosch, ZF, and Continental (see the EMEA articles in this series), deploy only some of them as pieces of their own AV platforms. A number of vendors also use NVIDIA technology for in-car personalization, energy management, and connectivity.

Since 2017, Bosch and German automaker Daimler have been working together with NVIDIA on a Pegasus-driven AV platform, with the original goal of producing SAE Levels 4 and 5 production vehicles for individual consumers. However, over time, both companies have realized that this goal may be unrealistic, at least in the short term.

Instead, the firms are setting up a Level 4 mobile app-accessed ride-hailing service in San Jose, California. This service will make use of lessons learned at Daimler’s 100,000-square-meter testing ground in Immendingen, Germany.

Eventually, it’s thought that as the ride-hailing service proves itself in California, Daimler can take the platform and roll it into a package aimed at third-party taxi fleet operators. In turn, it’s hoped that these operators will buy self-driving vehicles from Daimler.

Qualcomm

Qualcomm's on-demand hardware capabilities include Wi-Fi connectivity services, 4G/5G modem features, AI capabilities, C-V2X and more to offer auto manufacturers connectivity tools that meet consumer needs and support new business models.
(Source: Qualcomm)

Semiconductor powerhouse Qualcomm’s scalable AV platform is called Snapdragon Ride. The Snapdragon name denotes the company’s proprietary line of systems-on-a-chip (SoCs). These are highly energy-efficient, air-cooled neural processing engines with multiple hardware performance accelerators, large data pipes, and a seamless software development infrastructure.

Qualcomm already uses other Snapdragon chips for voice-controlled, AI-enhanced vehicle cockpit, infotainment, and wireless connectivity systems it sells to 19 of the largest 25 carmaker OEMs.

Qualcomm insists that its comprehensive AV platform is ready for SAE Level 4 and 5 autonomous applications. The company claims “human-like driving planner assertiveness” and “low cost of development [for OEMs].” Qualcomm makes use of the Blackberry QNX (see the first article in this series) OS for Safety and Hypervisor components.

“We’ve spent the last several years researching and developing our new autonomous platform and accompanying driving stack, identifying challenges, and gathering insights from data analysis to address the complexities automakers want to solve,” says Nakul Duggal, senior vice president of product management at Qualcomm.

While Qualcomm hasn’t yet announced any specific carmaker OEMs that will be making use of the Snapdragon Ride, it’s expected that the platform will be integrated into General Motors vehicles (perhaps including Cadillac models) in the near future.

Tesla

All new Tesla cars come standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future—through software updates designed to improve functionality over time.
(Source: Tesla)

Elon Musk, the enigmatic CEO and co-founder of Tesla, has said that by late 2020, his company would have as many as one million SAE Level 5 (fully autonomous) shareable—but individually owned—vehicles on the road, all using the firm’s proprietary Autopilot AV platform.

However, many industry observers consider Musk’s statement to be a wild exaggeration. Musk has since qualified his words by saying that fully autonomous operation would be on a case-by-case basis due to varying country and state regulations.

While Tesla has certainly been conducting extensive real-world AV testing, the status and refinement of its platform beyond what’s being built into present-year Tesla models are significant unknowns.

Another unknown is whether Tesla would share or license its Autopilot platform with other carmakers in the future. Autopilot currently delivers SAE Level 3 functionality (conditional automation) in 2020 model year Tesla vehicles.

In 2019, at an event Tesla calls Autonomy Day, the company claimed that the second-generation Full Self-Driving (FSD) computer component of its Autopilot was superior to NVIDIA’s formidable Pegasus platform (see above) because it runs on two chips, versus Pegasus’ one.

But speed-wise, the FSD is still not as fast as NVIDIA’s Pegasus. For now, at least, it seems these two companies are close to neck-and-neck performance-wise. (Tesla formerly used NVIDIA components before developing its own.)

Uber ATG

Uber's software interprets the world around the vehicle, predicts what actors will do next, and plans a way to safely navigate on‑road.
(Source: Uber)

Ridesharing firm Uber’s Advanced Technology Group (ATG) has its own proprietary AV platform. ATG was launched in 2015, with the bulk of its engineers drawn from Carnegie Mellon University’s Robotics Institute. The group has minority investments from Toyota, Japanese auto parts supplier Denso, and Japan’s SoftBank.

Setbacks

Uber originally aimed to make tech platforms for both autonomous cars and trucks. But after spending USD$925 million on ATG’s truck program, Uber shut it down amidst a trade secrets theft case involving a former employee of Waymo (see below).

By 2016, ATG had autonomous cars on the roads of Pittsburgh and Tempe, Arizona. In early 2018, the division suffered a setback when a pedestrian was killed by one of its AVs. By the end of the year, ATG had settled with the victim’s family and agreed to test AVs only at slower speeds and during daylight hours. Since then, ATG has added Washington, D.C. to its test cities.

But despite Uber reportedly spending up to USD$200 million per quarter on AVs, ATG chief scientist Raquel Urtasun has admitted that “self-driving cars are going to be in our lives; the question of when is not clear yet. To have [them] at scale is going to take a long time.”

A Hard Road Ahead?

Like Argo AI, the not-yet-profitable Uber has dialed back its ambitions: its stated goal is now to operate AV ridesharing services in a limited number of communities where it’s cost-effective to do so. This means the price of operation (including all of ATG’s operating and R&D costs) would have to be lower than the cost of its existing ridesharing business that uses human drivers.

According to an Uber spokeswoman, this scenario is still “a long way off” (consulting firm McKinsey has predicted it will come between 2025 and 2030). In the meantime, a mitigating factor for Uber could be superior luxury vehicles that ATG is sourcing from partner companies Volvo and Toyota.

While ATG is aiming for working SAE Level 4 functionality (“attention-free,” hands-off driving) for its platform, it isn’t there yet. Eventually, the company would like to have a mix of Level 3, Level 4, and Level 5 ridesharing AVs in the marketplace.

Analysts like Bernstein’s Mark Shmulik are concerned that “ATG [may be] throwing good money after bad. Even in my conversations with them, there seems to be almost a ‘we have to do it’ versus ‘we want to do it’ approach.” Shmulik is concerned that Uber arch-competitor Lyft may be faster to market with Level 4 AVs given Lyft’s partnerships with Aptiv (see the first EMEA article in this series) and Waymo (see below).

Visteon

Visteon's white-label Android app store allows car manufacturers to offer their own app store to recommend, sell, and distribute apps for their Android-based connected infotainment system.
(Source: Visteon)

Van Buren Township, Michigan-based Visteon has developed a scalable, modular, AI-driven hardware and software AV platform it calls DriveCore. DriveCore has three components: Compute, the in-vehicle hardware; Runtime, the middleware that runs onboard the vehicle; and Studio, a collaborative software development kit (SDK) used on the desktop.

The desktop-based Studio component is designed for continuous improvements to be made throughout the development life cycle. It allows for analysis of driving algorithms and performance before finalized solutions are uploaded to cloud storage. Once in cloud storage, they can be compared using customized parallel executions in virtual environments. Optimal algorithms can be securely deployed over-the-air (OTA) to the DriveCore Runtime component onboard a vehicle.
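A hypothetical sketch of that loop, in which candidates are developed locally, compared in parallel cloud runs, and the winner is pushed over-the-air, might look like the following. None of these function names are Visteon’s actual API.

```python
# Hypothetical develop -> compare-in-cloud -> deploy-OTA loop, loosely modeled on
# the workflow described above. Function names and the scoring scheme are assumptions.
from concurrent.futures import ThreadPoolExecutor

def score_candidate(algorithm, scenarios):
    """Replay recorded driving scenarios against one candidate and average its score."""
    return sum(algorithm(s) for s in scenarios) / len(scenarios)

def select_and_deploy(candidates, scenarios, deploy_ota):
    """Evaluate all candidates in parallel virtual runs, then hand the best one
    to an over-the-air deployment hook targeting the in-vehicle runtime."""
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(lambda alg: score_candidate(alg, scenarios), candidates))
    best = candidates[scores.index(max(scores))]
    deploy_ota(best)
    return best

# Toy usage: two candidate "algorithms" scored against numeric stand-in scenarios.
scenarios = [1.0, 2.0, 3.0]
candidates = [lambda s: s * 0.8, lambda s: s * 0.9]
winner = select_and_deploy(candidates, scenarios, deploy_ota=lambda alg: None)
```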

DriveCore can be utilized for SAE Autonomy Levels 2 through 5 functionality. It currently enables a number of highway driving processes such as automatic cruise control, automatic lane change, lane centering and overtaking. There’s also a valet parking mode. All of these functions incorporate path planning for obstacle and pedestrian detection and avoidance. DriveCore can read road signs and obey speed limits.

Visteon is currently working with Chinese automaker Guangzhou Automobile Group (GAC) and Chinese technology conglomerate Tencent to integrate and further develop DriveCore. It’s also in talks with other carmaker OEMs for future partnerships. Visteon currently supplies smart cockpit and infotainment systems for Jaguar Land Rover, Daimler, BMW, Groupe PSA (Peugeot/Citroën/Opel/Vauxhall), and others.

Voyage

Voyage's prediction engine uses a combination of advanced probability models, high-definition maps, and time-based behavior models to predict exactly what’s going to happen around Voyage vehicles.
(Source: Voyage)

Founded in 2016 by alumni from online learning venture Udacity, Voyage has attracted investment from the likes of Jaguar Land Rover and energy company Chevron.

With a fleet of just 50 retrofitted Ford and Chrysler vehicles, Voyage offers SAE Level 4 (hands-free) and 5 (fully driverless) autonomous ridesharing in locations in California and Florida.

At the moment, Voyage has limited itself to locales where its vehicles will have a maximum speed of 25 mph (footage from at least one of its AV platform demonstration videos has been sped up, so the cars appear to drive faster than this). For now, this translates to low-traffic retirement communities where there are few pedestrians and other cars to worry about. Low speeds allow Voyage’s AVs to make unprotected turns due to slower oncoming traffic.
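A back-of-the-envelope calculation (not Voyage’s method) shows why the 25 mph cap matters for unprotected turns: the same visible stretch of oncoming lane buys far more time before a car arrives. The 80-meter sight distance used below is an assumed figure.

```python
# Back-of-the-envelope only: time for an oncoming car to cover an assumed 80 m of
# visible lane at neighborhood versus arterial speeds.
MPH_TO_MS = 0.44704
SIGHT_DISTANCE_M = 80.0  # assumed visible stretch of the oncoming lane

for speed_mph in (25, 45):
    time_to_arrive = SIGHT_DISTANCE_M / (speed_mph * MPH_TO_MS)
    print(f"{speed_mph} mph oncoming traffic: {time_to_arrive:.1f} s to cover 80 m")
# 25 mph -> ~7.2 s, 45 mph -> ~4.0 s, so the low-speed setting leaves a far larger
# margin in which to complete an unprotected turn.
```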

Still, Voyage’s platform is quite sophisticated in terms of allowing AVs to traverse winding streets, odd-angle junctions, rotaries, and markedly non-highway road types.

The company hopes to build on real-world testing and SAE Level 4 driving experience it gains in these areas before ramping up its capabilities to include higher-speed-limit roads with more traffic density. Like Waymo (see below), the company has built its own 3D self-driving car simulator so it can extensively test out responses to various circumstantial road-based scenarios.

Waymo

Before Waymo's cars drive in any location, their team builds its own detailed three-dimensional maps that highlight information such as road profiles, curbs and sidewalks, lane markers, crosswalks, traffic lights, stop signs, and other road features.
(Source: Waymo)

As the market-leading AV company with the most extensive (10+ years) real-world driving record for its cars, Google spinoff Waymo is one of the most respected names in the industry. The company’s proprietary Waymo Driver AV platform was the first one capable of delivering SAE Level 5 autonomy, an achievement it demonstrated in 2017.

Since then, Waymo has rolled out a fleet of Waymo One robo-taxis that it currently operates in the suburbs of Phoenix, Arizona. As with Aptiv’s robo-taxi service in Las Vegas (see the first EMEA article in this series), some—but not all—of Waymo One’s taxis can be hailed using an app from Uber competitor Lyft.

Although they technically operate at SAE Level 5, most of Waymo One’s taxis have a backup driver in the vehicle. But since 2019, Waymo has been testing fully driverless vehicles traveling at speeds up to 45 mph on busy streets and executing maneuvers like unprotected left-hand turns.

Waymo claims its fifth-generation Waymo Driver platform can spot a pedestrian from half a kilometer away.

Waymo has even developed a protocol for situations where police or emergency services need to interact with one of its taxis that has neither passengers nor a driver.

Besides its robo-taxi service, Waymo also sells a key AV tech component, its USD$7,500 Laser Bear Honeycomb, a LiDAR sensor, to other firms (as long as they pledge not to compete with Waymo in its robo-taxi marketplaces).

As far as selling the Waymo Driver AV platform to carmaker OEMs, it remains to be seen if the company will go in this direction. For now, it seems more likely that Waymo will try to sell self-branded cars either alone or in partnership with a major automaker such as Fiat Chrysler, Jaguar Land Rover, or Groupe Renault. (All three companies have either supplied vehicles or signed cooperative agreements with Waymo recently.)
