Could you ever truly relax as a passenger on a car journey if you knew you might need to take the wheel in the event of an emergency? This is the conundrum presented by the latest Level 3 automated lane-keeping system (ALKS) technology being developed by traditional car manufacturers such as Mercedes-Benz, Audi and Toyota.
Designed for use on highways at lower speeds of up to 60km/h, an ALKS enables a vehicle to drive itself in a single lane while retaining the ability to hand control back to the driver when required. The UK, US and Japanese governments are all pushing ahead with legalizing the use of ALKS, but the Insurance Institute for Highway Safety (IIHS) in the US is already warning of trouble ahead.
The IIHS has developed a new ratings program that evaluates the safeguards that vehicles with partial automation employ to help drivers stay focused on the road. To earn a good rating, systems will need to ensure that the driver’s eyes are directed at the road and their hands are either on the wheel or ready to grab it at all times.
Escalating alerts and appropriate emergency procedures when the driver does not meet those conditions will also be required.
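As a rough illustration only – the stages and time thresholds below are assumptions made for the sake of example, not values drawn from the IIHS program – an escalating-alert policy of the kind the ratings describe might be sketched in Python as follows:

# Illustrative sketch of an escalating driver-attention policy of the kind the
# IIHS criteria describe. Stage names and time thresholds are assumptions, not
# values from the ratings program.

def attention_response(seconds_inattentive: float) -> str:
    """Map how long the driver has looked away (or had hands off the wheel) to an action."""
    if seconds_inattentive < 3:
        return "monitor"                  # eyes on road, hands on or near the wheel
    if seconds_inattentive < 8:
        return "visual_alert"             # icon or message in the instrument cluster
    if seconds_inattentive < 15:
        return "audio_and_haptic_alert"   # chimes plus seat or steering-wheel vibration
    return "emergency_procedure"          # e.g. slow the vehicle and summon assistance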
“Even when drivers understand the limitations of partial automation, their minds can still wander,” explains IIHS research scientist Alexandra Mueller. “As humans, it’s harder for us to remain vigilant when we’re watching and waiting for a problem to occur than it is when we’re doing all the driving ourselves.”
Strict ODD
Mercedes-Benz became the first automotive company in the world to meet the demanding legal requirements of UN-R157 for a Level 3 system, when the German Federal Motor Transport Authority approved its Drive Pilot at the end of 2021. (UN-R157 is a United Nations technical regulation put in place to deliver Level 3 autonomous technology on public roads, subject to national legislation.)
The Drive Pilot technology will debut on the S-Class in the first half of 2022, enabling customers to drive hands-free in heavy traffic or congested situations on suitable stretches of German highway. The system’s operational design domain (ODD) is limited to use on highways up to speeds of 60km/h, with at least two lanes of traffic in each direction and no intersections. These areas are defined by a high-definition map and geofence, which ensures the feature can be operated only within designated areas.
Drive Pilot’s ODD is also restricted to machine-detectable lane markings and the absence of tunnels, toll booths and traffic control devices such as traffic lights. Certain weather conditions – heavy rain, snowstorms, icy conditions and heavy fog – as well as adverse traffic conditions – such as temporary roadworks or emergency vehicles needing to pass – also fall outside the ODD.
These conditions are detected by several dedicated sensors, whose performance is continuously monitored to determine whether conditions are interfering with Drive Pilot’s ability to perceive the operating environment. When adverse conditions are detected, the system is prevented from activating in the first place or, if already engaged, issues a request for the human driver to take back the wheel.
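Conceptually, the ODD acts as a gate on activation. The sketch below is purely illustrative – the field names, checks and thresholds are assumptions based on the conditions described above, not Mercedes-Benz’s implementation:

# Minimal sketch of an ODD (operational design domain) gate of the kind the
# article describes for Drive Pilot. Field names and thresholds are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class DrivingContext:
    speed_kph: float
    on_geofenced_highway: bool      # from the HD map and geofence
    lanes_each_direction: int
    lane_markings_detected: bool
    adverse_weather: bool           # heavy rain, snow, ice, heavy fog
    roadworks_or_emergency: bool    # temporary roadworks, passing emergency vehicles

def odd_satisfied(ctx: DrivingContext) -> bool:
    return (ctx.speed_kph <= 60
            and ctx.on_geofenced_highway
            and ctx.lanes_each_direction >= 2
            and ctx.lane_markings_detected
            and not ctx.adverse_weather
            and not ctx.roadworks_or_emergency)

def supervise(system_active: bool, ctx: DrivingContext) -> str:
    if odd_satisfied(ctx):
        return "continue" if system_active else "allow_activation"
    # Outside the ODD: block activation, or hand control back to the driver.
    return "request_takeover" if system_active else "block_activation"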
Driver monitoring
What if the driver is distracted at such a moment? Currently, no technology can determine whether someone’s mind is focused on driving. However, technology can monitor a person’s gaze, head posture or hand position to ensure these are consistent with someone who is actively engaged in driving.
Under the German legislation for Level 3, the driver can turn away from the driving task and, for example, read emails on the multimedia interface (MMI). However, the driver must remain sufficiently aware of the driving process to take over the task of driving again if they notice errors or malfunctions in the Level 3 system.
Audi says it monitors drivers using what it calls ‘driver availability detection’. This system uses a camera in the upper part of the instrument panel to analyze different criteria such as position and motion of the head as well as blinking. The aim is to determine whether the driver is prepared to take over the steering again if necessary.
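In outline, such a check reduces to a few signals extracted from the cabin camera. The following sketch illustrates the idea only; it is not Audi’s algorithm, and the angle and eye-closure thresholds are invented for the example:

# Illustrative sketch of camera-based driver availability detection as the
# article describes it (head position and motion, blinking). Thresholds are
# assumptions, not Audi's values.

def driver_available(head_yaw_deg: float,
                     head_pitch_deg: float,
                     eyes_closed_s: float) -> bool:
    """Return True if the driver appears ready to take back the wheel."""
    facing_forward = abs(head_yaw_deg) < 30 and abs(head_pitch_deg) < 20
    alert = eyes_closed_s < 1.5   # prolonged eye closure suggests drowsiness
    return facing_forward and alert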
A datalogger is used to clarify who was in charge of driving in the event of an accident. It records the transfers of control between driver and car while the automated driving system is active.
The information is stored in the logger for a defined period. At the same time, the datalogger records various measurement variables in situations involving near or actual contact with other vehicles, or in which certain acceleration thresholds are exceeded – such as when an airbag deploys or automatic emergency braking (AEB) is triggered.
An Audi spokesperson says, “The rights and obligations of the driver at the respective levels of autonomy must be clearly communicated and must be clear to the driver. All these aspects have an impact on liability. The measurement variables are continually written to a ring memory several seconds in length in the control unit. The data allows no inferences about the identities of people or vehicles, on the basis of faces or license plate numbers, for example.”
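The ring memory the spokesperson refers to can be pictured as a fixed-length buffer that is continuously overwritten, with event-triggered snapshots retained for a defined period. The sketch below is illustrative only; the buffer length, sample rate and event handling are assumptions rather than Audi’s specification:

# Minimal sketch of a ring-memory datalogger: a buffer a few seconds long that
# is continuously overwritten, plus event-triggered snapshots that are kept.
# Buffer length, sample rate and event names are illustrative assumptions.
from collections import deque
import time

SAMPLE_RATE_HZ = 50
BUFFER_SECONDS = 5

class DriveLogger:
    def __init__(self):
        self.ring = deque(maxlen=SAMPLE_RATE_HZ * BUFFER_SECONDS)
        self.snapshots = []   # persisted records, retained for a defined period

    def sample(self, automation_active: bool, speed_kph: float, accel_ms2: float):
        """Called at the sample rate; old samples fall out of the ring automatically."""
        self.ring.append((time.time(), automation_active, speed_kph, accel_ms2))

    def on_event(self, reason: str):
        """Freeze the ring on events such as a control transfer, AEB or airbag deployment."""
        self.snapshots.append({"reason": reason, "data": list(self.ring)})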
Driver alerts
It must also be clear to the driver when he or she is expected to take back full control.
Drivers can be monitored in a variety of ways: through contact or torque sensors on the steering wheel, or through visual inspection with eye tracking. Internal cameras can monitor the driver and process what they are doing by analyzing gestures, eye positions and movements.
Audi uses a combination of optical, acoustic and haptic warnings in its autonomous vehicle development models, working alongside the driver availability detection system described above to establish whether the driver is prepared to take over the steering again if necessary.
Mercedes uses a multimodal warning strategy with Drive Pilot to alert drivers to the need to take back control of the vehicle. This includes a red visual display and audio and haptic cues to reinforce the need for the driver to take over.
If the driver fails to do so within 10 seconds of the takeover request, Drive Pilot will automatically bring the vehicle to a controlled stop and turn on the car’s hazard lights. During this period of an active takeover request, Drive Pilot maintains its crash avoidance capabilities through steering or braking.
If the driver still fails to respond once the vehicle has come to a stop, the system will automatically place an emergency call and unlock the doors to prepare the vehicle for emergency assistance.
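Put together, the takeover sequence amounts to a simple escalation policy: warn, wait up to roughly 10 seconds, then perform a controlled stop and summon help. The sketch below illustrates that flow; the state names and exact sequencing are assumptions, not Mercedes-Benz’s implementation:

# Sketch of the takeover-request escalation the article describes for Drive
# Pilot. Action names and sequencing are illustrative assumptions.
TAKEOVER_TIMEOUT_S = 10.0

def takeover_step(seconds_since_request: float, driver_took_over: bool) -> list[str]:
    if driver_took_over:
        return ["hand_control_to_driver"]
    if seconds_since_request < TAKEOVER_TIMEOUT_S:
        # Multimodal warnings while crash-avoidance steering and braking stay active.
        return ["red_visual_warning", "audio_warning", "haptic_warning"]
    # Driver unresponsive: bring the car to a controlled stop and prepare for assistance.
    return ["controlled_stop", "hazard_lights_on", "emergency_call", "unlock_doors"]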
Liability concerns
The scope and ability of the L3 system to hand back control to the driver is particularly important when considering where the liability for an incident lies. A recent joint legal report from the UK Law Commission concluded that drivers of autonomous vehicles would not be liable in the event of a collision. This puts pressure on vehicle manufacturers and software developers to monitor human behavior and develop failsafes for events that cannot be predicted.
Sergio Savaresi, full professor in automatic control and head of the Move research lab at the Politecnico di Milano in Italy, says, “There is a general concern among technicians that with the increasing sophistication of Level 3 technology, humans will progressively lose their ability to drive.”
When drivers get their license, they have limited experience and then build up knowledge over thousands of miles. The fear is that drivers who do not routinely engage in the act of driving will, in effect, be reduced to freshly licensed drivers who are nonetheless expected to take control in a critical situation.
“This is a clear contradiction,” asserts Savaresi. “There is the view from some in the industry that there will be a leapfrog over Level 3 or this transitional stage in order to remove the element where control is handed back.”
Savaresi says developers are working in a gray area with regard to driver re-engagement, but there is a general consensus that humans need around 10 seconds to take back control. That is a lifetime in terms of distance covered in an emergency: even at 60km/h, a vehicle travels roughly 167m in 10 seconds. “Drivers can’t take back control in a fraction of a second,” he says. “You must have time to disengage from what you were doing, you need to pay attention, assess the situation and make a decision. It’s clear that a good Level 3 system should be capable of a safe stop in any critical situation where a driver was not able to take back control. Even a professional driver in some cases would not be able to manage. This is defined as an emergency maneuver. There is still no seriously developed emergency maneuver module on the market right now.”
Who’s responsible?
The authorization of Level 3 technology to be used in Germany came with no changes to the regulations on product liability. It clearly stipulates, as before, that if a Level 3 system is activated and an accident occurs due to a malfunction of this system, the vehicle manufacturer may be held liable under product liability laws.
In the case of a Level 3 vehicle with a driver, it will also be necessary to check whether the driver has made a mistake and can be prosecuted accordingly.
Determining beyond doubt who will ultimately have to pay for any damages or otherwise be legally liable will require answers to a whole host of questions, such as: Was an autonomous system active? In what mode and at what level was the vehicle driving? Did the driver make a mistake?
For this reason, consideration is already being given to what technical solutions for fault analysis and accident tracking could look like. One option is to use onboard telematics to help determine the source of the error. This hardware would record the course of events immediately before and after an accident, to establish the cause beyond doubt.
Uta Klawitter, head of general counsel legal services at Audi, is currently working with her team to address the legal parameters of autonomous driving. Audi’s SocAIty study, which investigated societal acceptance of autonomous vehicle technology, proposes that moving forward, every person who chooses to ‘drive’ or own an autonomous vehicle will bear a high level of responsibility and, in the event of doubt, will have to accept liability for their own mistakes.
This is an important factor for the industry as a whole. If all responsibility for liability is determined to lie with manufacturers, fleet operators and other companies, Audi says this might reduce their willingness to develop and bring new technologies to market.
Klawitter concludes, “Liability cannot be one-sided at the expense of one party. Both sides must be incentivized to behave with care. Equally, this provides the motivation for manufacturers to continue researching and regularly bringing new innovations to market.”
This feature was first published in the April 2022 edition of Autonomous Vehicle International.