Internet-Draft | draft-dong-remote-driving-usecase-00 | June 2022 |
Dong, et al. | Expires 29 December 2022 |
This document illustrates the use case of remote driving, which leverages a human driver's advanced perceptual and cognitive skills to assist autonomous driving when automation is absent or falls short. Specifically, the document analyzes the end-to-end latency that the network must deliver to support collision avoidance in remote driving. The document also summarizes the other necessary requirements that networking services need to support.¶
This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.¶
Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at https://datatracker.ietf.org/drafts/current/.¶
Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."¶
This Internet-Draft will expire on 29 December 2022.¶
Copyright (c) 2022 IETF Trust and the persons identified as the document authors. All rights reserved.¶
This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document. Code Components extracted from this document must include Revised BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Revised BSD License.¶
Autonomous vehicles (AVs) have made great progress in recent years. They rely on numerous well-placed sensors that continuously observe the location and movement of surrounding vehicles, conditions on the road, pedestrians, traffic lights, etc. An autonomous vehicle is controlled by its own central computer, which manipulates the steering, accelerator, and brake to achieve self-driving at different levels.¶
SAE International's standard "J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems" defines six Levels of Automation (LoA) [SAEJ3016]: full automation (Level 5), high automation (Level 4), conditional automation (Level 3), partial automation (Level 2), driver assistance (Level 1), and no automation (Level 0).¶
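For quick reference, the six levels can be captured as a simple enumeration. The sketch below is illustrative only; the class and constant names are our own, and the comments paraphrase the definitions in [SAEJ3016].¶

```python
# Illustrative enumeration of the SAE J3016 Levels of Automation.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human driver performs the entire driving task
    DRIVER_ASSISTANCE = 1       # a single assistance feature, e.g., adaptive cruise control
    PARTIAL_AUTOMATION = 2      # combined steering and speed control; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over upon request
    HIGH_AUTOMATION = 4         # no human fallback needed within a limited operating domain
    FULL_AUTOMATION = 5         # no human driver needed under any conditions

print(SAELevel(3).name)  # CONDITIONAL_AUTOMATION
```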
Although each vehicle manufacturer has been making its best effort to increase the level of automation, current automated vehicles by themselves only reach SAE Level 2 or 3. AVs may fall short in unexpected situations. In such cases, it is desirable for a human to operate the vehicle manually, through remote driving, to recover from the failure. Until autonomous technology becomes mature enough to reach Level 5, experts suggest that AVs be backed up by tele-operation.¶
The terms and abbreviations used in this document are listed below.¶
The above terminology is defined in greater detail in the remainder of this document.¶
Remote driving is a mechanism in which a human driver operates a vehicle from a distance over communication networks. Remote driving leverages the human driver's advanced perceptual and cognitive skills to assist autonomous driving when it falls short, and it overcomes many complex situations that computer vision or artificial intelligence cannot foresee or handle. Such situations and possible failures of autonomous driving include: (1) perception failure at night or under challenging weather conditions, e.g., low visibility due to fog, or lane markers covered by snow; (2) confusing or malfunctioning traffic lights, or traffic signs made unrecognizable by corrosion or graffiti; (3) confusing detour signs or complex instructions temporarily issued by police officers, which require extra knowledge of the local traffic and local construction work; (4) complex or confusing parking signs, which might be handwritten and hard for computers to interpret; parking might be allowed only on certain days of the week, or a parking lot might be reserved for certain types of vehicles. With remote driving added to the AV control loop, passengers can feel safe even when the automation falls short.¶
Remotely operated vehicles may also be of interest to personal transportation services. Vay, a Berlin-based startup [Vay], plans to debut a fleet of taxis controlled by remote teledrivers in 2022. The concept behind Vay is that when you order a Vay, one of its teledrivers navigates a vehicle to your pickup location. You then take control of the vehicle and drive to your destination, after which the teledriver takes control again and delivers the vehicle to the next nearby customer. In this model, remote driving is used only for vehicle delivery. This applies to the initial roll-out stage; in later stages, once the technologies are mature enough, the vehicle might be remotely driven by teledrivers with the customers on board. Vay promises that its system is safer than conventional driving because it controls the top four causes of fatal urban accidents: driving under the influence, speeding, distraction, and fatigue.¶
Remotely operated trucks could eliminate the threats to road safety and to driver/passenger safety caused by fleet-driver fatigue during long drives. Remotely operated vehicles are also particularly useful, compared to autonomous trucking [Tusimple], in situations where it would be hazardous or impossible for humans to operate, for example, construction vehicles at remote sites or emergency service vehicles in areas affected by chemical spills, active wildfires, or hurricane conditions.¶
A remotely controlled vehicle needs to transmit large volumes of data to the remote operation center, which might be located in an edge cloud or a central cloud. The data includes all the sensory feeds that the autonomous vehicle itself can collect. Signals from GPS (Global Positioning System) satellites can be combined with readings from tachometers, altimeters, and gyroscopes to provide more accurate positioning of the vehicle. Radar sensors monitor the positions of other vehicles nearby. Lidar (Light Detection and Ranging) sensors bounce pulses of light off the surroundings to identify lane markings and road boundaries. Ultrasonic sensors measure the position of objects that are very close to the vehicle. Video cameras continuously take pictures of the surroundings from different angles. This volumetric data is sent from the vehicle to the remote driving center to provide the remote driver with adequate perception of the environment, as sketched below. The remote driver can then issue appropriate instructions to help the autonomous vehicle resolve the issue.¶
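To make the data flow concrete, the sketch below shows one hypothetical shape for the per-frame payload uploaded to the remote operation center. All field names and types are assumptions made for illustration; this document does not define a wire format.¶

```python
# Hypothetical per-frame upload from the vehicle to the remote operation
# center, bundling the sensory feeds described above. Illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorFrame:
    timestamp_ms: int                # capture time; lets the center measure end-to-end latency
    gps_lat: float                   # GPS position, fused with tachometer/altimeter/gyroscope data
    gps_lon: float
    speed_mps: float                 # current vehicle speed, meters per second
    radar_tracks: List[dict] = field(default_factory=list)    # positions of nearby vehicles
    lidar_points: bytes = b""        # compressed point cloud: lane markings, road boundaries
    ultrasonic_cm: List[int] = field(default_factory=list)    # distances to very close objects
    camera_frames: List[bytes] = field(default_factory=list)  # encoded video from each angle
```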
In this section, we use the specific collision avoidance scenario in remote driving shown in Figure 1 to illustrate the support that the network and its protocols need to provide. Many similar use cases have already been specified in [TR22.885] and [TR22.886].¶
Given the current technologies in sensing, encoding, and decoding, together with the Best Effort (BE) service provided by the current Internet, the total round-trip delay between the time when the roadside camera captures a picture of a pedestrian at the crossing and the time when the self-driving car receives the signal to brake is around 250-400 ms. In addition, the remote driver's reaction time adds to the total latency, increasing the distance required for the vehicle to come to a stop. The detailed breakdown of the total latency is shown below:¶
The collision avoidance distance is proportional to the vehicle speed. For example, if the car is driving at 60 km/hour, the collision avoidance distance must be longer than 7 meters; in other words, the self-driving car must start to brake more than 7 meters away from the pedestrian. Table 1 shows the calculation of the collision avoidance distance based on the vehicle's speed and the current total latency.¶
If the vehicle is driving at a higher speed (e.g., 80 km/hour) and must start to brake at a shorter distance from the pedestrian (e.g., 4 meters), the total round-trip delay needs to be much shorter: since 80 km/hour equals 80/3600 meters per millisecond, the budget is 4/(80/3600) = 180 ms. Assuming that, with technology advancement, the total time needed for sensory image capture, framing and encoding, and decoding and display is reduced to 60 ms, then, with the remote driver's reaction time also taken into account, the total transmission time in the network cannot be longer than 20 ms. Within these 20 ms, the captured image or video data and other sensory data need to arrive at the remote server, and the command from the remote driver needs to reach the vehicle as well; the calculation is sketched in code after Table 1.¶
| Speed | Collision Avoidance Distance |
| --- | --- |
| 5 km/hour = 1.4 m/sec | 1.4 * 0.4 = 0.56 m |
| 30 km/hour = 8.4 m/sec | 8.4 * 0.4 = 3.36 m |
| 60 km/hour = 16.8 m/sec | 16.8 * 0.4 = 6.72 m |
| 80 km/hour = 22.3 m/sec | 22.3 * 0.4 = 8.92 m |
| 120 km/hour = 33.4 m/sec | 33.4 * 0.4 = 13.36 m |

Table 1: Collision avoidance distance at the current 0.4 s total round-trip latency¶
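The arithmetic behind Table 1 and the 80 km/hour example can be reproduced with a few lines of Python. This is a minimal sketch of the calculation only; note that Table 1 rounds the km/hour-to-m/sec conversion up to one decimal place, so its distances come out marginally larger than the exact values printed here.¶

```python
# Collision avoidance distance and latency budget, per the text above.

def avoidance_distance_m(speed_kmh: float, latency_s: float) -> float:
    """Distance traveled during the total round-trip latency."""
    return (speed_kmh / 3.6) * latency_s  # km/hour -> m/sec, then d = v * t

def total_latency_budget_ms(distance_m: float, speed_kmh: float) -> float:
    """Total latency allowed if braking must start distance_m before the pedestrian."""
    return distance_m / (speed_kmh / 3.6) * 1000.0

# Table 1: distances at the current 400 ms (0.4 s) total latency.
for speed in (5, 30, 60, 80, 120):
    print(f"{speed:>3} km/hour -> {avoidance_distance_m(speed, 0.4):5.2f} m")

# 80 km/hour example: braking must start 4 m away, so the total budget is
# 4 / (80/3600 m/ms) = 180 ms; after 60 ms of capture/encode/decode/display
# and the remote driver's reaction time, the text leaves no more than 20 ms
# for network transmission.
print(f"{total_latency_budget_ms(4, 80):.0f} ms")  # 180 ms
```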
The following requirements need to be supported by the networks:¶
This document requires no actions from IANA.¶
This document introduces no new security issues.¶