A remote computer system may receive data associated with an autonomous vehicle traversing an environment along a route in accordance with a planned trajectory, receive data associated with an event within the environment, and cause a display to present a representation of the autonomous vehicle. The remote computer system receives a request to generate an intermittent stopping action message comprising one or more of a position or orientation, a period of time, or a condition, and transmits the message to the autonomous vehicle. The autonomous vehicle is configured to move to the position or orientation for the period of time or until the condition is met, such that the vehicle moves to or is at the position or orientation before the event occurs.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
Techniques for segmenting and classifying a representation of aggregated sensor data from a scene are discussed herein. Sensor data may be collected during multiple traversals of a same scene, and the sensor data may be filtered to remove portions of the sensor data not relevant to road network maps. In some examples, the filtered data may be aggregated and represented in voxels of a three-dimensional voxel space, from which an image representing a top-down view of the scene may be generated, though other views are also contemplated. Operations may include segmenting and/or classifying the image, e.g., by a trained machine-learned model, to associate class labels indicative of map elements (e.g., driving lane, stop line, turn lane, and the like) with segments identified in the image. Additionally, techniques may create or update road network maps based on segmented and semantically labeled image(s) of various portions of an environment.
G06T 7/70 - Determining position or orientation of objects or cameras
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
G06V 10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
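The top-down aggregation step described above can be sketched roughly as follows; the voxel size, grid dimensions, and per-column point counting are illustrative assumptions, not the patented pipeline:

```python
import numpy as np

def top_down_image(points, voxel_size=0.5, grid_dim=4):
    """Aggregate 3D points into a voxel grid and collapse the height
    axis to form a top-down (bird's-eye) occupancy-count image."""
    # Integer voxel coordinates for each (x, y, z) point.
    idx = np.floor(points / voxel_size).astype(int)
    # Keep only points that fall inside the grid extents.
    mask = np.all((idx >= 0) & (idx < grid_dim), axis=1)
    idx = idx[mask]
    # Count points per (x, y) column, summing over z for the top-down view.
    image = np.zeros((grid_dim, grid_dim), dtype=int)
    np.add.at(image, (idx[:, 0], idx[:, 1]), 1)
    return image

pts = np.array([[0.1, 0.1, 0.2], [0.2, 0.3, 1.9], [1.2, 0.1, 0.5]])
img = top_down_image(pts)
```

The two points sharing an (x, y) column land in the same image cell regardless of height, which is what makes the collapsed view usable for 2D segmentation.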
This disclosure relates to detecting and calibrating out noise from lidar data, including noise caused by solar radiation and/or other lidar background noise sources. A lidar system may determine a background noise level for a location by sampling using an analog-to-digital converter (ADC) during a time window associated with a lidar pulse. The ADC sample may be compared to additional ADC samples performed during additional time periods between lidar pulses. The ADC samplings may be analyzed to determine the lidar background noise level of the environment, and the lidar data may be modified to calibrate out the background noise. In some examples, the lidar system itself may be reconfigured based on the lidar background noise, including modifying the laser transmit power, aperture size, optical gain, and/or other features of the lidar system to calibrate out the background noise and/or improve the signal-to-noise ratio (SNR) of the lidar data.
G01S 7/483 - Details of pulse systems
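A minimal sketch of the background-noise calibration idea: estimate the noise floor from ADC samples taken between lidar pulses, then subtract it from in-window samples. The sample values and the simple mean/subtract scheme are illustrative assumptions:

```python
def estimate_noise_floor(inter_pulse_samples):
    # Background level = mean ADC reading taken between lidar pulses.
    return sum(inter_pulse_samples) / len(inter_pulse_samples)

def calibrate_returns(pulse_window_samples, noise_floor):
    # Subtract the estimated background from in-window samples, clamping at zero.
    return [max(s - noise_floor, 0.0) for s in pulse_window_samples]

noise = estimate_noise_floor([4.0, 6.0, 5.0])
clean = calibrate_returns([5.0, 30.0, 7.0], noise)
```

A real system might also use the estimated floor to retune transmit power or optical gain, per the abstract, rather than only correcting the data.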
Systems and techniques are described for determining a sideslip vector for a vehicle, which may have a direction different from that of the vehicle's heading vector. The sideslip vector in the current vehicle state and sideslip vectors in predicted vehicle states may be used to determine paths for a vehicle through an environment and trajectories for controlling the vehicle through the environment. The sideslip vector may be based on a vehicle position at the center point of the wheelbase of the vehicle and may include lateral velocity, facilitating control of four-wheel-steered vehicles while maintaining the ability to control two-wheel-steered vehicles.
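A sideslip vector at the wheelbase center can be illustrated with a small sketch; the velocity components and the atan2-based angle convention are assumptions for illustration:

```python
import math

def sideslip(v_lon, v_lat):
    """Sideslip angle (rad) and speed magnitude at the wheelbase center,
    from longitudinal and lateral velocity components. A nonzero lateral
    component means the velocity direction differs from the heading."""
    angle = math.atan2(v_lat, v_lon)
    speed = math.hypot(v_lon, v_lat)
    return angle, speed

# 10 m/s forward with 1 m/s of lateral drift -> small positive sideslip.
angle, speed = sideslip(10.0, 1.0)
```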
Techniques for determining a safety metric associated with a vehicle controller are discussed herein. To validate safe operation of a system, a simulation may be executed including determining a relative location of a simulated object within the simulation with respect to a location of a simulated vehicle, determining, based on the relative location of the simulated object, an adjusted location of the simulated object within the simulation, controlling, by the autonomous vehicle controller and based on the relative location of the simulated object, the simulated vehicle to follow a trajectory within the simulation, and performing a collision check between the simulated vehicle and the simulated object at the adjusted location. The safety metric associated with the autonomous vehicle controller may then be determined based at least in part on an outcome of the collision check.
Techniques are discussed herein for determining optimal driving trajectories for autonomous vehicles in complex multi-agent driving environments. A baseline trajectory may be perturbed and parameterized into a vector of vehicle states associated with different segments (or portions) of the trajectory. Such a vector may be modified to ensure the resultant perturbed trajectory is kino-dynamically feasible. The vectorized perturbed trajectory, along with a representation of the current driving environment and additional agents, may be input into a prediction model trained to output a predicted future driving scene. The predicted future driving scene, including predicted future states for the vehicle and predicted trajectories for the additional agents in the environment, may be evaluated to determine costs associated with each perturbed trajectory. Based on the determined costs, an optimization algorithm may determine subsequent perturbations and/or the optimal trajectory for controlling the vehicle in the driving environment.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for determining a vehicle trajectory that causes a vehicle to navigate in an environment relative to one or more objects are described herein. For example, the techniques may include a computing device determining a decision tree having nodes to represent different object intents and/or nodes to represent vehicle actions at a future time. A tree search algorithm can search the decision tree to evaluate potential interactions between the vehicle and the one or more objects over a time period, and output a vehicle trajectory for the vehicle. The vehicle trajectory can be sent to a vehicle computing device for consideration during vehicle planning, which may include simulation.
A modified Kalman filter may include one or more neural networks to augment or replace components of the Kalman filter in such a way that the human interpretability of the filter's inner functions is preserved. The neural networks may include a neural network to account for bias in measurement data, a neural network to account for unknown controls in predicting a state of an object, a neural network ensemble that is trained differently based on different sensor data, a neural network for determining the Kalman gain, and/or a set of Kalman filters including various neural networks that determine independent estimated states, which may be fused using Bayesian fusion to determine a final estimated state.
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
Relocating and/or re-sizing map elements using an updated pose graph without introducing abnormalities to the map data may comprise determining a transformation between a source node of a first pose graph and a target node of a second pose graph and determining a modification to a map element based at least in part on the transformation. The techniques may include determining a stress on the map based at least in part on one or more modifications to map elements and determining if the stress meets or exceeds a threshold. In instances where the stress meets or exceeds a threshold, a modification may be altered, reversed, and/or indicated in a notification transmitted to a user interface.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
Systems and techniques for determining deceleration controls to use in a trajectory for use in stopping a vehicle are described. A deceleration determination system may receive a trajectory from a trajectory determination system and determine, based on various deceleration parameters, the appropriate controls to configure in a longitudinal profile of the trajectory and the suitable implementation points for implementing the controls. The deceleration determination system may determine deceleration data for various types of trajectories that a vehicle computing system may select from for operating the vehicle based on current conditions.
B60W 30/095 - Predicting travel path or likelihood of collision
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
Systems and techniques for determining a trajectory for use in controlling a vehicle are described. A trajectory determination system may generate a variety of trajectories for potential use in controlling a vehicle, including a maximum braking trajectory that enables the maximum application of the vehicle's brakes. A vehicle computing system may determine a distance between the vehicle and an obstacle, along with stopping distances for the various trajectories, and implement the maximum braking trajectory after determining that the stopping distance for that trajectory is the same as, but not substantially greater than, the distance between the vehicle and the obstacle.
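The stopping-distance comparison can be sketched with the basic kinematic relation d = v²/(2a); the candidate deceleration values and the gentlest-first selection rule are illustrative assumptions:

```python
def stopping_distance(speed, decel):
    # Kinematic stop distance: d = v^2 / (2 * a).
    return speed * speed / (2.0 * decel)

def select_deceleration(speed, distance_to_obstacle, decels):
    """Pick the gentlest deceleration that still stops short of the
    obstacle; fall back to the strongest (maximum braking) otherwise."""
    for a in sorted(decels):  # try gentle braking first
        if stopping_distance(speed, a) <= distance_to_obstacle:
            return a
    return max(decels)

# At 10 m/s with an obstacle 20 m ahead: 2.0 m/s^2 needs 25 m (too far),
# so the 4.0 m/s^2 option (12.5 m) is chosen.
chosen = select_deceleration(10.0, 20.0, [2.0, 4.0, 8.0])
```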
A remote operations system receives a request for remote operator assistance and adds the request to a queue of additional requests. The queue may be ordered based on time of receipt, priority, criticality, and the like. The remote operations system determines a remote operator of a set of remote operators to provide a response to the request in the queue based at least in part on one or more of a status of the remote operator (e.g., indicative of availability, whether they are in training, etc.), criteria associated with the remote operator (e.g., skills in responding to various requests, preferences for a geographic area, mission types, etc.), and information associated with the request received from the vehicle (e.g., mission type, sensor data, messages, vehicle status, etc.). If the request is not accepted in a threshold period of time, the request may be rerouted to an additional remote operator.
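One plausible way to order such a request queue is a min-heap keyed on priority and receipt time; the field names and the lower-number-is-more-critical convention are assumptions:

```python
import heapq

class RequestQueue:
    """Min-heap of assistance requests ordered by (priority, receipt time);
    ties in priority fall back to first-come, first-served."""
    def __init__(self):
        self._heap = []

    def add(self, priority, receipt_time, request_id):
        heapq.heappush(self._heap, (priority, receipt_time, request_id))

    def next_request(self):
        # Pop the most critical (then oldest) request for the next operator.
        return heapq.heappop(self._heap)[2]

q = RequestQueue()
q.add(priority=2, receipt_time=100, request_id="routine-reroute")
q.add(priority=1, receipt_time=105, request_id="blocked-lane")
first = q.next_request()  # the critical request is served despite arriving later
```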
Techniques for determining that a reference trajectory is free of intersections or potential collisions with objects are discussed herein. Trajectories generated by a primary computing device of a vehicle can be utilized to select or determine a reference trajectory to be followed by the vehicle. A secondary computing device of the vehicle can identify a current offset from the reference trajectory and utilize the offset with a kinematics model to determine a tracker trajectory that the vehicle is predicted to drive to return to the reference trajectory. Validation of the reference trajectory may be based on predicted collision data determined using the tracker trajectory. The predicted collision data can be utilized to control the vehicle to follow the reference trajectory or a stop trajectory.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for controlling a vehicle are described herein. A system may receive a route to navigate from a start position to an end position in an environment. The system may receive map data based on the route and determine a lattice based on the map data. The lattice comprises nodes and connections therebetween. The nodes may represent various states of the vehicle. The connections may represent various feasible transitions between the nodes. The lattice may further comprise a set of connections representing a trajectory from the start position to the end position. The system may receive sensor data representing an object in the environment and determine a state of the object based on the sensor data. The system may modify, based on the object state and as an updated cost, a precomputed cost associated with the trajectory. The system may further control the vehicle based on the updated cost.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
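Updating a precomputed trajectory cost from a detected object's state can be sketched with per-connection penalties; the connection identifiers and penalty values are illustrative:

```python
def updated_trajectory_cost(precomputed_cost, connections, object_penalties):
    """Adjust a precomputed trajectory cost by adding a penalty for each
    lattice connection on the trajectory that a detected object affects."""
    return precomputed_cost + sum(object_penalties.get(c, 0.0) for c in connections)

base = 12.0                                  # precomputed, object-free cost
trajectory = ["n0->n1", "n1->n2", "n2->n3"]  # connections along the route
penalties = {"n1->n2": 5.0}                  # an object affects this connection
cost = updated_trajectory_cost(base, trajectory, penalties)
```

Keeping the base cost precomputed and only adding object-dependent terms at runtime mirrors the abstract's point: most of the lattice cost need not be recomputed as objects move.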
Techniques for determining right of way through an intersection are discussed herein. Routes through the intersection may be associated with respective priorities. The route associated with an inbound lane devoid of yield or stop markers may be determined as being associated with the highest priority. The hierarchy of the other priorities may be organized based on the number of times the routes associated with each respective priority intersect the route associated with the highest priority. The routes and the priorities are saved in a data structure, and the data structure is transmitted to an autonomous vehicle for controlling the autonomous vehicle through the intersection.
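The priority ordering can be sketched as a sort keyed on crossing counts against the protected route; the route names and the stand-in crossing table are hypothetical:

```python
def crossing_count(route, protected):
    # Hypothetical geometry check; a lookup table stands in for it here.
    table = {("left-turn", "through"): 2, ("right-turn", "through"): 0}
    return table.get((route, protected), 0)

def rank_routes(routes, protected_route):
    """Order routes through an intersection: the protected route (inbound
    lane with no yield/stop marker) first, then the rest by how many
    times they cross the protected route (fewer crossings rank higher)."""
    others = sorted(
        (r for r in routes if r != protected_route),
        key=lambda r: crossing_count(r, protected_route),
    )
    return [protected_route] + others

order = rank_routes(["through", "left-turn", "right-turn"], "through")
```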
Techniques for improving operational decisions of an autonomous vehicle are discussed herein. In some cases, a system may generate reference graphs associated with a route of the autonomous vehicle. Such reference graphs can comprise precomputed feature vectors based on grid regions and/or lane segments. The feature vectors are usable to determine scene context data associated with static objects to reduce computational expenses and compute time.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06T 11/20 - Drawing from basic elements, e.g. lines or circles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
Techniques for validating or determining trajectories for a vehicle are discussed herein. A trajectory management component can receive status and/or error data from other safety system components and select or otherwise determine safe and valid vehicle trajectories. A perception component of a safety system can validate a trajectory upon which the trajectory management component can wait before selecting a vehicle trajectory, validate trajectories stored in a queue, and/or utilize kinematics for validation of trajectories. A filter component of the safety system can filter out objects based on trajectories stored in a queue. A collision detection component of the safety system can determine collision states based on trajectories stored in a queue or determine a collision state upon which the trajectory management component can wait before selecting or otherwise determining a vehicle trajectory.
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
Techniques for representing a scene or map based on statistical data of captured environmental data are discussed herein. In some cases, the data (such as covariance data, mean data, or the like) may be stored as a multi-resolution voxel space that includes a plurality of semantic layers. In some instances, individual semantic layers may include multiple voxel grids having differing resolutions. Multiple multi-resolution voxel spaces may be merged or aligned to generate combined scenes based on detected voxel covariances at one or more resolutions.
G06T 7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
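Indexing the same point into per-layer voxel grids of differing resolutions might look like this sketch (the resolution values are illustrative):

```python
def voxel_index(point, resolution):
    # Integer cell coordinates of a 3D point at the given voxel edge length.
    return tuple(int(c // resolution) for c in point)

def multi_resolution_indices(point, resolutions=(0.25, 1.0, 4.0)):
    """Index one point into voxel grids of differing resolutions, as in a
    multi-resolution voxel space with multiple grids per semantic layer."""
    return {res: voxel_index(point, res) for res in resolutions}

layers = multi_resolution_indices((3.1, 0.6, 1.2))
```

Coarse layers let alignment start with large, stable statistics and refine at finer resolutions, consistent with the covariance-based merging the abstract describes.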
Techniques for determining a probability that a first sensor is miscalibrated with respect to a second sensor are discussed herein. For example, a computing device may receive calibrated extrinsics of a camera to a lidar, determine a plurality of sets of perturbed extrinsics based on the calibrated extrinsics, determine respective costs for perturbed extrinsics of the plurality of sets of perturbed extrinsics based on image data captured by the camera, the plurality of sets of perturbed extrinsics, and lidar data captured by the lidar, and determine a local maxima score for the calibrated extrinsics based at least in part on the respective costs for the perturbed extrinsics of the plurality of sets of perturbed extrinsics and a cost of the calibrated extrinsics. The computing device may then determine a probability that the camera is miscalibrated based on a Bayes probability and the local maxima score.
Systems and techniques for determining a buffer region for use in controlling a vehicle and avoiding collisions are disclosed herein. A predicted region of travel of a vehicle front bumper may be determined. The position of the front bumper may be determined at points along a center curve of the predicted region of travel and polygons may be determined for the positions. The polygons may be joined and modified using a convex shape-based algorithm to determine a convex polygonal buffer region that is used in collision detection.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
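Joining the per-position bumper polygons into one convex buffer can be illustrated with a standard monotone-chain convex hull; this is a generic hull routine standing in for the patent's convex shape-based algorithm, and the sample polygon vertices are hypothetical:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns hull vertices in
    counter-clockwise order, dropping interior and collinear points."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product of OA x OB; positive for a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Vertices of two overlapping bumper polygons collapse to one convex region.
buffer_region = convex_hull([(0, 0), (2, 0), (2, 1), (0, 1), (1, 0.5), (3, 0.5)])
```

A convex buffer keeps downstream collision checks cheap, since point-in-convex-polygon tests are linear in the vertex count.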
Inaccurate sensor data, such as that caused by reflections, may skew ground profile estimations that identify a profile or plane associated with a surface, such as a roadway. Correcting a ground profile that was based on inaccurate sensor data may include determining a location at which a slope of the ground profile meets or exceeds a maximum slope, determining a line associated with the location and having a slope of a magnitude equal to the maximum slope, and using the line to determine a subset of sensor data points that lies outside a region defined by the line. The subset of sensor data points that lies outside the region may be excluded from the set of sensor data points used to generate ground profile(s), and a new ground profile may be determined using the set of sensor data points, less the subset of sensor data points.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
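The exclusion step can be sketched in a simplified one-sided form, bounding valid ground by a line of maximum slope through the nearest point; the anchor choice and sample points are assumptions:

```python
def filter_ground_points(points, max_slope):
    """Remove (range, height) sensor points lying above a max-slope line
    anchored at the first (nearest) point; such points likely come from
    reflections rather than the road surface."""
    x0, z0 = points[0]
    kept = []
    for x, z in points:
        # The line through the anchor with slope +max_slope bounds valid ground.
        if z - z0 <= max_slope * (x - x0):
            kept.append((x, z))
    return kept

pts = [(0.0, 0.0), (1.0, 0.05), (2.0, 1.5), (3.0, 0.2)]
ground = filter_ground_points(pts, max_slope=0.15)  # the 1.5 m spike is excluded
```

A new ground profile would then be fit to `ground`, i.e., the original set less the excluded subset, as the abstract describes.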
22.
CAPTURING AND SIMULATING RADAR DATA FOR AUTONOMOUS DRIVING SYSTEMS
A simulation system may generate radar data for synthetic simulations of autonomous vehicles, by using a data store of object radar data and attributes determined from sensor data captured in real-world physical environments. The radar data store may include radar point clouds representing real-world objects and associated object attributes, as well as radar background data captured for a number of physical environments. The simulation system may construct radar data for use in a simulation based on radar object data and/or radar background data, including using various probabilities within various overlay regions to determine subsets of object and background radar points to be rendered. During a simulation, the generated radar data may be provided to a simulated radar sensor of a simulated vehicle configured to execute trained perception models based on radar data input.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for generating more accurate determinations of object proximity by using vectors in data structures based on vehicle sensor data are disclosed. Vectors reflecting a distance and direction to a nearest object edge from a reference point in a data structure are used to determine a distance and direction from a point of interest in an environment to a nearest surface. In some examples, a weighted average query point response vector is determined using the determined distance vectors of cells neighboring the cell in which the point of interest is located and nearest to the same object as the query point, providing a more accurate estimate of the distance to the nearest object from the point of interest.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
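The weighted-average query response can be sketched as follows; the neighbor vectors and weights are illustrative stand-ins for cells nearest the same object:

```python
def query_distance(cell_vectors, weights):
    """Weighted average of neighboring cells' nearest-edge vectors, giving
    a smoother distance/direction estimate than any single cell's vector."""
    total_w = sum(weights)
    avg = [
        sum(w * v[i] for v, w in zip(cell_vectors, weights)) / total_w
        for i in range(2)
    ]
    distance = (avg[0] ** 2 + avg[1] ** 2) ** 0.5
    return avg, distance

# Nearest-edge vectors stored in two neighboring cells, one weighted higher
# because the query point lies closer to it.
vectors = [(1.0, 0.0), (0.6, 0.8)]
avg, dist = query_distance(vectors, weights=[1.0, 3.0])
```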
24.
RADAR OBJECT CLASSIFICATION BASED ON RADAR CROSS-SECTION DATA
This disclosure describes techniques for using radar cross-section (RCS) data to classify objects detected by autonomous vehicles within driving environments. In some examples, the variance of the RCS data associated with an object may be evaluated to determine signal interference caused by multipath fading. The variance of the RCS data may be used to classify the object and to determine whether the autonomous vehicle can safely drive over the object. For instance, objects such as manhole covers, storm drains, and expansion joints may return a significant radar signal but exhibit low RCS variance, indicating that they can be driven over by the vehicle. Based on the classification of the object, the autonomous vehicle may determine a trajectory around the object or directly over the object.
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G01S 7/41 - Using analysis of echo signal for target characterisation; Target signature; Target cross-section
B60W 30/08 - Predicting or avoiding probable or impending collision
G01S 13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G05D 1/02 - Control of position or course in two dimensions
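The variance-based classification can be sketched with a simple threshold; the threshold value and the sample RCS sequences are assumptions:

```python
def rcs_variance(samples):
    # Population variance of the observed RCS values.
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def drivable_over(rcs_samples, variance_threshold=4.0):
    """Low RCS variance across observations suggests a flat surface
    (manhole cover, expansion joint) the vehicle can drive over; high
    variance suggests multipath fading from a raised object to avoid."""
    return rcs_variance(rcs_samples) < variance_threshold

flat = drivable_over([10.0, 10.5, 9.8, 10.2])   # steady returns: low variance
raised = drivable_over([2.0, 14.0, 5.0, 11.0])  # fluctuating returns: high variance
```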
A computer-implemented method includes obtaining pointwise data indicating, for a plurality of time steps, a pointwise measurement of a state of an object detected by an object detection system; obtaining, from a runtime model, runtime data indicating, for the plurality of time steps, a runtime estimate of the state of the object; processing, by a benchmark model, the pointwise data to determine, for the plurality of time steps, a benchmark estimate of the state of the object; evaluating a metric measuring, for the plurality of time steps, a deviation between the runtime estimate and the benchmark estimate of the state of the object; and updating, based on the evaluation of the metric, the runtime model.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 10/62 - Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
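The deviation metric might be sketched as a mean absolute gap over time steps, with a tolerance deciding whether to flag the runtime model for an update; both the metric form and the tolerance are assumptions:

```python
def mean_abs_deviation(runtime_estimates, benchmark_estimates):
    """Metric over time steps: average absolute gap between the runtime
    model's state estimates and the benchmark model's estimates."""
    pairs = zip(runtime_estimates, benchmark_estimates)
    return sum(abs(r - b) for r, b in pairs) / len(runtime_estimates)

def needs_update(runtime_estimates, benchmark_estimates, tolerance=0.5):
    # Flag the runtime model for retraining when it drifts from the benchmark.
    return mean_abs_deviation(runtime_estimates, benchmark_estimates) > tolerance

dev = mean_abs_deviation([1.0, 2.0, 3.2], [1.1, 2.4, 3.0])
flag = needs_update([1.0, 2.0, 3.2], [1.1, 2.4, 3.0])
```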
26.
LIDAR POINT CLOUD ALIGNMENT VALIDATOR IN HD MAPPING
Techniques are described for determining whether a point cloud registration (e.g., alignment) between two sets of data points is valid. Such techniques can include determining voxelized representations of the sets of data points, and comparing characteristics of spatially aligned voxels within the voxelized representations. Characteristics of voxels to be compared can include classification labels of data points associated with voxels, including whether or not voxels correspond to free space. Point cloud registrations determined to be invalid can be given a weighting to be used in a subsequent high definition (HD) map building process. Generated maps can then be deployed for use in autonomous vehicles.
Techniques for determining an output from a plurality of sensor modalities are discussed herein. Features from a radar sensor, a lidar sensor, and an image sensor may be input into respective models to determine respective intermediate outputs, and associated confidence levels, for a track associated with an object. The intermediate outputs from a radar model, a lidar model, and a vision model may be input into a fused model to determine a fused confidence level and a fused output associated with the track. The fused confidence level and the individual confidence levels are compared to a threshold to generate the track to transmit to a planning system or prediction system of an autonomous vehicle. Additionally, a vehicle controller can control the autonomous vehicle based on the track and/or on the confidence level(s).
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
28.
IDENTIFYING RELEVANT OBJECTS WITHIN AN ENVIRONMENT
This disclosure is directed to techniques for identifying relevant objects within an environment. For instance, a vehicle may use sensor data to determine a candidate trajectory associated with the vehicle and a predicted trajectory associated with an object. The vehicle may then use the candidate trajectory and the predicted trajectory to determine an interaction between the vehicle and the object. Based on the interaction, the vehicle may determine a time difference between when the vehicle is predicted to arrive at a location and when the object is predicted to arrive at the location. The vehicle may then determine a relevance score associated with the object using the time difference. Additionally, the vehicle may determine whether to input object data associated with the object into a planner component based on the relevance score. The planner component determines one or more actions for the vehicle to perform.
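A simple sketch of mapping the arrival-time gap to a relevance score: the linear form, the horizon, and the threshold are assumptions for illustration:

```python
def relevance_score(time_gap_s, horizon_s=8.0):
    """Map the gap between the vehicle's and the object's predicted
    arrival times at a shared location to [0, 1]: near-simultaneous
    arrival scores high; gaps beyond the horizon score zero."""
    if time_gap_s >= horizon_s:
        return 0.0
    return 1.0 - time_gap_s / horizon_s

def send_to_planner(time_gap_s, threshold=0.5):
    # Only object data scoring above the threshold is fed to the planner.
    return relevance_score(time_gap_s) >= threshold

score = relevance_score(2.0)        # 2 s gap within an 8 s horizon
relevant = send_to_planner(2.0)
ignored = send_to_planner(6.0)      # large gap: object filtered out
```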
Techniques are described herein for generating trajectories for autonomous vehicles using velocity-based steering limits. A planning component of an autonomous vehicle can receive steering limits determined based on safety requirements and/or kinematic models of the vehicle. Discontinuous and discrete steering limit values may be converted into a continuous steering limit function for use during on-vehicle trajectory generation and/or optimization operations. When the vehicle is traversing a driving environment, the planning component may use steering limit functions to determine a set of situation-specific steering limits associated with the particular vehicle state and/or driving conditions. The planning component may execute loss functions, including steering angle and/or steering rate costs, to determine a vehicle trajectory based on the steering limits applicable to the current vehicle state.
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
30.
AUTONOMOUS VEHICLE OPERATIONS RELATED TO DETECTION OF AN UNSAFE PASSENGER PICKUP/DELIVERY CONDITION
A passenger may be vulnerable to safety risks during pickup and/or drop-off by a vehicle. To mitigate or eliminate such risk, the vehicle may determine an endpoint for a vehicle route to pickup or drop-off a passenger at a location. The vehicle may determine an estimated path between the endpoint and the location and may determine a safety confidence score by a machine-learned model for the estimated path and/or may predict a trajectory of a detected object to ascertain whether the estimated path is safe. The vehicle may execute any of a number of different mitigation actions to reduce or eliminate a safety risk if one is detected.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60R 25/34 - Detection related to theft or to other events relevant to anti-theft systems of conditions of vehicle components, e.g. of windows, door locks or gear selectors
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for detecting and tracking objects in an environment are discussed herein. For example, techniques can include detecting a center point of a block of pixels associated with an object. Unimodal (e.g., Gaussian) confidence values may be determined for a group of pixels associated with an object. Proposed detection box center points may be determined based on the Gaussian confidence values of the pixels and an output detection box may be determined using filtering and/or suppression techniques. Further, a machine-learned model can be trained by determining parameters of a center pixel of the detection box and a focal loss based on the unimodal confidence value, which can then be backpropagated to the other pixels of the detection box.
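The unimodal confidence idea can be sketched as follows. The function names, the sigma value, and the greedy argmax proposal are illustrative assumptions for exposition, not the trained model from the disclosure: each pixel's confidence peaks at the box center and decays with distance, and the highest-confidence pixel is taken as the proposed center.

```python
import math

def gaussian_confidence(px: int, py: int, cx: float, cy: float, sigma: float = 2.0) -> float:
    """Unimodal (Gaussian) confidence of a pixel, peaking at the box center."""
    d2 = (px - cx) ** 2 + (py - cy) ** 2
    return math.exp(-d2 / (2.0 * sigma ** 2))

def propose_center(pixels, cx, cy):
    """Return the pixel with the highest Gaussian confidence as the
    proposed detection-box center point."""
    return max(pixels, key=lambda p: gaussian_confidence(p[0], p[1], cx, cy))
```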
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for a perception system of a vehicle that can detect and track objects in an environment are described herein. The perception system may include a machine‑learned model that includes one or more different portions, such as different components, subprocesses, or the like. In some instances, the techniques may include training the machine-learned model end-to-end such that outputs of a first portion of the machine‑learned model are tailored for use as inputs to another portion of the machine‑learned model. Additionally, or alternatively, the perception system described herein may utilize temporal data to track objects in the environment of the vehicle and associate tracking data with specific objects in the environment detected by the machine‑learned model. That is, the architecture of the machine‑learned model may include both a detection portion and a tracking portion in the same loop.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
33.
ENCODING RELATIVE OBJECT INFORMATION INTO NODE EDGE FEATURES
Techniques for determining unified futures of objects in an environment are discussed herein. Techniques may include determining a first feature associated with an object in an environment and a second feature associated with the environment and based on a position of the object in the environment, updating a graph neural network (GNN) to encode the first feature and second feature into a graph node representing the object and encode relative positions of additional objects in the environment into one or more edges attached to the node. The GNN may be decoded to determine a predicted position of the object at a subsequent timestep. Further, a predicted trajectory of the object may be determined using predicted positions of the object at various timesteps.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
A system for faster object attribute and/or intent classification may include a machine-learned (ML) architecture that processes temporal sensor data (e.g., multiple instances of sensor data received at different times) and includes a cache in an intermediate layer of the ML architecture. The ML architecture may be capable of classifying an object's intent to enter a roadway, idling near a roadway, or active crossing of a roadway. The ML architecture may additionally or alternatively classify indicator states, such as indications to turn, stop, or the like. Other attributes and/or intentions are discussed herein.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60T 7/20 - Brake-action initiating means for initiation not subject to will of driver or passenger specially adapted for trailers, e.g. in case of uncoupling of trailer
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
Techniques are discussed herein for generating, evaluating, and determining trajectories for autonomous vehicles traversing environments. A state transition model may be generated and used to determine a trajectory from multiple possible trajectories generated by one or more vehicle systems. In some examples, a state transition model may determine a trajectory based on the validation results of the possible trajectories, along with vehicle status data from one or more vehicle components. Various techniques described herein may improve vehicle safety and driving efficiency, by ensuring that the vehicle determines a safe and valid trajectory consistently while navigating the environment, while also being responsive to requests and status updates from various vehicle components.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for using a set of variables to estimate a vehicle velocity of a vehicle are discussed herein. A system may determine an estimated velocity of the vehicle using a minimization based on an initial estimated velocity, steering angle data and wheel speed data. The system may then control an operation of the vehicle based at least in part on the estimated velocity.
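One way such a minimization could look, as a sketch only: model each wheel's measured speed as the body velocity projected through its steering angle, and add a quadratic regularizer pulling the solution toward the initial estimate. The model, the regularizer weight, and the closed-form solution are illustrative assumptions, not the disclosure's actual estimator.

```python
import math

def estimate_velocity(v_init, wheel_speeds, steer_angles, reg=0.1):
    """Closed-form minimizer of
        sum_i (s_i - v * cos(a_i))**2 + reg * (v - v_init)**2
    where s_i is a wheel speed and a_i its steering angle; `reg` trades off
    the wheel measurements against the prior estimate v_init."""
    num = reg * v_init
    den = reg
    for s, a in zip(wheel_speeds, steer_angles):
        c = math.cos(a)
        num += s * c
        den += c * c
    return num / den
```

With four straight wheels all reading 5.0 m/s and a matching prior, the estimate is exactly 5.0; disagreeing wheel readings are averaged, softened toward the prior.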
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
37.
DRIVABLE SURFACE MAP FOR AUTONOMOUS VEHICLE NAVIGATION
The present disclosure is related to generating map data explicitly indicating a total drivable surface, which may include multiple types of drivable surfaces. For instance, a given portion of a map may include map data indicating a combination of various drivable surfaces, such as road segments, lane properties, intersections, parking areas, shoulders, driveways, etc. Examples of the present disclosure join these different types of drivable surfaces into combined map data that explicitly indicates a total drivable surface, such as a perimeter boundary indicating or representing a transition from a drivable surface to a non-drivable surface. The map data indicating the total drivable surface may be searched to determine information related to a drivable surface boundary, such as location and type. This boundary information may be used in various contexts, such as when planning a trajectory or remotely controlling a vehicle.
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for interacting with authorized and unauthorized users by a door interface component are discussed herein. A vehicle computing device can implement the door interface component and/or an authentication component to control operation of a door of the vehicle or initiate a communication for assistance. For instance, the door interface component can include a button that provides different visual indicators and functionality based on whether the user is authorized to enter the autonomous vehicle (e.g., open a door or provide visual indicators for the user to select to cause the door to open) or to request to move the vehicle, initiate hiring the vehicle, or call for help when the user is unauthorized to enter the autonomous vehicle.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
39.
VERIFYING REPRODUCIBILITY FOR A VEHICLE CONTROLLER
Techniques associated with detecting non-deterministic behavior with a component and/or subcomponents of an autonomous vehicle are discussed herein. In some cases, a simulation system may be configured to simulate operations of the autonomous vehicle and to detect changes in behavior between instances and with respect to log data or expected results. The simulation system may flag or otherwise identify components and/or subcomponents responsive to detecting potentially non-deterministic behavior.
G06F 30/20 - Design optimisation, verification or simulation
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for converting power received from a power grid at a first voltage and outputting a signal at a second voltage are discussed herein. A power converter with a transformer that has a 22.5-degree phase shift between current output by corresponding pairs of secondary windings can be utilized to convert power of a first level to power of a second level. The transformer can output power from 30 secondary windings. The power converter can output power with a total harmonic distortion of 5% and an efficiency of 96% or higher. Further, power can be output by a transmission coil and received by a receive coil in a device, such as a vehicle, to wirelessly charge the vehicle.
B60L 53/38 - Means for automatic or assisted adjustment of the relative position of charging devices and vehicles specially adapted for charging by inductive energy transfer
B60L 53/20 - Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles characterised by converters located in the vehicle
B60L 55/00 - Arrangements for supplying energy stored within a vehicle to a power network, i.e. vehicle-to-grid [V2G] arrangements
H02M 7/04 - Conversion of ac power input into dc power output without possibility of reversal by static converters
H02M 1/42 - Circuits or arrangements for compensating for or adjusting power factor in converters or inverters
Techniques for a pose component that may determine a pose are described herein. A pose may refer to the inertial pose or position of a vehicle which may be updated in real-time or near real-time. For example, the techniques may include receiving a plurality of input signals at a pose component and monitoring the plurality of input signals. The pose component may determine, based on the monitoring of the plurality of input signals, a particular pose update algorithm of a plurality of pose update algorithms for determining the pose and determine, using the particular pose update algorithm, the pose based on the plurality of input signals and IMU measurements associated with a primary IMU.
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
B60W 50/029 - Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for aggregating costs associated with one or more heat maps to control a vehicle in an environment are discussed herein. A vehicle computing device can implement a model to determine heat maps and respective cost information for different features of the environment based on sensor data. The vehicle computing device can output a planned trajectory for the vehicle based on combining the heat maps. The techniques can also include determining a rationalization or root cause detailing reasons why the planned trajectory was determined.
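A minimal sketch of the aggregation step, under illustrative assumptions (plain nested lists standing in for heat maps, a weighted cell-wise sum as the combination, and total visited-cell cost as the trajectory score):

```python
def combine_heatmaps(heatmaps, weights):
    """Cell-wise weighted sum of per-feature cost maps (all the same shape)."""
    rows, cols = len(heatmaps[0]), len(heatmaps[0][0])
    return [[sum(w * hm[r][c] for hm, w in zip(heatmaps, weights))
             for c in range(cols)] for r in range(rows)]

def best_trajectory(trajectories, cost_map):
    """Pick the candidate whose visited (row, col) cells accumulate the
    least combined cost."""
    def cost(traj):
        return sum(cost_map[r][c] for r, c in traj)
    return min(trajectories, key=cost)
```

A rationalization, as the abstract describes, could then be produced by reporting which individual heat map contributed the largest share of the winning trajectory's cost.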
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
Techniques for detecting and classifying objects using lidar data are discussed herein. In some cases, the system may be configured to utilize a predetermined number of prior frames of lidar data to assist with detecting and classifying objects. In some implementations, the system may utilize a subset of the data associated with the prior lidar frames together with the full set of data associated with a current frame to detect and classify the objects.
Techniques for utilizing a depth completion algorithm to determine dense depth data are discussed herein. Two-dimensional image data representing an environment can be captured or otherwise received. Depth data representing the environment can be captured or otherwise received. The depth data can be projected into the image data and processed using the depth completion algorithm. The depth completion algorithm can be utilized to determine the dense depth values based on intensity values of pixels and other depth values. A vehicle can be controlled based on the determined depth values.
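A toy sketch of intensity-guided depth completion, under illustrative assumptions (missing depth marked as None, a single pass over 4-neighbours, and an exponential intensity-similarity weight; the disclosure's actual algorithm is not specified at this level of detail):

```python
import math

def complete_depth(depth, intensity, sigma=10.0):
    """Fill missing depth values (None) with an intensity-weighted average
    of valid 4-neighbours: neighbours whose image intensity is similar to
    the hole pixel get higher weight, a common guided-completion heuristic."""
    rows, cols = len(depth), len(depth[0])
    out = [row[:] for row in depth]
    for r in range(rows):
        for c in range(cols):
            if depth[r][c] is not None:
                continue
            num = den = 0.0
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and depth[nr][nc] is not None:
                    w = math.exp(-abs(intensity[r][c] - intensity[nr][nc]) / sigma)
                    num += w * depth[nr][nc]
                    den += w
            if den > 0:
                out[r][c] = num / den
    return out
```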
Techniques for generating and validating map data that may be used by a vehicle to traverse an environment are described herein. The techniques may include receiving sensor data representing an environment and receiving map data indicating a traffic control annotation. The traffic control annotation may be associated, as projected data, with the sensor data based at least in part on a position or orientation associated with a vehicle. Based at least in part on the association, the map data may be updated and sent to a fleet of vehicles. Additionally, based at least in part on the association the vehicle may determine to trust the sensor data more than the map data while traversing the environment.
Techniques for providing remote assistance to a vehicle are discussed. The techniques include receiving, from a vehicle, an indication of an event and displaying, on a display and to a user, a portion of an environment including the vehicle. The techniques further determine a valid region in the portion of the environment associated with a location at which the vehicle is capable of navigating. The techniques also display, on the display, a footprint of the vehicle, where the footprint is associated with a position and orientation. The techniques further include transmitting the position and orientation of the footprint to the vehicle, which causes the vehicle to traverse in the environment to the position and orientation.
Systems and methods for calibrating multiple inertial measurement units on a system include calibrating a first of the inertial measurement units relative to the system using a first calibration model, and calibrating the remaining inertial measurement unit(s) relative to the first inertial measurement unit using a second calibration model. The calibration of the remaining inertial measurement unit(s) to the first inertial measurement unit can be based on a rigid body model by aligning a rotational velocity of the first inertial measurement unit with a rotational velocity of the remaining inertial measurement unit(s).
B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
48.
TECHNIQUES FOR DETECTING ROAD BLOCKAGES AND GENERATING ALTERNATIVE ROUTES
The present disclosure involves systems and methods for detecting road blockages and generating alternative routes. In some cases, a system detects, based at least in part on sensor data associated with an autonomous vehicle, a portion of an environment that impedes a planned path of the autonomous vehicle. The system determines a semantic classification associated with the portion of the environment and transmits a re-routing request comprising the semantic classification to one or more remote computing devices. The system receives an instruction associated with navigating the autonomous vehicle around the portion of the environment from the remote computing devices, where the instruction includes an alternative route determined from a plurality of alternative routes. The system further controls the autonomous vehicle to navigate around the portion of the environment based at least in part on the instruction.
G05D 1/02 - Control of position or course in two dimensions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
49.
DETERMINING OCCUPANCY USING UNOBSTRUCTED SENSOR EMISSIONS
Techniques for determining occupancy using unobstructed sensor emissions are discussed herein. For instance, a vehicle may receive sensor data from one or more sensors. The sensor data may represent at least locations of points within an environment. Using the sensor data, the vehicle may determine areas within the environment that are obstructed by objects (e.g., locations where objects are located). The vehicle may also use the sensor data to determine areas within the environment that are unobstructed by objects (e.g., locations where objects are not located). In some examples, the vehicle determines the unobstructed areas as including areas that are between the vehicle and the identified objects. This is because sensor emissions from the sensor(s) passed through these areas and then reflected off of objects located farther from the vehicle. The vehicle may then generate a map indicating at least the obstructed areas and the unobstructed areas within the environment.
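The free-space reasoning above is the classic ray-casting idea: cells a beam passed through before its return must be unoccupied. A minimal grid sketch, with a sampled ray march and string cell states as illustrative simplifications:

```python
def occupancy_from_returns(size, origin, hits):
    """Build a size x size grid where cells holding a sensor return are
    'obstructed', cells the beam traversed on its way to the return are
    'free', and everything else stays 'unknown'."""
    grid = [["unknown"] * size for _ in range(size)]
    ox, oy = origin
    for hx, hy in hits:
        steps = max(abs(hx - ox), abs(hy - oy))
        for i in range(steps):  # cells strictly before the hit are free
            t = i / steps
            x = round(ox + (hx - ox) * t)
            y = round(oy + (hy - oy) * t)
            if grid[y][x] == "unknown":
                grid[y][x] = "free"
        grid[hy][hx] = "obstructed"
    return grid
```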
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Trajectory generation for controlling motion or other behavior of an autonomous vehicle may include alternately determining a candidate action and predicting a future state based on that candidate action. The technique may include determining a cost associated with the candidate action that may include an estimation of a transition cost from a current or former state to a next state of the vehicle. This cost estimate may be a lower bound cost or an upper bound cost and the tree search may alternately apply the lower bound cost or upper bound cost exclusively or according to a ratio or changing ratio. The prediction of the future state may be based at least in part on a machine-learned model's classification of a dynamic object as being a reactive object or a passive object, which may change how the dynamic object is modeled for the prediction.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
51.
SYSTEMATIC FAULT DETECTION IN VEHICLE CONTROL SYSTEMS
An evaluation computing system may implement techniques to validate a vehicle controller, such as based on a detection of a systematic fault. The evaluation computing system may access data (e.g., log data and/or map data) associated with an operation of the vehicle in an environment as controlled by the controller. The evaluation computing system may modify a portion of the map data representative of a simulated change associated with a portion of the environment. The evaluation computing system may run a simulation with a simulated environment, generated based on the modified map data, to determine whether the controller detects and/or mitigates the simulated change in a sufficient manner. Based on a determination of whether or not the controller detects and/or mitigates the simulated change in a sufficient manner, the evaluation computing system may determine an error associated with the controller or may validate the controller.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
52.
THREE-DIMENSIONAL OBJECT DETECTION BASED ON IMAGE DATA
Techniques are discussed herein for generating three-dimensional (3D) representations of an environment based on two-dimensional (2D) image data, and using the 3D representations to perform 3D object detection and other 3D analyses of the environment. 2D image data may be received, along with depth estimation data associated with the 2D image data. Using the 2D image data and associated depth data, an image-based object detector may generate 3D representations, including point clouds and/or 3D pixel grids, for the 2D image or particular regions of interest. In some examples, a 3D point cloud may be generated by projecting pixels from the 2D image into 3D space followed by a trained 3D convolutional neural network (CNN) performing object detection. Additionally or alternatively, a top-down view of a 3D pixel grid representation may be used to perform object detection using 2D convolutions.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
53.
RADAR DATA ANALYSIS AND CONCEALED OBJECT DETECTION
Techniques are discussed herein for analyzing radar data to determine that radar noise from one or more target detections potentially conceals additional objects near the target detection. Determining whether an object may be concealed can be based at least in part on a radar noise level based on a target detection, as well as distributions of radar cross sections and/or doppler data associated with particular object types. For a location near a target detection, a radar system may determine estimated noise levels, and compare the estimated noise levels to radar cross section probabilities associated with object types to determine the likelihood that an object of the object type could be concealed at the location. Based on the analysis, the system may determine a vehicle trajectory or otherwise may control a vehicle based on the likelihood that an object may be concealed at the location.
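The comparison between an estimated noise level and an object type's radar cross section (RCS) distribution can be sketched as a single probability calculation. Modeling the RCS as Gaussian in dB is an illustrative assumption; the disclosure only says distributions of radar cross sections are used.

```python
import math

def concealment_probability(noise_db: float, rcs_mean_db: float, rcs_std_db: float) -> float:
    """Probability that an object of a given type (RCS modeled as Gaussian
    in dB) returns less power than the local noise floor, and so could be
    concealed near a strong target detection. Gaussian CDF via math.erf."""
    z = (noise_db - rcs_mean_db) / rcs_std_db
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

A higher noise floor (e.g., near a large reflector) yields a higher concealment probability, which a planner could treat as a reason to slow down or give the location a wider berth.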
G01S 7/41 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group using analysis of echo signal for target characterisation; Target signature; Target cross-section
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for identifying curbs are discussed herein. For instance, a vehicle may generate sensor data using one or more sensors, where the sensor data represents points associated with a driving surface and a sidewalk. The vehicle may then quantize the points into distance bins that are located laterally along the driving direction of the vehicle in order to generate spatial lines. Next, the vehicle may determine separation points for the spatial lines, where the separation points are configured to separate the points associated with the driving surface from the points associated with the sidewalk. The vehicle may then generate, using the separation points, a curve that represents the curb between the driving surface and the sidewalk. This way, the vehicle may use the curve while navigating, such as to avoid the curb and/or stop at a location that is proximate to the curb.
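The separation-point step can be sketched directly: within each distance bin's spatial line, the curb shows up as a height step between road points and sidewalk points. The 0.08 m jump threshold and midpoint placement are illustrative assumptions.

```python
def separation_point(line, height_jump=0.08):
    """Given one spatial line of (lateral_offset, height) points sorted by
    offset, return the offset where height first steps up by more than
    `height_jump` metres - the road/sidewalk split for that bin."""
    for (x0, z0), (x1, z1) in zip(line, line[1:]):
        if z1 - z0 > height_jump:
            return (x0 + x1) / 2.0
    return None  # no curb found in this bin

def curb_points(lines):
    """One separation point per distance bin; fitting a smooth curve
    through these points gives the curb estimate."""
    return [separation_point(line) for line in lines]
```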
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
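As a rough illustration of the binning-and-separation approach described in the abstract above, the following sketch quantizes points into bins along the driving direction, takes the largest height jump in each bin's lateral scan as the road/sidewalk separation point, and fits a polynomial through those points. The function name, the height-jump heuristic, and the polynomial fit are illustrative assumptions, not the patented method.

```python
import numpy as np

def estimate_curb(points, bin_width=0.5, degree=3):
    """Sketch: per bin along the driving direction, find the lateral
    position where road points give way to sidewalk points, then fit
    a curve through those separation points.

    points: (N, 3) array of (x, y, z), with x the driving direction,
    y the lateral offset, and z the height (assumed layout).
    """
    separations = []
    bins = np.floor(points[:, 0] / bin_width).astype(int)
    for b in np.unique(bins):
        line = points[bins == b]            # one "spatial line" per bin
        order = np.argsort(line[:, 1])      # scan laterally
        z = line[order, 2]
        if len(z) < 2:
            continue
        # Heuristic: the largest height jump separates road from sidewalk.
        jump = int(np.argmax(np.abs(np.diff(z))))
        separations.append(line[order][jump, :2])
    separations = np.asarray(separations)
    # Fit a polynomial y(x) through the separation points as the curb curve.
    coeffs = np.polyfit(separations[:, 0], separations[:, 1], deg=degree)
    return np.poly1d(coeffs)
```

The returned polynomial can then be queried along the driving direction, e.g. to keep a stopping position a fixed lateral margin from the curb.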
55.
ESTIMATING ANGLE OF A VEHICLE WHEEL BASED ON NON-STEERING VARIABLES
Techniques for using a set of non-steering variables to estimate an angle of a wheel are described. For example, a yaw rate, a linear velocity of a wheel, and vehicle dimensions (e.g., offset between the wheel and a turn-center reference line), can be used to estimate the angle of the wheel. Among other things, estimating angles based on non-steering variables may provide redundancy (e.g., when determined in parallel with steering-based command angles or other commanded angles) and/or may be used to validate commanded angles based on steering components.
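Under a rigid-body turning assumption, one way to realize such an estimate is via the relation sin(δ) = L·ω / v, where δ is the wheel angle, L the wheel's longitudinal offset from the turn-center reference line, ω the yaw rate, and v the wheel's linear speed. A minimal sketch (the offset convention and names are assumptions, not taken from the patent):

```python
import math

def estimate_wheel_angle(yaw_rate, wheel_speed, wheel_offset):
    """Estimate a wheel's steering angle from non-steering variables.

    yaw_rate: vehicle yaw rate in rad/s.
    wheel_speed: linear speed of the wheel in m/s.
    wheel_offset: longitudinal distance (m) from the turn-center
        reference line (e.g. the rear axle) to the wheel.
    """
    if abs(wheel_speed) < 1e-6:
        return 0.0  # no meaningful estimate at standstill
    # Rigid-body turn geometry: sin(delta) = offset * omega / v.
    ratio = wheel_offset * yaw_rate / wheel_speed
    ratio = max(-1.0, min(1.0, ratio))  # guard against noise pushing |ratio| > 1
    return math.asin(ratio)
```

Such an estimate, computed in parallel with the commanded steering angle, could serve as the redundancy/validation signal the abstract describes.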
Sensors, including radar sensors, may be used to detect objects in an environment. In an example, a vehicle may include one or more radar sensors that sense objects around the vehicle, e.g., so the vehicle can navigate relative to the objects. A plurality of radar points from one or more radar scans are associated with a sensed object and a representation of the sensed object is determined from the plurality of radar points. The representation may be compared to track information of previously-identified, tracked objects. Based on the comparison, the sensed object may be associated with one of the tracked objects, and the track information may be updated based on the representation. Conversely, the comparison may indicate that the sensed object is not associated with any of the tracked objects. In this instance, the representation may be used to generate a new track, e.g., for the newly-sensed object.
A charging system includes a charging station having an input configured to receive a first type of electrical power, and a power converter connected to the input. The power converter is configured to convert the first type of electrical power from the input to a second type of electrical power different from the first type of electrical power, the second type of electrical power including DC electrical power. The charging station has outputs connected to the power converter, the outputs configured such that DC electrical power is providable to each of the outputs simultaneously. Each of the outputs is configured to connect to a respective electric vehicle for charging of the electric vehicle.
B60L 53/24 - Using the vehicle's propulsion converter for charging
B60L 53/67 - Controlling two or more charging stations
B60L 58/10 - Methods or circuit arrangements for monitoring or controlling batteries or fuel cells, specially adapted for electric vehicles for monitoring or controlling batteries
B60L 53/35 - Means for automatic or assisted adjustment of the relative position of charging devices and vehicles
A transportation system controls a fleet of autonomous vehicles to implement passenger transportation and coordinate delivery of baggage or other associated items using separate vehicles. The transportation system receives passenger data and associated item data via a user interface, and determines the number and type of autonomous vehicles to transport the passengers and items from selected pick-up locations to a destination. In various implementations, the transportation system may support different pick-up locations, pick-up times and/or delivery times for the passengers and associated items. The transportation system also may determine delayed item delivery options for different delivery times and modes of transportation. Based on the passenger and item data, along with input received via the user interface, the transportation system determines the vehicles to deploy and the delivery routes, and transmits instructions to the autonomous vehicles to provide the passenger transportation and perform the item delivery.
G06Q 10/08 - Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
Techniques for accurately predicting and avoiding collisions with objects detected in an environment of a vehicle are discussed herein. A vehicle computing device can implement a model to output data indicating costs for potential intersection points between the object and the vehicle in the future. The model may employ a control policy and a time-step integrator to determine whether an object may intersect with the vehicle, in which case the techniques may include predicting vehicle actions by the vehicle computing device to control the vehicle.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
A wireless-charging adapter is connectable to a direct current (DC) fast charger and includes an induction coil for wirelessly charging a second induction coil on a vehicle. In some instances, the adapter may include an electrical connector to mate with a DC fast charger. In addition, the adapter may include hardware and/or software to receive DC power from the DC fast charger and provide an alternating current (AC) to the induction coil. The induction coil of the adapter may be positioned (e.g., on a ground surface) to align with an induction coil on a vehicle.
B60L 53/34 - Plug-like or socket-like devices specially adapted for contactless inductive charging of electric vehicles
B60L 53/126 - Methods for pairing a vehicle and a charging station, e.g. establishing a one-to-one relation between a wireless power transmitter and a wireless power receiver
B60L 53/62 - Monitoring or controlling charging stations in response to charging parameters, e.g. current, voltage or electrical charge
B60L 53/122 - Circuits or methods for driving the primary coil, i.e. supplying electric power to the coil
B60L 53/30 - Constructional details of charging stations
B60L 53/66 - Data transfer between charging stations and vehicles
B60L 53/65 - Monitoring or controlling charging stations involving identification of vehicles or their battery types
B60L 53/10 - Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles characterised by the energy transfer between the charging station and the vehicle
Techniques for vehicle deceleration planning are discussed. The techniques include determining a first location and a first velocity of a vehicle. The techniques further include determining a second location and a second velocity of an object. Based on the first location, the second location, the first velocity, and the second velocity, a relative stopping distance between the vehicle and the object can be determined. If the relative stopping distance is less than a threshold distance, the first maximum deceleration value can be increased to a second maximum deceleration value, and the techniques determine a trajectory for the vehicle based at least in part on the second maximum deceleration value.
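The relative-stopping-distance check described above can be sketched in a few lines; all numeric values, names, and the 1-D lane assumption are illustrative, not taken from the patent.

```python
def plan_max_decel(ego_pos, ego_vel, obj_pos, obj_vel,
                   base_decel=4.0, boosted_decel=7.0, threshold=5.0):
    """Return the maximum deceleration the planner may use.

    Positions and velocities are 1-D along the lane, with the object
    ahead of the ego vehicle. All constants are illustrative.
    """
    # Stopping distance at the base deceleration: v^2 / (2 * a).
    ego_stop = ego_vel ** 2 / (2.0 * base_decel)
    obj_stop = obj_vel ** 2 / (2.0 * base_decel)
    # Margin between the two stopped positions (the relative stopping distance).
    relative_stopping_distance = (obj_pos - ego_pos) + obj_stop - ego_stop
    if relative_stopping_distance < threshold:
        return boosted_decel  # permit harder braking in trajectory planning
    return base_decel
```

The returned value would then bound the deceleration used when generating the vehicle trajectory.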
Techniques for analyzing a parameter space are discussed. Techniques may include receiving policy data for evaluating a vehicle controller. The techniques may further include determining, using a Bayesian optimization and based at least in part on the vehicle controller, parameter sets associated with adverse events. The adverse events may be associated with a violation of the policy data. The techniques may associate, based on exposure data, parameter bounds of the adverse events and probabilities of the adverse events in a driving environment. A safety metric may be determined based on the Bayesian optimization. The techniques may also include weighting an impact of an adverse event based on the safety metric.
B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for determining vehicle trajectories to operate a vehicle according to a planned path are described herein. In an example, a vehicle computing system may determine a location of the vehicle at a first time. Based on the location, the vehicle computing system may determine an estimated location of the vehicle at a second time, the estimated location of the vehicle including a lateral coordinate and a longitudinal coordinate. The vehicle computing system may determine the longitudinal coordinate based on a vehicle trajectory associated with the first time (e.g., previously determined trajectory) and the lateral coordinate based on the planned path. The vehicle computing system may determine a second vehicle trajectory based in part on the estimated location and the first trajectory, and may control the vehicle according to the second vehicle trajectory.
A sensor simulation system may generate sensor data for use in simulations by rendering two-dimensional views of a three-dimensional simulated environment. In various examples, the sensor simulation system uses sensor dependency data to determine specific views to be re-rendered at different times during the simulation. The sensor simulation system also may generate unified views with multi-sensor data at each region (e.g., pixel) of the two-dimensional view for consumption by different sensor types. A hybrid technique may be used in some implementations in which rasterization is used to generate a view, after which ray tracing is used to align the view with a particular sensor. Spatial and temporal upsampling techniques also may be used, including depth-aware and velocity-aware analyses for simulated objects, to improve view resolution and reduce the frequency of re-rendering views.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for determining a probability of a false negative associated with a location of an environment are discussed herein. Data from a sensor, such as a radar sensor, can be received that includes point cloud data, which includes first and second data points. The first data point has a first attribute and the second data point has a second attribute. A difference between the first and second attributes is determined such that a frequency distribution may be determined. The frequency distribution may then be used to determine a distribution function, which allows for the determination of a resolution function that is associated with the sensor. The resolution function may then be used to determine a probability of a false negative at a location in an environment. The probability can be used to control a vehicle in a safe and reliable manner.
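One plausible realization of the chain from attribute differences to a false-negative probability is sketched below. Using the empirical distribution of observed attribute differences as a proxy for the sensor's resolution function is an assumption made for illustration only.

```python
import numpy as np

def resolution_cdf(differences, bins=50):
    """Empirical frequency distribution of attribute differences (e.g.
    azimuth separation between neighboring radar points), reduced to a
    cumulative distribution function."""
    hist, edges = np.histogram(differences, bins=bins)
    cdf = np.cumsum(hist) / hist.sum()
    return edges[1:], cdf

def false_negative_probability(separation, edges, cdf):
    """Probability proxy that a return at the given separation from a
    neighboring return would NOT be resolved by the sensor."""
    idx = np.searchsorted(edges, separation)
    p_resolved = cdf[min(idx, len(cdf) - 1)] if idx > 0 else 0.0
    return 1.0 - p_resolved
```

Separations smaller than any observed difference map to a probability near one (the sensor has never resolved returns that close), and large separations map toward zero.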
Techniques for accurately predicting and avoiding collisions with objects detected in an environment of a vehicle are discussed herein. A vehicle safety system can implement a model to output data indicating an intersection probability between the object and a portion of the vehicle in the future. The model may employ a rear collision filter, a distance filter, and a time to stop filter to determine whether a predicted collision may be a false positive, in which case the techniques may include refraining from reporting such a predicted collision to another vehicle computing device to control the vehicle.
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
Techniques for using ball joint sensor data to determine conditions relevant to a vehicle are described in this disclosure. In one example, the ball joint sensor data may be used to determine a ride height at a portion of the vehicle, which may be used to determine roll data and/or pitch data. The ride height, roll data, and/or pitch data may be used directly to navigate through an environment, such as by the vehicle relying on the data when interpreting sensor data or planning driving operations. The ride height, roll data, and/or pitch data may also be used to verify the reliability of other sensor data used to navigate through the environment.
B60G 17/019 - Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, e.g. due to speed or load, the regulating means comprising electric or electronic elements characterised by the type of sensor or the arrangement thereof
This disclosure relates to systems and techniques for identifying collisions, such as relatively low energy impact collisions involving an autonomous vehicle. Sensor data from a first sensor modality in a first array may be used to determine a first estimated location of impact and second sensor data from a second sensor modality in a second array may be used to determine a second estimated location of impact. A low energy impact event may be determined when the first estimated location of impact corresponds to the second estimated location of impact.
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
H04R 5/027 - Spatial or constructional arrangements of microphones, e.g. in dummy heads
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
69.
METHODS AND SYSTEMS TO ASSESS VEHICLE CAPABILITIES
Performance anomalies in an autonomous vehicle can be difficult to identify, and the impact of such anomalies on systems within the autonomous vehicle may be difficult to understand. In examples, systems of the autonomous vehicle are modeled as nodes in a probabilistic graphical network. Probabilities of data generated at each of the nodes are determined. The probabilities are used to determine capabilities associated with higher-level functions of the autonomous vehicle.
B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
70.
VERIFYING RELIABILITY OF DATA USED FOR AUTONOMOUS DRIVING
Techniques for verifying a reliability of map data are discussed herein. In some examples, map data can be used by a vehicle, such as an autonomous vehicle, to traverse an environment. Sensor data (e.g., image data, lidar data, etc.) can be received from a sensor associated with a vehicle and may be used to generate an estimated map and confidence values associated with the estimated map. When the sensor data is image data, image data from multiple perspectives or different time instances may be combined to generate the estimated map. The estimated map may be compared to a stored map or to a proposed vehicle trajectory or corridor to determine a reliability of the stored map data.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
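A minimal sketch of comparing an estimated map against a stored map with per-cell confidence weighting; the mask representation, agreement score, and threshold are illustrative assumptions rather than the patented comparison.

```python
import numpy as np

def map_reliability(estimated, stored, confidence, threshold=0.8):
    """Score agreement between an estimated map mask and a stored map
    mask, weighting each cell by the estimate's confidence. Returns
    (score, is_reliable)."""
    agree = (estimated == stored).astype(float)
    score = float((agree * confidence).sum() / (confidence.sum() + 1e-9))
    return score, score >= threshold
```

Cells where the estimate is uncertain contribute little to the score, so disagreements in low-confidence regions do not by themselves flag the stored map as unreliable.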
Techniques for adaptive cross-correlation are discussed. A first signal is received from a first audio sensor associated with a vehicle and a second signal is received from a second audio sensor associated with the vehicle. Techniques may include determining, based at least in part on the first signal, a first transformed signal in a frequency domain. Additionally, the techniques include determining, based at least in part on the second signal, a second transformed signal in the frequency domain. A parameter can be determined based at least in part on a characteristic associated with at least one of the vehicle, an environment proximate the vehicle, or one or more of the first or second signal. Cross-correlation data can be determined based at least in part on one or more of the first transformed signal, the second transformed signal, or the parameter.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
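A sketch of parameterized frequency-domain cross-correlation in the spirit of the abstract above, using a generalized cross-correlation weighting exponent as the adaptive parameter; this particular blending scheme is an assumption for illustration, not the patented method.

```python
import numpy as np

def adaptive_cross_correlation(sig_a, sig_b, weight=1.0):
    """Generalized cross-correlation in the frequency domain.

    weight=1.0 fully whitens the cross-spectrum (PHAT-style), which is
    robust in reverberant conditions; weight=0.0 is plain correlation.
    The weight could be chosen from vehicle/environment characteristics.
    """
    n = len(sig_a) + len(sig_b) - 1
    fa = np.fft.rfft(sig_a, n)                 # first transformed signal
    fb = np.fft.rfft(sig_b, n)                 # second transformed signal
    cross = fa * np.conj(fb)
    norm = np.abs(cross) ** weight + 1e-12     # adaptive normalization
    cc = np.fft.irfft(cross / norm, n)
    return np.fft.fftshift(cc)

def estimate_delay(sig_a, sig_b, weight=1.0):
    """Samples by which sig_a lags sig_b (peak of the cross-correlation)."""
    cc = adaptive_cross_correlation(sig_a, sig_b, weight)
    return int(np.argmax(cc)) - len(cc) // 2
```

The peak lag of the cross-correlation gives the inter-sensor delay, which downstream processing could convert to a direction of arrival.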
Techniques for top-down scene generation are discussed. A generator component may receive multi-dimensional input data associated with an environment. The generator component may generate, based at least in part on the multi-dimensional input data, a generated top-down scene. A discriminator component receives the generated top-down scene and a real top-down scene. The discriminator component generates binary classification data indicating whether an individual scene in the scene data is classified as generated or classified as real. The binary classification data is provided as a loss to the generator component and the discriminator component.
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
73.
INSTANTIATING OBJECTS IN A SIMULATED ENVIRONMENT BASED ON LOG DATA
Techniques for instantiating objects in a simulated environment based on log data are disclosed herein. Some of the techniques may include receiving log data representing an environment in which a real-world vehicle was operating. Using the log data, a simulated environment for testing a simulated vehicle may be generated. The simulated environment may represent the environment in which the real-world vehicle was operating. The techniques may further include determining a location of the simulated vehicle as it traverses the simulated environment. Based at least in part on the log data, a prior location of the real-world vehicle in the environment closest to the location of the simulated vehicle in the simulated environment may be determined. In this way, the simulated environment may be updated to include a simulated object representing an object in the environment that was perceived by the vehicle from the prior location.
Techniques are discussed herein for generating and using graph neural networks (GNNs) including vectorized representations of map elements and entities within the environment of an autonomous vehicle. Various techniques may include vectorizing map data into representations of map elements, and object data representing entities in the environment of the autonomous vehicle. In some examples, the autonomous vehicle may generate and/or use a GNN representing the environment, including nodes stored as vectorized representations of map elements and entities, and edge features including the relative position and relative yaw between the objects. Machine-learning inference operations may be executed on the GNN, and the node and edge data may be extracted and decoded to predict future states of the entities in the environment.
G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
Techniques are discussed herein for executing log-based driving simulations to evaluate the performance and functionalities of vehicle control systems. A simulation system may execute a log-based driving simulation including playback agents whose behavior is based on the log data captured by a vehicle operating in an environment. The simulation system may determine interactions associated with the playback agents, and may convert the playback agents to smart agents during the driving simulation. During a driving simulation, playback agents that have been converted to smart agents may interact with additional playback agents, causing a cascading effect of additional conversions. Converting playback agents to smart agents may include initiating a planning component to control the smart agent, which may be based on determinations of a destination and/or driving attributes based on the playback agent.
Techniques for facilitating robust clock synchronization across a computer network in which network jitter is presumed to exist are discussed herein. A first device and a second device transceive a plurality of sets of time-synchronization messages to synchronize a synchronization clock of the second device to a first clock of the first device. The second device calculates a smoothing of the time delay data across the plurality of sets, where the time delay data is associated with a transmission duration of the time-synchronization messages of each set. The second device then sets the synchronization clock based on a time at the first device and the smoothed time delay data.
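A minimal sketch of the smoothing idea, using an exponential moving average of the measured delays; the smoothing constant and class names are illustrative assumptions.

```python
class SmoothedClockSync:
    """Exponentially smooth the per-exchange time delay, then set the
    synchronization clock from the first device's time plus the
    smoothed delay (which absorbs the clock offset)."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha              # smoothing constant (illustrative)
        self.smoothed_delay = None

    def observe(self, t_sent, t_received):
        # Raw delay mixes true transmission time, jitter, and clock offset.
        delay = t_received - t_sent
        if self.smoothed_delay is None:
            self.smoothed_delay = delay
        else:
            self.smoothed_delay += self.alpha * (delay - self.smoothed_delay)

    def synchronized_time(self, t_first_device):
        # Second device's estimate of its own clock given the first clock.
        return t_first_device + self.smoothed_delay
```

Because single-exchange delays jitter, the smoothed value converges toward the typical delay-plus-offset across many message sets rather than tracking any one noisy sample.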
Techniques for collision avoidance using an object contour are discussed. A trajectory associated with a vehicle may be received. Sensor data can be received from a sensor associated with the vehicle. A bounding contour may be determined and associated with an object represented in the sensor data. Based on the trajectory, a simulated position of the vehicle can be determined. Additionally, a predicted position of the bounding contour can be determined. Based on the simulated position of the vehicle and the predicted position of the bounding contour, a distance between the vehicle and the object may be determined. An action can be performed based on the distance between the vehicle and the object.
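The distance computation between a simulated vehicle position and a predicted bounding contour might look like the following sketch, a plain point-to-polygon distance; names and the 2-D simplification are illustrative assumptions.

```python
import math

def min_distance_to_contour(position, contour):
    """Minimum Euclidean distance from a 2-D position to a closed
    polygonal bounding contour (list of (x, y) vertices)."""
    best = float("inf")
    n = len(contour)
    for i in range(n):
        (x1, y1), (x2, y2) = contour[i], contour[(i + 1) % n]
        dx, dy = x2 - x1, y2 - y1
        # Project the position onto the edge, clamped to its endpoints.
        t = ((position[0] - x1) * dx + (position[1] - y1) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))
        px, py = x1 + t * dx, y1 + t * dy
        best = min(best, math.hypot(position[0] - px, position[1] - py))
    return best
```

Evaluating this distance at each simulated vehicle position against the contour's predicted position yields the clearance that the action decision can be based on.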
Techniques for determining a classification probability of an object in an environment are discussed herein. Techniques may include analyzing sensor data associated with an environment from a perspective, such as a top-down perspective, using multi-channel data. From this perspective, techniques may determine channels of multi-channel input data and additional feature data. Channels corresponding to spatial features may be included in the multi-channel input data and data corresponding to non-spatial features may be included in the additional feature data. The multi-channel input data may be input to a first portion of a machine-learned (ML) model, and the additional feature data may be concatenated with intermediate output data from the first portion of the ML model, and input into a second portion of the ML model for subsequent processing and to determine the classification probabilities. Additionally, techniques may be performed on a multi-resolution voxel space representing the environment.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
Techniques for representing sensor data and predicted behavior of various objects in an environment are described herein. For example, an autonomous vehicle can represent prediction probabilities as an uncertainty model that may be used to detect potential collisions, define a safe operational zone or drivable area, and to make operational decisions in a computationally efficient manner. The uncertainty model may represent a probability that regions within the environment are occupied using a heat map type approach in which various intensities of the heat map represent a likelihood of a corresponding physical region being occupied at a given point in time.
G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
G05D 1/02 - Control of position or course in two dimensions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
This application relates to techniques for determining whether to engage an autonomous controller of a vehicle based on previously recorded data. A computing system may receive, from a vehicle computing system, data representative of a vehicle being operated in an environment, such as by an autonomous controller. The computing system may generate a simulation associated with the vehicle operation and configured to test an updated autonomous controller. The computing system may determine one or more first time periods associated with the vehicle operations that satisfy one or more conditions associated with engaging an autonomous controller and one or more second time periods associated with the vehicle operations that fail to satisfy the one or more conditions. The computing system may enable an engagement of the autonomous controller during the one or more first time periods and disable the engagement during the one or more second time periods.
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
A vehicle computing device may implement techniques to predict behavior of objects or predicted objects in an environment. The techniques may include using a model to determine whether a potential object will emerge from an occluded region in the environment. The model may be configured to use one or more algorithms, classifiers, and/or computational resources to predict an intersection point and/or an intersection time between the potential object and the vehicle. Based on the predicted intersection point and/or the predicted intersection time, the vehicle computing device may control operation of the vehicle.
Techniques associated with generating and maintaining sparse geographic and map data are described herein. In some cases, the system may maintain a factor graph comprising a plurality of nodes. The nodes may comprise pose data and sensor data associated with an autonomous vehicle at the geographic position represented by the node. The nodes may be linked based on shared trajectories and shared sensor data.
Techniques for determining a safety area for a vehicle are discussed herein. In some cases, a first safety area can be based on a vehicle travelling through an environment and a second safety area can be based on a steering control or a velocity of the vehicle. A width of the safety areas can be updated based on a position of a bounding box associated with the vehicle. The position can be based on the vehicle traversing along a trajectory. Sensor data can be filtered based on the sensor data falling within the safety area(s).
Techniques for estimating a direction of arrival of sound in an environment are discussed. First and second audio data are received from a first pair of audio sensors associated with a vehicle. A first region of ambiguity associated with the first pair of audio sensors is determined based on the first and second audio data. Third and fourth audio data are received from a second pair of audio sensors. A second region of ambiguity associated with the second pair of audio sensors is determined based on the third and fourth audio data. The regions of ambiguity can be further based on confidence levels associated with sensor or audio data. An area of intersection of the first region of ambiguity and the second region of ambiguity can be determined. A direction of arrival of an audio event can be determined based on the area of intersection.
G01S 3/80 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using ultrasonic, sonic, or infrasonic waves
G05D 1/02 - Control of position or course in two dimensions
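A simplified version of the intersection idea in the audio abstract above: each microphone pair constrains the bearing up to a mirror ambiguity about the pair's axis, and a second pair on a different axis resolves it. This sketch reduces each "region of ambiguity" to its two candidate bearings and ignores the confidence-level weighting the abstract mentions; all names and geometry are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def candidate_bearings(tdoa, baseline, axis_angle):
    """A mic pair along axis_angle gives cos(theta) = c * tdoa / baseline,
    leaving a mirror ambiguity about the pair's axis (two candidates)."""
    cos_theta = max(-1.0, min(1.0, SPEED_OF_SOUND * tdoa / baseline))
    theta = math.acos(cos_theta)
    two_pi = 2.0 * math.pi
    return {(axis_angle + theta) % two_pi, (axis_angle - theta) % two_pi}

def intersect_bearings(set_a, set_b, tol=0.1):
    """Direction of arrival ~ the candidate the two ambiguity sets agree on."""
    best, best_err = None, tol
    for a in set_a:
        for b in set_b:
            err = abs(math.atan2(math.sin(a - b), math.cos(a - b)))
            if err < best_err:
                best, best_err = a, err
    return best

# Simulate a source at 60 degrees heard by a 0.2 m pair on the x axis and
# a 0.2 m pair on the y axis.
tdoa1 = 0.2 * math.cos(math.radians(60)) / SPEED_OF_SOUND
tdoa2 = 0.2 * math.cos(math.radians(30)) / SPEED_OF_SOUND
doa = intersect_bearings(candidate_bearings(tdoa1, 0.2, 0.0),
                         candidate_bearings(tdoa2, 0.2, math.pi / 2))
# doa is ~60 degrees (1.047 rad): the mirror ambiguities cancel out.
```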
Techniques for determining gaps for performing lane change operations are described. A first region in an environment of a vehicle can be determined. The first region can be associated with a first time period through which the vehicle is unable to travel and can correspond to a constraint space. A second region of the environment can be determined. The second region can be associated with a second time period and can correspond to a configuration space. A gap in the environment can be determined based on a portion of the configuration space that is exclusive of the constraint space. A trajectory can be determined based on the gap. The trajectory can be associated with performing a lane change operation and can be associated with a cost. The vehicle can be controlled to perform the lane change operation based at least in part on the trajectory and the cost.
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
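Treating the configuration space and constraint space above as longitudinal intervals, the gap determination reduces to interval subtraction plus a minimum-length check. A hedged one-dimensional sketch (the patent's spaces are presumably spatio-temporal):

```python
def find_gaps(configuration_space, constraint_spaces, min_length):
    """Subtract occupied (constraint) intervals from a drivable
    (configuration) interval and keep gaps long enough for a lane change.
    Intervals are (start, end) longitudinal positions; illustrative only."""
    start, end = configuration_space
    gaps = []
    cursor = start
    for occ_start, occ_end in sorted(constraint_spaces):
        if occ_start > cursor:
            gaps.append((cursor, min(occ_start, end)))
        cursor = max(cursor, occ_end)
    if cursor < end:
        gaps.append((cursor, end))
    return [(a, b) for a, b in gaps if b - a >= min_length]

# Lane is drivable from 0 m to 100 m; two vehicles occupy parts of it.
gaps = find_gaps((0.0, 100.0), [(10.0, 30.0), (35.0, 60.0)], min_length=8.0)
# → [(0.0, 10.0), (60.0, 100.0)]; the 5 m gap at (30, 35) is too short.
```

Each surviving gap would then seed a candidate lane-change trajectory to be scored by the cost mentioned in the abstract.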
A teleoperations system that collaboratively works with an autonomous vehicle guidance system to generate a path for controlling the autonomous vehicle may generate one or more trajectories at the teleoperations system based at least in part on environment data received from the autonomous vehicle and present the one or more trajectories to a teleoperator (e.g., a human user, machine-learned model, or artificial intelligence component). A selection of one of the trajectories may be received at the teleoperations system and transmitted to the autonomous vehicle. The one or more trajectories may be generated at the teleoperations system and/or received from the autonomous vehicle. Regardless, the autonomous vehicle may generate a control trajectory based on the trajectory received from teleoperations, instead of merely implementing the trajectory from the teleoperations system.
Techniques for determining a location of a vehicle in an environment using sensors and determining calibration information associated with the sensors are discussed herein. A vehicle can use map data to traverse an environment. The map data can include semantic map objects such as traffic lights, lane markings, etc. The vehicle can use a sensor, such as an image sensor, to capture sensor data. Semantic map objects can be projected into the sensor data and matched with object(s) in the sensor data. Such semantic objects can be represented as a center point and covariance data. A distance or likelihood associated with the projected semantic map object and the sensed object can be optimized to determine a location of the vehicle. Sensed objects can be determined to be the same based on matching with the semantic map object. Epipolar geometry can be used to determine if sensors are capturing consistent data.
G05D 1/02 - Control of position or course in two dimensions
G08G 1/137 - Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles within the vehicle the indicator being in the form of a map
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
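The projection-and-matching step in the localization abstract above can be sketched with a pinhole projection of a map object into the image and a Mahalanobis-distance gate against a detected center with covariance. The intrinsics, covariance values, and gate threshold below are illustrative assumptions.

```python
def project_point(point_camera, focal, cx, cy):
    """Pinhole projection of a map point given in the camera frame
    (x right, y down, z forward); illustrative intrinsics."""
    x, y, z = point_camera
    return (focal * x / z + cx, focal * y / z + cy)

def mahalanobis_sq(u, v, cov):
    """Squared Mahalanobis distance between projected and detected centers
    for a 2x2 covariance [[sxx, sxy], [sxy, syy]]."""
    dx, dy = u[0] - v[0], u[1] - v[1]
    sxx, sxy, syy = cov[0][0], cov[0][1], cov[1][1]
    det = sxx * syy - sxy * sxy
    # quadratic form with the inverse of the 2x2 covariance
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

projected = project_point((1.0, -2.0, 20.0), focal=1000.0, cx=640.0, cy=360.0)
detected = (692.0, 262.0)            # detection center in the image
cov = [[25.0, 0.0], [0.0, 25.0]]     # detection uncertainty (pixels^2)
d2 = mahalanobis_sq(projected, detected, cov)
matched = d2 < 9.21                  # ~99% chi-square gate, 2 dof
```

Summing such distances (or likelihoods) over many matched semantic objects gives the quantity the abstract says is optimized to recover the vehicle's location.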
Techniques for clustering sensor data are discussed herein. Sensors of a vehicle may detect data points in an environment. Clustering techniques can be used in a vehicle safety system to determine connection information between the data points. The connection information can be used by a vehicle computing device that employs clustering and/or segmenting techniques to detect objects in an environment and/or to control operation of a vehicle.
G05D 1/02 - Control of position or course in two dimensions
B60W 30/095 - Predicting travel path or likelihood of collision
G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
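One simple way to realize the "connection information" between data points described in the clustering abstract above is connected-components clustering over a distance threshold, here via union-find. This is a hedged stand-in, not necessarily the patent's method.

```python
import math

def cluster_points(points, connect_radius):
    """Connected-components clustering: points closer than connect_radius
    are linked, and clusters are the transitive closure of those links."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= connect_radius:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

pts = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.1), (10.0, 10.0), (10.4, 10.2)]
groups = cluster_points(pts, connect_radius=0.8)
# Two clusters: three points near the origin and the pair near (10, 10).
```

Note the transitivity: (0, 0) and (1, 0.1) are farther apart than the radius, yet land in one cluster through the point between them.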
A vehicle computing system may implement techniques to improve collision prediction and avoidance between a vehicle and objects in an environment. A vehicle computing system of the vehicle generates a relevance polygon associated with a planned path of the vehicle based at least in part on a speed associated with the vehicle traveling through the environment. The vehicle computing system identifies objects in the environment and determines whether the objects are located within a boundary of the relevance polygon. Based on a determination that an object is within the boundary, the vehicle computing system determines that the object is relevant to the vehicle and includes data associated therewith in vehicle control planning considerations.
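A toy version of the relevance check above: build a corridor polygon around the planned path whose width grows with speed, then test detected objects against it with ray casting. The width scaling and the assumption of a roughly axis-aligned path are illustrative simplifications.

```python
def relevance_polygon(path, speed, base_width=2.0, width_per_mps=0.3):
    """Corridor polygon around the planned path; width grows with speed.
    Assumes a roughly x-aligned path for simplicity."""
    half = (base_width + width_per_mps * speed) / 2.0
    left = [(x, y + half) for x, y in path]
    right = [(x, y - half) for x, y in reversed(path)]
    return left + right

def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test."""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

poly = relevance_polygon([(0.0, 0.0), (20.0, 0.0)], speed=10.0)
# Half-width = (2.0 + 0.3 * 10) / 2 = 2.5 m at 10 m/s.
relevant = point_in_polygon((10.0, 1.0), poly)    # inside the corridor
irrelevant = point_in_polygon((10.0, 4.0), poly)  # outside
```

Objects failing the test would simply be excluded from planning, which is the computational saving the abstract targets.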
Sensors, including time-of-flight sensors, may be used to detect objects in an environment. In an example, a vehicle may include a time-of-flight sensor that images objects around the vehicle, e.g., so the vehicle can navigate relative to the objects. The sensor may generate first image data at a first configuration and second image data at a second configuration. An estimated depth of an object may be determined from the first image data, and an actual depth of the object may be determined from the second image data, based on the estimated depth. In examples, the first and second configurations have different modulation frequencies such that a nominal maximum depth in the first configuration is greater than the nominal maximum depth in the second configuration.
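The two-configuration scheme above follows standard time-of-flight physics: the nominal maximum (unambiguous) depth is c / (2 f_mod), so a coarse estimate from a low-frequency capture can pick the correct wrap count for a precise high-frequency capture. The specific frequencies below are illustrative.

```python
C = 299_792_458.0  # speed of light, m/s

def unambiguous_range(mod_freq_hz):
    """A ToF sensor's nominal maximum depth is c / (2 * f_mod)."""
    return C / (2.0 * mod_freq_hz)

def unwrap_depth(wrapped_depth, estimate, mod_freq_hz):
    """Use a coarse estimate (e.g., from a lower-frequency capture) to pick
    the right wrap count for a precise higher-frequency measurement."""
    r = unambiguous_range(mod_freq_hz)
    n = round((estimate - wrapped_depth) / r)
    return wrapped_depth + n * r

f_low, f_high = 10e6, 40e6   # illustrative modulation frequencies
# unambiguous_range(10e6) ~ 15 m; unambiguous_range(40e6) ~ 3.75 m
true_depth = 9.0
wrapped = true_depth % unambiguous_range(f_high)  # what the 40 MHz capture sees
estimate = 9.2                                    # coarse 10 MHz estimate
depth = unwrap_depth(wrapped, estimate, f_high)
# depth recovers ~9.0 m despite the 3.75 m wrap of the high-frequency capture
```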
Techniques for providing a user interface for remote vehicle monitoring and/or control include presenting a digital representation of an environment and a vehicle as it traverses the environment on a first portion of a display and presenting on a second portion of the display a communication interface that is configured to provide communication with multiple people. The communication interface may enable communications between a remote operator and any number of occupants of the vehicle, other operators (e.g., other remote operators or in-vehicle operators), and/or people in an environment around the vehicle. The user interface may additionally include controls to adjust components of the vehicle, and the controls may be presented on a third portion of the display. Furthermore, the user interface may include a vehicle status interface that provides information associated with a current state of the vehicle.
Techniques for predicting and avoiding collisions with objects detected in an environment based on sensor data are discussed herein. Sensors of a vehicle may detect one or more objects in the environment. A model may determine intersection values indicative of probabilities that the object will follow different paths that intersect with a planned path of the vehicle. A vehicle may receive one or more intersection values from a model usable by a computing device to control the vehicle.
Techniques for controlling a vehicle based on a collision avoidance algorithm are discussed herein. The vehicle receives sensor data and can determine that the sensor data represents an object in an environment through which the vehicle is travelling. A computing device associated with the vehicle determines a collision probability between the vehicle and the object at predicted locations of the vehicle and object at a first time. Updated locations of the vehicle and object can be determined, and a second collision probability can be determined. The vehicle is controlled based at least in part on the collision probabilities.
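One hedged way to realize the collision-probability computation above is a Monte Carlo check: sample both predicted positions from their uncertainty distributions and count near misses. The isotropic Gaussians and radius threshold are stand-in assumptions for whatever model the patent's system uses.

```python
import math
import random

def collision_probability(ego_mean, ego_sigma, obj_mean, obj_sigma,
                          collision_radius, samples=20000, seed=0):
    """Monte Carlo estimate: sample predicted positions from isotropic
    Gaussians and count how often they come within collision_radius."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        ex = rng.gauss(ego_mean[0], ego_sigma)
        ey = rng.gauss(ego_mean[1], ego_sigma)
        ox = rng.gauss(obj_mean[0], obj_sigma)
        oy = rng.gauss(obj_mean[1], obj_sigma)
        if math.hypot(ex - ox, ey - oy) <= collision_radius:
            hits += 1
    return hits / samples

p_near = collision_probability((0, 0), 0.5, (1.0, 0.0), 0.5, collision_radius=2.0)
p_far = collision_probability((0, 0), 0.5, (20.0, 0.0), 0.5, collision_radius=2.0)
# p_near is high and p_far is ~0; re-evaluating at updated predicted
# locations refines the estimate, as in the abstract.
```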
Techniques for estimating ground height based on lidar data are discussed herein. A vehicle captures lidar data as it traverses an environment. The lidar data can be associated with a voxel space as three-dimensional data. Semantic information can be determined and associated with the lidar data and/or the three-dimensional voxel space. A multi-channel input image can be determined based on the three-dimensional voxel space and input into a machine learned (ML) model. The ML model can output data to determine height data and/or classification data associated with a ground surface of the environment. The height data and/or classification data can be utilized to determine a mesh associated with the ground surface. The mesh can be used to control the vehicle and/or determine additional objects proximate the vehicle.
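The "multi-channel input image" step above can be sketched by collapsing 3D points into a top-down grid with one channel per statistic; a real pipeline would add semantic channels and feed the result to the ML model. Shapes and cell sizes are illustrative.

```python
def lidar_to_bev_channels(points, grid_size=4, cell=1.0):
    """Collapse 3D lidar points into a multi-channel top-down grid:
    channel 0 = point count per cell, channel 1 = max height per cell."""
    counts = [[0] * grid_size for _ in range(grid_size)]
    max_z = [[0.0] * grid_size for _ in range(grid_size)]
    for x, y, z in points:
        i, j = int(x // cell), int(y // cell)
        if 0 <= i < grid_size and 0 <= j < grid_size:
            counts[i][j] += 1
            max_z[i][j] = max(max_z[i][j], z)
    return counts, max_z

pts = [(0.2, 0.3, 0.1), (0.7, 0.4, 0.3), (2.5, 2.5, 1.8)]
counts, max_z = lidar_to_bev_channels(pts)
# counts[0][0] == 2 (two low ground hits); max_z[2][2] == 1.8 (a tall return)
```

Stacking such per-cell channels is what turns the voxel space into an image a convolutional model can consume.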
Four-wheel steering of a vehicle, e.g., in which leading wheels and trailing wheels are steered independently of each other, can provide improved maneuverability and stability. A first vehicle model may be used to determine trajectories for execution by a vehicle equipped with four-wheel steering. A second vehicle model may be used to control the vehicle relative to the determined trajectories. For instance, the second vehicle model can determine leading wheels steering angles for steering leading wheels of the vehicle and trailing wheels steering angles for steering trailing wheels of the vehicle, independently of the leading wheels.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60W 30/095 - Predicting travel path or likelihood of collision
B62D 7/10 - Steering linkage; Stub axles or their mountings for individually-pivoted wheels, e.g. on king-pins the pivotal axes being situated in a single plane transverse to the longitudinal centre line of the vehicle with single-output steering gear
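A kinematic bicycle model extended with rear steering illustrates the second vehicle model above: with equal leading and trailing angles the vehicle crabs sideways without yawing, while opposite angles roughly double the yaw rate. Parameter names and values are assumptions for illustration.

```python
import math

def fws_kinematics(x, y, heading, v, delta_f, delta_r, lf, lr, dt):
    """Kinematic bicycle model with independent front/rear steering.
    delta_f / delta_r are leading / trailing wheel angles (rad); lf / lr
    are distances from the center to the front / rear axles."""
    beta = math.atan((lf * math.tan(delta_r) + lr * math.tan(delta_f)) / (lf + lr))
    x += v * math.cos(heading + beta) * dt
    y += v * math.sin(heading + beta) * dt
    heading += (v * math.cos(beta)
                * (math.tan(delta_f) - math.tan(delta_r)) / (lf + lr) * dt)
    return x, y, heading

# Crab mode: equal angles translate the vehicle diagonally without yawing.
_, y_crab, h_crab = fws_kinematics(0, 0, 0, v=5.0, delta_f=0.2, delta_r=0.2,
                                   lf=1.5, lr=1.5, dt=1.0)
# Counter-phase: opposite angles increase the yaw rate vs front-only steering.
_, _, h_counter = fws_kinematics(0, 0, 0, v=5.0, delta_f=0.2, delta_r=-0.2,
                                 lf=1.5, lr=1.5, dt=1.0)
```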
Simulating realistic movement of an object, such as a vehicle or pedestrian, that accounts for unusual behavior may comprise generating an agent behavior model based at least in part on output of a perception component of an autonomous vehicle and determining a difference between the output and log data that includes indications of an actual maneuver or location of an object. Simulating movement of an object may comprise determining predicted motion of the object using the perception component and modifying the predicted motion based at least in part on the agent behavior model.
A trajectory for a vehicle can be generated using a lateral offset bias. The vehicle, such as an autonomous vehicle (AV), may be directed to follow a reference trajectory through an environment. The AV may determine a segment associated with the reference trajectory based on curvatures of the reference trajectory, determine a lateral offset bias associated with the segment based at least in part on, for example, one or more of a speed or acceleration of the vehicle, and determine a candidate trajectory for the autonomous vehicle based at least in part on the lateral offset bias. The candidate trajectory may then be used to control the autonomous vehicle.
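A minimal sketch of applying a lateral offset bias to a reference segment: compute an offset from curvature and speed, then shift each reference pose along its left normal. The scaling law and clamp below are illustrative assumptions, not the patent's actual cost terms.

```python
import math

def lateral_offset_bias(curvature, speed, gain=0.5, max_offset=0.6):
    """Bias away from the inside of a curve, scaled by speed and clamped;
    the sign convention and gain are illustrative."""
    offset = -gain * curvature * speed
    return max(-max_offset, min(max_offset, offset))

def apply_offset(reference, offset):
    """Shift each reference pose laterally along the left normal of its
    heading to form a candidate trajectory."""
    out = []
    for x, y, heading in reference:
        nx, ny = -math.sin(heading), math.cos(heading)  # left normal
        out.append((x + offset * nx, y + offset * ny, heading))
    return out

ref = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (10.0, 0.0, 0.0)]  # straight segment
candidate = apply_offset(ref, lateral_offset_bias(curvature=0.05, speed=8.0))
# Each candidate pose is shifted 0.2 m to the right of the reference.
```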
A vehicle safety system within an autonomous or semi-autonomous vehicle may predict and avoid collisions between the vehicle and other moving objects in the environment. The vehicle safety system may determine one or more perturbed trajectories for another object in the environment, for example, by perturbing the state parameters of a perceived trajectory associated with the object. Each perturbed trajectory may be evaluated to determine whether it intersects or potentially collides with the planned trajectory of the vehicle. In some examples, the vehicle safety system may aggregate the results of analyses of multiple perturbed trajectories to determine a collision probability and/or additional weights or adjustment factors associated with the collision prediction, and may determine actions for the vehicle to take based on the collision predictions and probabilities.
B60W 30/095 - Predicting travel path or likelihood of collision
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
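The perturb-and-aggregate scheme above can be sketched by jittering the perceived speed and heading, rolling each variant forward, and counting how many variants pass near the ego path at the same timestep. The constant-velocity rollout and noise scales are illustrative assumptions.

```python
import math
import random

def rollout(x, y, heading, speed, steps, dt=0.5):
    """Constant-velocity rollout of a simple object state."""
    pts = []
    for _ in range(steps):
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        pts.append((x, y))
    return pts

def perturbed_collision_probability(obj_state, ego_path, n=500,
                                    speed_sigma=0.5, heading_sigma=0.05,
                                    collision_dist=1.5, seed=0):
    """Perturb the perceived speed/heading, roll each variant out, and
    count variants that pass near the ego path at the same timestep."""
    rng = random.Random(seed)
    x, y, heading, speed = obj_state
    hits = 0
    for _ in range(n):
        s = speed + rng.gauss(0.0, speed_sigma)
        h = heading + rng.gauss(0.0, heading_sigma)
        traj = rollout(x, y, h, s, steps=len(ego_path))
        if any(math.dist(p, q) <= collision_dist for p, q in zip(traj, ego_path)):
            hits += 1
    return hits / n

ego = [(i * 2.5, 0.0) for i in range(1, 11)]   # ego moving along +x
obj = (25.0, -20.0, math.pi / 2, 4.0)          # object heading toward ego path
p = perturbed_collision_probability(obj, ego)
# The nominal rollout meets the ego path exactly; perturbations turn that
# into a collision probability strictly between 0 and 1.
```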
The techniques discussed herein may comprise an autonomous vehicle guidance system that generates a path for controlling an autonomous vehicle based at least in part on a static object map and/or one or more dynamic object maps. The guidance system may identify a path based at least in part on determining a set of nodes and a cost map associated with the static and/or dynamic objects, among other costs, pruning the set of nodes, and creating further nodes from the remaining nodes until a computational or other limit is reached. The path output by the techniques may be associated with the cheapest node of the sets of nodes that were generated.
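The expand-prune-limit loop above can be sketched as best-first node expansion over a grid cost map: pop the cheapest node, create successors, prune the frontier to a beam, and stop at the goal or an expansion limit. The grid abstraction and beam pruning are hedged simplifications of the patent's node sets.

```python
import heapq

def guided_path_search(start, goal, cost_map, max_expansions=1000, beam=50):
    """Best-first node expansion over a grid cost map with pruning and a
    computational limit; returns the path to the goal if found."""
    rows, cols = len(cost_map), len(cost_map[0])
    frontier = [(0.0, start, [start])]
    best_cost = {start: 0.0}
    expansions = 0
    while frontier and expansions < max_expansions:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        expansions += 1
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                new_cost = cost + cost_map[nr][nc]
                if new_cost < best_cost.get((nr, nc), float("inf")):
                    best_cost[(nr, nc)] = new_cost
                    heapq.heappush(frontier,
                                   (new_cost, (nr, nc), path + [(nr, nc)]))
        if len(frontier) > beam:               # prune the node set
            frontier = heapq.nsmallest(beam, frontier)
            heapq.heapify(frontier)
    return None  # limit reached without reaching the goal

grid = [
    [1, 1, 9],  # high costs stand in for static/dynamic object maps
    [9, 1, 9],
    [9, 1, 1],
]
path = guided_path_search((0, 0), (2, 2), grid)
# Follows the cheap cells: (0,0) → (0,1) → (1,1) → (2,1) → (2,2)
```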
Techniques to predict object behavior in an environment are discussed herein. For example, such techniques may include inputting data into a model and receiving an output from the model representing a discretized representation. The discretized representation may be associated with a probability of an object reaching a location in the environment at a future time. A vehicle computing system may determine a trajectory and a weight associated with the trajectory using the discretized representation and the probability. A vehicle, such as an autonomous vehicle, can be controlled to traverse an environment based on the trajectory and the weight output by the vehicle computing system.
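Treating the model's discretized representation as an occupancy-probability heatmap, a candidate trajectory's weight can be sketched as the mean probability of the cells it visits. The grid layout and averaging rule are illustrative assumptions.

```python
def trajectory_weight(heatmap, trajectory, cell_size=1.0):
    """Score a candidate object trajectory by the probability the
    discretized (heatmap) output assigns to each cell it visits;
    the mean over waypoints serves as the trajectory's weight."""
    total = 0.0
    for x, y in trajectory:
        i, j = int(y // cell_size), int(x // cell_size)
        if 0 <= i < len(heatmap) and 0 <= j < len(heatmap[0]):
            total += heatmap[i][j]
    return total / len(trajectory)

# Model output: probability of the object occupying each cell at a future time.
heatmap = [
    [0.0, 0.1, 0.0],
    [0.1, 0.6, 0.1],
    [0.0, 0.1, 0.0],
]
straight = [(1.5, 0.5), (1.5, 1.5), (1.5, 2.5)]  # passes through the 0.6 peak
swerve = [(0.5, 0.5), (0.5, 1.5), (0.5, 2.5)]    # skirts the edge
w_straight = trajectory_weight(heatmap, straight)
w_swerve = trajectory_weight(heatmap, swerve)
# w_straight > w_swerve: the planner weights the more probable path higher.
```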