The technology relates to fine maneuver control of large autonomous vehicles that employ multiple sets of independently actuated wheels. The control is able to optimize the turning radius, effectively negotiate curves and turns, and clear static objects of varying heights. Each wheel or wheel set is configured to adjust individually via control of an on-board computer system. Received sensor data and a physical model of the vehicle can be used for route planning and selecting maneuver operations in accordance with the additional degrees of freedom provided by the independently actuated wheels. This can include making turns, moving into or out of parking spaces, driving along narrow or congested roads, construction zones, loading docks, etc. A given maneuver may include maintaining a minimum threshold distance from a neighboring vehicle or other object.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B62D 7/14 - Steering linkage; Stub axles or their mountings for individually-pivoted wheels, e.g. on king-pins the pivotal axes being situated in more than one plane transverse to the longitudinal centre line of the vehicle, e.g. all-wheel steering
B62D 7/15 - Steering linkage; Stub axles or their mountings for individually-pivoted wheels, e.g. on king-pins the pivotal axes being situated in more than one plane transverse to the longitudinal centre line of the vehicle, e.g. all-wheel steering characterised by means varying the ratio between the steering angles of the steered wheels
2.
Methods for Localizing Light Detection and Ranging (Lidar) Calibration Targets
Example embodiments relate to methods for localizing light detection and ranging (lidar) calibration targets. An example method includes generating a point cloud of a region based on data from a light detection and ranging (lidar) device. The point cloud may include points representing at least a portion of a calibration target. The method also includes determining a presumed location of the calibration target. Further, the method includes identifying, within the point cloud, a location of a first edge of the calibration target. In addition, the method includes performing a comparison between the identified location of the first edge of the calibration target and a hypothetical location of the first edge of the calibration target within the point cloud if the calibration target were positioned at the presumed location. Still further, the method includes revising the presumed location of the calibration target based on at least the comparison.
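The comparison-and-revision step in this abstract can be sketched in one dimension: compare where the target's edge was identified in the point cloud with where that edge would sit if the target were at the presumed location, then shift the presumed location by the offset. A minimal Python sketch; the function names, the 1-D simplification, and the gain parameter are illustrative, not from the patent.

```python
def hypothetical_edge_x(presumed_center_x: float, target_width: float) -> float:
    """Where the target's left edge would sit if the target were at the presumed location."""
    return presumed_center_x - target_width / 2.0

def revise_presumed_location(presumed_center_x: float,
                             identified_edge_x: float,
                             target_width: float,
                             gain: float = 1.0) -> float:
    """Shift the presumed location by the identified-vs-hypothetical edge offset."""
    offset = identified_edge_x - hypothetical_edge_x(presumed_center_x, target_width)
    return presumed_center_x + gain * offset

# A target 1.0 m wide presumed at x = 5.0 m, with its left edge identified at 4.7 m
# (the hypothetical edge would be at 4.5 m, so the presumed location shifts by +0.2 m):
revised = revise_presumed_location(5.0, 4.7, 1.0)
```

In practice this would run per edge and in 3-D, iterating until the presumed and identified edges agree.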
The subject matter of this specification relates to a light detection and ranging (LiDAR) system. In at least one implementation, the LiDAR system comprises a first signal source; a second signal source; a combiner to generate a hybrid transmission signal from signals generated by the first signal source and the second signal source; a first photodetector to measure a first component of a reflection signal related to range of a target; and a second photodetector to measure a second component of the reflection signal related to velocity of the target, wherein the system is configured to derive the range and velocity of the target from the first component and the second component, respectively.
G01S 17/26 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein the transmitted pulses use a frequency-modulated or phase-modulated carrier wave, e.g. for pulse compression of received signals
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
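The two measured components map to range and velocity through standard relations: round-trip delay gives range, and Doppler shift gives radial velocity. A minimal Python sketch of that derivation step; the example delay, Doppler frequency, and 1550 nm wavelength are illustrative assumptions, not values from the patent.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(round_trip_s: float) -> float:
    """Range from the round-trip time of the range-related component."""
    return C * round_trip_s / 2.0

def velocity_from_doppler(doppler_hz: float, wavelength_m: float) -> float:
    """Radial velocity from the Doppler shift of the velocity-related component."""
    return wavelength_m * doppler_hz / 2.0

# 1 us round trip -> ~150 m; 1 MHz Doppler shift at 1550 nm -> ~0.775 m/s
r = range_from_delay(1e-6)
v = velocity_from_doppler(1e6, 1550e-9)
```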
4.
Methods and Systems for Automatic Problematic Maneuver Detection and Adapted Motion Planning
Example embodiments relate to methods and systems for automatic problematic maneuver detection and adapted motion planning. A computing device may obtain a route for navigation by a vehicle and a set of vehicle parameters corresponding to the vehicle. Each vehicle parameter can represent a physical attribute of the vehicle. The computing device may generate a virtual vehicle that represents the vehicle based on the set of vehicle parameters and perform a simulation that involves the virtual vehicle navigating the route. Based on the results of the simulation, the computing device may provide the original route or a modified route to the vehicle for the vehicle to subsequently navigate to its destination. In some cases, the simulation may further factor in additional conditions, such as potential weather and traffic conditions that are likely to occur during the time when the vehicle plans on navigating the route.
Aspects of the disclosure relate to routing an autonomous vehicle. For instance, the vehicle may be maneuvered along a route in a first lane using map information identifying a first plurality of nodes representing locations within the first lane and a second plurality of nodes representing locations within a second lane different from the first lane. While maneuvering, when the vehicle should make a lane change may be determined by assessing a cost of connecting a first node of the first plurality of nodes with a second node of a second plurality of nodes. The assessment may be used to make the lane change from the first lane to the second lane.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
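The lane-change assessment described above can be sketched as a cost over candidate node connections: for each node in the target lane, compute a cost of connecting the vehicle's current node to it, and commit the lane change at the cheapest connection. A minimal Python sketch; the distance-plus-lateral-penalty cost and the example coordinates are illustrative assumptions, not the patent's actual cost function.

```python
import math

def connection_cost(node_a, node_b, lateral_weight=2.0):
    """Toy cost for connecting a first-lane node to a second-lane node:
    Euclidean distance plus a penalty on lateral displacement."""
    dx = node_b[0] - node_a[0]
    dy = node_b[1] - node_a[1]
    return math.hypot(dx, dy) + lateral_weight * abs(dy)

def best_lane_change(current_node, target_lane_nodes):
    """Pick the target-lane node with the lowest connection cost."""
    return min(target_lane_nodes, key=lambda n: connection_cost(current_node, n))

current = (0.0, 0.0)                          # (longitudinal, lateral) in metres
lane2 = [(5.0, 3.5), (15.0, 3.5), (30.0, 3.5)]  # nodes in the adjacent lane
choice = best_lane_change(current, lane2)
```

A real planner would fold in traffic, speed limits, and comfort terms; the structure (node graph, per-connection cost, minimum-cost selection) is the point here.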
The present disclosure relates to limitation of noise on light detectors using an aperture. One example embodiment includes a system. The system includes a lens disposed relative to a scene and configured to focus light from the scene onto a focal plane. The system also includes an aperture defined within an opaque material disposed at the focal plane of the lens. The aperture has a cross-sectional area. In addition, the system includes an array of light detectors disposed on a side of the focal plane opposite the lens and configured to intercept and detect diverging light focused by the lens and transmitted through the aperture. A cross-sectional area of the array of light detectors that intercepts the diverging light is greater than the cross-sectional area of the aperture.
Example embodiments relate to calibration and localization of a light detection and ranging (lidar) device using a previously calibrated and localized lidar device. An example embodiment includes a method. The method includes receiving, by a computing device associated with a second vehicle, a first point cloud captured by a first lidar device of a first vehicle. The first point cloud includes points representing the second vehicle. The method also includes receiving, by the computing device, pose information indicative of a pose of the first vehicle. In addition, the method includes capturing, using a second lidar device of the second vehicle, a second point cloud. Further, the method includes receiving, by the computing device, a third point cloud representing the first vehicle. Yet further, the method includes calibrating and localizing, by the computing device, the second lidar device.
Methods, systems, and apparatus for generation and use of surfel maps to plan for occlusions. One of the methods includes receiving a previously-generated surfel map depicting an area in which a vehicle is located, the surfel map comprising a plurality of surfels, each surfel corresponding to a respective different location in the area in which a vehicle is located; receiving, from one or more sensors, sensor data representing the area in which the vehicle is located; determining, based on the sensor data, that the area in which a vehicle is located includes a dynamic object having a changed shape relative to its representation in the surfel map; and generating an updated path for the vehicle to travel that avoids an occlusion by the changed shape of the dynamic object of a line of sight of one or more sensors to an area of interest.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Aspects of the disclosure provide for the generation of a driving difficulty heat map for autonomous vehicles. For instance, log data generated by a vehicle being driven in a manual driving mode for a segment of a route may be input into a disengage model in order to generate an output identifying a likelihood of a vehicle driving in an autonomous driving mode requiring a disengage from the autonomous driving mode along the segment of the route. The log data may have been collected within a geographic area. A grid for the geographic area may be generated. The grid may include a plurality of cells. The output is assigned to one of the plurality of cells. The plurality of cells and assigned output may be used to generate a driving difficulty heat map for the geographic area.
G01C 5/02 - Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels involving automatic stabilisation of the line of sight
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
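The grid-aggregation step above (assigning disengage-model outputs to cells of a grid over the geographic area) can be sketched directly. A minimal Python sketch; the cell size, the mean-aggregation choice, and the sample values are illustrative assumptions, not from the patent.

```python
def difficulty_heat_map(samples, cell_size=100.0):
    """Aggregate disengage likelihoods into grid cells.

    `samples` are (x, y, likelihood) tuples, where likelihood is the
    disengage model's output for a route segment at (x, y); each cell
    stores the mean likelihood of the samples falling inside it."""
    sums, counts = {}, {}
    for x, y, likelihood in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell] = sums.get(cell, 0.0) + likelihood
        counts[cell] = counts.get(cell, 0) + 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Two segments in one cell (mean 0.3) and one in a neighboring cell (0.9):
heat = difficulty_heat_map([(10, 20, 0.2), (30, 40, 0.4), (150, 20, 0.9)])
```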
10.
PULL-OVER LOCATION SELECTION USING MACHINE LEARNING
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting a pull-over location using machine learning. One of the methods includes obtaining data specifying a target pull-over location for an autonomous vehicle travelling on a roadway. A plurality of candidate pull-over locations in a vicinity of the target pull-over location are identified. For each candidate pull-over location, an input that includes features of the candidate pull-over location is processed using a machine learning model to generate a respective likelihood score representing a predicted likelihood that the candidate pull-over location is an optimal location. The features of the candidate pull-over location include one or more features that compare the candidate pull-over location to the target pull-over location. Using the respective likelihood scores, one of the candidate pull-over locations is selected as an actual pull-over location for the autonomous vehicle.
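The selection loop above reduces to scoring each candidate with the model and taking the highest-scoring one. A minimal Python sketch in which a toy heuristic stands in for the trained machine learning model; the feature names and scoring rule are illustrative assumptions, not from the patent.

```python
def select_pull_over(candidates, score_fn):
    """Score each candidate pull-over location and return the best one."""
    return max(candidates, key=score_fn)

def toy_score(candidate):
    """Stand-in for the learned likelihood model: prefer candidates close
    to the target location that do not block a driveway."""
    score = 1.0 / (1.0 + candidate["distance_to_target_m"])
    if candidate["blocks_driveway"]:
        score *= 0.1
    return score

candidates = [
    {"id": "a", "distance_to_target_m": 5.0, "blocks_driveway": False},
    {"id": "b", "distance_to_target_m": 1.0, "blocks_driveway": True},
    {"id": "c", "distance_to_target_m": 12.0, "blocks_driveway": False},
]
best = select_pull_over(candidates, toy_score)
```

In the described system, `toy_score` would be replaced by the trained model consuming features that compare each candidate to the target location.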
Aspects of the technology relate to assisting a passenger in an autonomous vehicle without a driver. For instance, after a door of the vehicle is opened, a predetermined period of time may be waited by processors of computing devices of the vehicle. After waiting the predetermined period of time and when the vehicle's door remains open, a set of instructions for closing the vehicle's door may be played by the processors through a speaker of the vehicle. Once the door of the vehicle is closed, an announcement may be played by the processors through the speaker requesting that the passenger press a first button to initiate a ride to a destination. In response to the first button being pressed, the ride to the destination may be initiated by the processors by maneuvering the vehicle autonomously to the destination.
B60K 35/28 - characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
The present disclosure relates to limitation of noise on light detectors using an aperture. One example implementation includes a system. The system includes a lens disposed relative to a scene. The lens focuses light from the scene. The system also includes an aperture defined within an opaque material. The system also includes a waveguide having a first side that receives light focused by the lens and transmitted through the aperture. The waveguide guides the received light toward a second side of the waveguide opposite to the first side. The waveguide has a third side extending between the first side and the second side. The system also includes an array of light detectors that intercepts and detects light propagating out of the third side of the waveguide.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G02B 6/08 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings formed by bundles of fibres the relative position of the fibres being the same at both ends, e.g. for transporting images with fibre bundle in form of plate
13.
RESOURCE ALLOCATION FOR AN AUTONOMOUS VEHICLE TRANSPORTATION SERVICE
Aspects of the disclosure relate to generating a model to assess maximum numbers of concurrent trips for an autonomous vehicle transportation service. For instance, historical trip data, including when requests for assistance were made, response times for those requests for assistance, and a number of available resources when each of the requests for assistance was made, may be received. In addition, a number of concurrent trips, or trips that overlap in time, occurring when each of the requests for assistance was made may be received. The model may be trained using the historical trip data and the numbers of concurrent trips. The model may be configured to provide a maximum number of concurrent trips given a period of time, a number of available resources, and a response time requirement.
The technology relates to a system for clearing a sensor cover. The system may comprise a wiper comprising a wiper support, a wiper blade, and a sensor cover. The wiper blade may be configured to clear the sensor cover of debris, and the sensor cover may be configured to house one or more sensors. A wiper motor may rotate the wiper and a sensor motor may rotate the sensor cover. The system wiper blade may comprise a first edge attached to the wiper support and a second edge which may be configured to be in contact with the sensor cover. The wiper blade may extend in a corkscrew shape around the wiper support. The wiper motor may be configured to rotate the wiper in a first direction and the sensor motor may be configured to rotate the sensor cover in a second direction opposite the first direction.
One example system for preventing data loss during memory blackout events comprises a memory device, a sensor, and a controller operably coupled to the memory device and the sensor. The controller is configured to perform one or more operations that coordinate at least one memory blackout event of the memory device and at least one data transmission of the sensor.
Aspects of the technology involve controlling a vehicle configured to operate in an autonomous driving mode. This includes receiving a set of environmental inputs including temperature information from different temperature sources, receiving initial steering information from a steering system of the vehicle, and obtaining an initial rack position command by a motion control module of the vehicle. The system determines, based on the environmental inputs, the initial steering information, and an initial rack position, a likelihood that a steering actuator of the steering system of the vehicle is blocked or likely to become blocked. The system determines whether a threshold excitation amount has been applied to the steering system within a selected amount of time or a selected driving distance. When the threshold amount of excitation is not met, an excitation profile is applied to the steering system in order to modify the initial rack position.
Example embodiments relate to enhanced depth of focus cameras using variable apertures and pixel binning. An example embodiment includes a device. The device includes an image sensor. The image sensor includes an array of light-sensitive pixels and a readout circuit. The device also includes a variable aperture. Additionally, the device includes a controller that is configured to cause: the variable aperture to adjust to a first aperture size when a high-light condition is present, the variable aperture to adjust to a second aperture size when a low-light condition is present, the readout circuit to perform a first level of pixel binning when the high-light condition is present, and the readout circuit to perform a second level of pixel binning when the low-light condition is present. The second aperture size is larger than the first aperture size. The second level of pixel binning is greater than the first level of pixel binning.
H04N 5/347 - Extracting pixel data from an image sensor by controlling scanning circuits, e.g. by modifying the number of pixels having been sampled or to be sampled by combining or binning pixels in SSIS
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
H04N 5/378 - Readout circuits, e.g. correlated double sampling [CDS] circuits, output amplifiers or A/D converters
H04N 25/46 - Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by combining or binning pixels
H04N 25/75 - Circuitry for providing, modifying or processing image signals from the pixel array
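The controller logic above is a two-state configuration: a small aperture with light (or no) binning in high light, and a larger aperture with heavier binning in low light. A minimal Python sketch; the concrete f-numbers and binning factors are illustrative assumptions, not values from the patent.

```python
def configure_sensor(is_low_light: bool) -> dict:
    """Pick aperture size and pixel-binning level per the scheme above:
    the low-light aperture is larger than the high-light aperture, and
    the low-light binning level is greater than the high-light level."""
    if is_low_light:
        return {"aperture": "f/1.8", "binning": 4}  # second (larger) aperture
    return {"aperture": "f/8", "binning": 1}        # first (smaller) aperture

bright = configure_sensor(is_low_light=False)
dark = configure_sensor(is_low_light=True)
```

Binning trades resolution for signal, so pairing it with the larger aperture extends usable depth of focus into low-light scenes.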
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing rare example mining in driving log data. In one aspect, a method includes obtaining a sensor input; processing the sensor input using an encoder neural network to generate one or more feature vectors for the sensor input; processing each of the one or more feature vectors using a density estimation model to generate a density score for the feature vector; and generating a rareness score for each of the one or more feature vectors from the density score. For example, the rareness score can represent a degree to which a classification of an object depicted in the sensor input is rare relative to other objects. As another example, the rareness score can represent a degree to which a predicted behavior of an agent depicted in the sensor input is rare relative to other objects.
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
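The density-to-rareness mapping above can be sketched with a toy kernel density estimate over one-dimensional feature values: low estimated density means the input is unusual relative to the reference data, and a negative log turns that into a rareness score. A minimal Python stand-in for the learned density estimation model; the 1-D features, Gaussian kernel, and bandwidth are illustrative assumptions.

```python
import math

def density_score(feature, reference_features, bandwidth=1.0):
    """Toy Gaussian kernel-density estimate over 1-D feature values
    (a stand-in for the learned density estimation model)."""
    n = len(reference_features)
    total = sum(math.exp(-0.5 * ((feature - r) / bandwidth) ** 2)
                for r in reference_features)
    return total / (n * bandwidth * math.sqrt(2 * math.pi))

def rareness_score(feature, reference_features):
    """Low density -> high rareness; -log density is one common choice."""
    return -math.log(density_score(feature, reference_features) + 1e-12)

# A feature near the reference cluster versus a far-out feature:
common = rareness_score(0.0, [0.0, 0.1, -0.1, 0.05])
rare = rareness_score(8.0, [0.0, 0.1, -0.1, 0.05])
```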
Example embodiments relate to reducing auto-exposure latency. An example embodiment includes a method of reducing auto-exposure latency. The method includes determining, by a processor, a first setting of an exposure parameter for a first frame to be captured by an image sensor. The first setting of the exposure parameter is determined based at least in part on characteristics of a previous frame captured by the image sensor. The first setting of the exposure parameter is determined during a first frame period associated with capturing the first frame. The method also includes initiating, by the processor, a first frame exposure operation based on the first setting of the exposure parameter. During the first frame exposure operation, the image sensor captures the first frame during the first frame period.
An example method includes receiving point cloud information about a field of view of a lidar system. The point cloud information includes spatiotemporal and amplitude information about return light received. The method also includes determining, based on the point cloud information, a set of bright light returns from at least one highly reflective object. The bright light returns include return light having an amplitude above a photon threshold and a corresponding bright light return range. The method yet further includes determining, based on the point cloud information, a set of crosstalk returns. The crosstalk returns include return light having a corresponding crosstalk return range. The method includes adjusting, based on a normalized number of crosstalk returns, at least one of: a cleaning system, an operating mode of a lidar system, or an operating mode of a vehicle.
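The counting-and-thresholding step above can be sketched as follows: bright returns exceed the photon threshold, returns at nearly the same range but below the threshold are treated as candidate crosstalk, and the crosstalk count normalized by the total number of returns drives the response. A heuristic Python sketch; the range-matching rule, tolerances, and thresholds are illustrative assumptions, not the patent's actual criteria.

```python
def crosstalk_fraction(points, photon_threshold, tol_m=0.5):
    """points: (angle_deg, range_m, amplitude) tuples.

    A return counts as candidate crosstalk when its amplitude is at or
    below the photon threshold but its range matches a bright return's
    range within tol_m. Returns the crosstalk count normalized by the
    total number of returns."""
    bright_ranges = [r for (_, r, a) in points if a > photon_threshold]
    crosstalk = [p for p in points
                 if p[2] <= photon_threshold
                 and any(abs(p[1] - br) < tol_m for br in bright_ranges)]
    return len(crosstalk) / max(len(points), 1)

def respond(fraction, threshold=0.2):
    """A normalized crosstalk fraction above the threshold triggers a
    mitigation (here just a label standing in for cleaning-system or
    operating-mode adjustments)."""
    return "engage_cleaning_system" if fraction > threshold else "no_action"

# One very bright return at ~20 m (a retroreflector) and two weak returns
# at nearly the same range (likely crosstalk), plus one unrelated return:
points = [(0, 20.0, 900), (1, 20.1, 30), (2, 20.2, 25), (3, 55.0, 40)]
frac = crosstalk_fraction(points, photon_threshold=500)
action = respond(frac)
```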
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing rare example mining in driving log data. In one aspect, a method includes maintaining a plurality of density estimation models that each correspond to a different rareness type with respect to historical sensor inputs in a driving log generated by sensors on-board a vehicle; receiving a query that references a sensor input; generating, from the sensor input, a corresponding density estimation model input for each of the plurality of density estimation models; processing, using each of the plurality of density estimation models, the corresponding density estimation model input to generate a corresponding density score; generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type; and providing the rareness scores in response to receiving the query.
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
An example system includes a light detection and ranging (LIDAR) device that scans a field-of-view defined by a pointing direction of the LIDAR device. The system also includes an actuator that adjusts the pointing direction of the LIDAR device. The system also includes a communication interface that receives timing information from an external system. The system also includes a controller that causes the actuator to adjust the pointing direction of the LIDAR device based on at least the received timing information.
Aspects of the technology relate to exception handling for a vehicle. For instance, a current trajectory for the vehicle and sensor data corresponding to one or more objects may be received. Based on the received sensor data, projected trajectories of the one or more objects may be determined. Potential collisions with the one or more objects may be determined based on the projected trajectories and the current trajectory. One of the potential collisions that is earliest in time may be identified. Based on the one of the potential collisions, a safety-time-horizon (STH) may be identified. When a runtime exception occurs, the vehicle may wait no longer than the STH for the runtime exception to resolve before performing a precautionary maneuver to avoid a collision.
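The earliest-collision and STH computation above can be sketched over sampled trajectories: find the first time at which the ego vehicle and any object come within a collision radius, then derive the horizon from that time. A minimal Python sketch; the synchronized sampling, collision radius, and safety margin are illustrative assumptions, not from the patent.

```python
import math

def earliest_collision_time(ego_traj, object_trajs, collision_radius=2.0):
    """ego_traj and each object trajectory are lists of (t, x, y) sampled
    at the same timestamps. Returns the earliest time at which ego and any
    object are within collision_radius of each other, or None."""
    earliest = None
    for obj in object_trajs:
        for (t, ex, ey), (_, ox, oy) in zip(ego_traj, obj):
            if math.hypot(ex - ox, ey - oy) < collision_radius:
                if earliest is None or t < earliest:
                    earliest = t
                break
    return earliest

def safety_time_horizon(first_collision_t, margin_s=0.5):
    """STH: how long a runtime exception may be waited out before a
    precautionary maneuver must begin. The margin is illustrative."""
    return max(first_collision_t - margin_s, 0.0)

# Ego heading east, an oncoming car closing in; they pass within 2 m at t = 1 s:
ego = [(0.0, 0, 0), (1.0, 10, 0), (2.0, 20, 0)]
car = [(0.0, 20, 0), (1.0, 11, 0), (2.0, 2, 0)]
t = earliest_collision_time(ego, [car])
```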
The technology relates to determining general weather conditions affecting the roadway around a vehicle, and how such conditions may impact driving and route planning for the vehicle when operating in an autonomous mode. For instance, the on-board sensor system may detect whether the road is generally icy as opposed to a small ice patch on a specific portion of the road surface. The system may also evaluate specific driving actions taken by the vehicle and/or other nearby vehicles. Based on such information, the vehicle's control system is able to use the resultant information to select an appropriate braking level or braking strategy. As a result, the system can detect and respond to different levels of adverse weather conditions. The on-board computer system may share road condition information with nearby vehicles and with remote assistance, so that it may be employed with broader fleet planning operations.
G01S 13/95 - Radar or analogous systems, specially adapted for specific applications for meteorological use
G01S 15/88 - Sonar systems specially adapted for specific applications
G01S 17/95 - Lidar systems, specially adapted for specific applications for meteorological use
G01W 1/02 - Instruments for indicating weather conditions by measuring two or more variables, e.g. humidity, pressure, temperature, cloud cover or wind speed
27.
Methods and Systems for Detecting Adverse Road Conditions using Radar
Example embodiments relate to techniques for detecting adverse road conditions using radar. A computing device may generate a first radar representation that represents a field of view for a radar unit coupled to a vehicle and during clear weather conditions and store the first radar representation in memory. The computing device may receive radar data from the radar unit during navigation of the vehicle on a road and determine a second radar representation based on the radar data. The computing device may also perform a comparison between the first radar representation and the second radar representation and determine a road condition for the road based on the comparison. The road condition may represent a quantity of precipitation located on the road. The computing device may then provide control instructions to the vehicle based on the road condition for the road.
G01S 7/41 - Details of systems using analysis of echo signal for target characterisation; Target signature; Target cross-section
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
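The comparison step above can be sketched as a cell-by-cell difference between the stored clear-weather representation and the live one: water on the road tends to suppress radar backscatter, so a large mean drop in return intensity suggests a wet surface. A toy Python sketch; the flat-grid representation, the mean-drop statistic, and the threshold are illustrative assumptions, not the patent's actual comparison.

```python
def road_condition(baseline_map, current_map, wet_threshold=0.3):
    """Compare a clear-weather radar representation with the live one.

    Both maps are same-length lists of per-cell return intensities; a
    mean intensity drop above wet_threshold is labeled 'wet'."""
    diffs = [b - c for b, c in zip(baseline_map, current_map)]
    mean_drop = sum(diffs) / len(diffs)
    return "wet" if mean_drop > wet_threshold else "dry"

baseline = [1.0, 0.9, 1.1, 1.0]   # stored clear-weather intensities
rainy = [0.4, 0.3, 0.6, 0.5]      # live intensities during rain
cond = road_condition(baseline, rainy)
```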
Example implementations may relate to sun-aware vehicle routing. In particular, a computing system of a vehicle may determine an expected position of the sun relative to a geographic area. Based on the expected position, the computing system may make a determination that travel of the vehicle through certain location(s) within the geographic area is expected to result in the sun being proximate to an object within a field of view of the vehicle's image capture device. Responsively, the computing system may generate a route for the vehicle in the geographic area based at least on the route avoiding travel of the vehicle through these certain location(s), and may then operate the vehicle to travel in accordance with the generated route. Ultimately, this may help reduce or prevent situations where quality of image(s) degrades due to sunlight, which may allow for use of these image(s) as a basis for operating the vehicle.
In one example, a method is provided that includes receiving lidar data obtained by a lidar device. The lidar data includes a plurality of data points indicative of locations of reflections from an environment of the vehicle. The method includes receiving images of portions of the environment captured by a camera at different times. The method also includes determining locations in the images that correspond to a data point of the plurality of data points. Additionally, the method includes determining feature descriptors for the locations of the images and comparing the feature descriptors to determine that sensor data associated with at least one of the lidar device, the camera, or a pose sensor is accurate or inaccurate.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06F 18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for automatically designating traffic scenarios as safety-relevant traffic conflicts between agents in a driving environment. One of the methods includes receiving data representing a traffic scenario involving two agents; computing a safety-relevant metric for a first plurality of time points of the traffic scenario; computing a surprise metric for a second plurality of time points of the traffic scenario; determining that the surprise metric satisfies a surprise threshold within a threshold time window of the safety-relevant metric satisfying a safety-relevant threshold; and in response, designating the traffic scenario as a safety-relevant traffic conflict.
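The designation rule above reduces to a temporal-coincidence check: the scenario is a safety-relevant conflict when the surprise metric satisfies its threshold within a time window of the safety-relevant metric satisfying its own threshold. A minimal Python sketch over the time points at which each metric fired; the window length and example times are illustrative assumptions.

```python
def is_safety_relevant_conflict(safety_hits, surprise_hits, window_s=2.0):
    """safety_hits / surprise_hits: time points (seconds) at which the
    safety-relevant metric and the surprise metric satisfied their
    respective thresholds. Designate a conflict when any surprise hit
    falls within window_s of any safety-relevant hit."""
    return any(abs(s - p) <= window_s
               for s in surprise_hits for p in safety_hits)

# Safety-relevant metric fired at t = 10.0 s; surprise spiked at t = 11.2 s:
flag = is_safety_relevant_conflict([10.0], [11.2])
```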
Aspects of the disclosure relate to providing transportation services with autonomous vehicles. For instance, a first route to a first destination may be determined. The first route may have a first cost. Weather information for the first destination may be received. A characteristic is determined based on the weather information. A second destination having the characteristic may be selected. The second destination may be different from the first destination. A second route to the second destination may be determined. The second route may have a second cost. The first cost may be compared to the second cost, and the vehicle may use the comparison to set one of the first destination or the second destination as a current destination for a vehicle to cause the vehicle to control itself in an autonomous driving mode to the current destination.
The technology relates to identifying and addressing aberrant driver behavior. Various driving operations may be evaluated over different time scales and driving distances. The system can detect driving errors and suboptimal maneuvering, which are evaluated by an onboard driver assistance system and compared against a model of expected driver behavior. The result of this comparison can be used to alert the driver or take immediate corrective driving action. It may also be used for real-time or offline training or sensor calibration purposes. The behavior model may be driver-specific, or may be a nominal driver model based on aggregated information from many drivers. These approaches can be employed with drivers of passenger vehicles, busses, cargo trucks and other vehicles.
The present disclosure relates to systems, vehicles, and methods relating to imaging and object detection using polarization-based detection of infrared light. An example system includes at least one infrared detector configured to detect infrared light corresponding to a target object within a field of view. The infrared light includes at least one of a first polarization or a second polarization. The system also includes a controller configured to carry out operations. The operations include receiving, from the at least one infrared detector, information indicative of infrared light corresponding to the target object. The operations also include determining, based on the received information, a polarization ratio corresponding to the target object. The polarization ratio comprises a first polarization intensity divided by a second polarization intensity. The operations also include determining, based on the polarization ratio, that the infrared light corresponding to the target object comprises direct light or reflected light.
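The polarization-ratio decision above can be sketched directly: divide the first polarization intensity by the second, then classify. Reflection off surfaces tends to polarize infrared light, so a ratio far from unity suggests reflected rather than direct light. A minimal Python sketch; the threshold value and the direction of the classification rule are illustrative assumptions, not from the patent.

```python
def polarization_ratio(intensity_p1: float, intensity_p2: float) -> float:
    """First polarization intensity divided by the second, per the abstract."""
    return intensity_p1 / intensity_p2

def classify_light(ratio: float, reflected_threshold: float = 1.5) -> str:
    """Ratios near 1 suggest weakly polarized (direct) light; strongly
    unbalanced ratios suggest polarization by reflection. Threshold is
    illustrative."""
    return "reflected" if ratio > reflected_threshold else "direct"

# A strongly polarized return (3:1 intensity ratio) classifies as reflected:
label = classify_light(polarization_ratio(3.0, 1.0))
```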
Aspects of the disclosure relate generally to generating and providing route options for an autonomous vehicle. For example, a user may identify a destination, and in response the vehicle's computer may provide routing options to the user. The routing options may be based on typical navigating considerations such as the total travel time, travel distance, fuel economy, etc. Each routing option may include not only an estimated total time, but also information regarding whether and which portions of the route may be maneuvered under the control of the vehicle alone (fully autonomous), a combination of the vehicle and the driver (semiautonomous), or the driver alone. The time of the longest stretch of driving associated with the autonomous mode as well as map information indicating portions of the routes associated with the type of maneuvering control may also be provided.
G01C 21/36 - Input/output arrangements for on-board computers
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
35.
Methods and Systems to Determine a Strategy for a Drop Process Associated with a Light Detection and Ranging (LIDAR) Device
Example implementations may relate to determining a strategy for a drop process associated with a light detection and ranging (LIDAR) device. In particular, the LIDAR device could emit light pulses and detect return light pulses, and could generate a set of data points representative of the detected return light pulses. The drop process could involve a computing system discarding data point(s) of the set and/or preventing emission of light pulse(s) by the LIDAR device. Accordingly, the computing system could detect a trigger to engage in the drop process, and may responsively (i) use information associated with the environment around the vehicle, operation of the vehicle, and/or operation of the LIDAR device as a basis to determine the strategy for the drop process, and (ii) engage in the drop process in accordance with the determined strategy.
Aspects of the disclosure provide a method of facilitating communications from an autonomous vehicle to a user. For instance, a method may include, while attempting to pick up the user and prior to the user entering a vehicle, inputting a current location of the vehicle and map information into a model in order to identify a type of communication action for communicating a location of the vehicle to the user; enabling a first communication based on the type of the communication action; determining, from received sensor data, whether the user has responded to the first communication; and enabling a second communication based on the determination of whether the user has responded to the first communication.
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
B60Q 1/26 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
G08G 1/00 - Traffic control systems for road vehicles
Aspects of the present disclosure relate to a system having a memory, a plurality of self-driving systems for controlling a vehicle, and one or more processors. The processors are configured to receive at least one fallback task in association with a request for a primary task and at least one trigger of each fallback task. Each trigger is a set of conditions that, when satisfied, indicates that the vehicle requires attention for proper operation. The processors are also configured to send instructions to the self-driving systems to execute the primary task and receive status updates from the self-driving systems. The processors are configured to determine that a set of conditions of a trigger is satisfied based on the status updates and send further instructions based on the associated fallback task to the self-driving systems.
G08G 1/00 - Traffic control systems for road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 50/029 - Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
G05D 1/227 - Handing over between remote control and on-board control; Handing over between remote control arrangements
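The trigger/fallback dispatch in the entry above can be sketched as a check over status updates: each trigger is a set of conditions, and when every condition in a trigger's set is satisfied, its fallback task is issued. All names and the status-dictionary encoding here are illustrative, not from the patent.

```python
# Hypothetical sketch of trigger evaluation against self-driving system
# status updates. A trigger fires only when ALL of its conditions hold.

def check_triggers(status: dict, triggers: list) -> list:
    """Return fallback tasks whose trigger conditions are all satisfied.

    status: latest status updates, e.g. {"gps_ok": False, "lidar_ok": True}.
    triggers: list of (conditions, fallback_task), where conditions maps
              a status key to the value indicating a problem.
    """
    fallbacks = []
    for conditions, fallback_task in triggers:
        if all(status.get(key) == value for key, value in conditions.items()):
            fallbacks.append(fallback_task)
    return fallbacks

# Illustrative trigger table: a lone GPS fault requests a pull-over, while
# simultaneous lidar and camera faults request an emergency stop.
triggers = [
    ({"gps_ok": False}, "pull_over"),
    ({"lidar_ok": False, "camera_ok": False}, "emergency_stop"),
]
```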
Aspects of the disclosure relate to controlling a vehicle in an autonomous driving mode where the vehicle has a drive-by-wire braking system. For instance, while the vehicle is being controlled in the autonomous driving mode, a signal corresponding to input at a brake pedal of the drive-by-wire braking system may be received. An amount of braking may be determined based on the received signal. The amount of braking may be used to determine a trajectory for the vehicle to follow. The vehicle may be controlled in the autonomous driving mode using the trajectory.
Example embodiments relate to radar image video compression techniques using per-pixel Doppler measurements. These can involve initially receiving radar data from a radar unit and generating a radar representation of surfaces in the environment. Based on Doppler scores in the radar representation, a range rate can be determined for each pixel that indicates the radial motion of the surface represented by that pixel. The range rates and backscatter values can then be used to estimate a radar representation prediction for subsequent radar data received from the radar unit, which enables generation of a compressed radar data file representing the difference between the prediction and the actual representation determined for the subsequent radar data. The compressed radar data file can be stored in memory, transmitted to other devices, and decompressed and used to train models via machine learning.
G01S 13/89 - Radar or analogous systems, specially adapted for specific applications for mapping or imaging
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 7/41 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group using analysis of echo signal for target characterisation; Target signature; Target cross-section
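The predict-then-store-the-residual idea in the compression entry above can be sketched on a 1-D range profile: advance each return along range by its Doppler-derived range rate, then keep only the difference between that prediction and the actual next frame. Frame shape, time step, and bin size are simplifying assumptions for illustration.

```python
# Minimal sketch of per-pixel Doppler-based radar frame compression.
# Frames are 1-D arrays of backscatter over range bins for simplicity.
import numpy as np

def predict_frame(backscatter, range_rates, dt, range_bin_size):
    """Shift each range bin's energy by its Doppler-derived range rate."""
    predicted = np.zeros_like(backscatter)
    for i, value in enumerate(backscatter):
        shift = int(round(range_rates[i] * dt / range_bin_size))
        j = i + shift
        if 0 <= j < len(backscatter):
            predicted[j] += value
    return predicted

def compress(prev_frame, range_rates, next_frame, dt=0.1, bin_size=1.0):
    """Residual between the Doppler-based prediction and the actual frame."""
    return next_frame - predict_frame(prev_frame, range_rates, dt, bin_size)

def decompress(prev_frame, range_rates, residual, dt=0.1, bin_size=1.0):
    """Reconstruct the actual frame from the prediction plus the residual."""
    return predict_frame(prev_frame, range_rates, dt, bin_size) + residual
```

When motion is well predicted by the range rates, the residual is near zero and compresses far better than the raw frame, which is the point of the technique.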
Aspects of the disclosure relate to repositioning a rooftop sensor of an autonomous vehicle when needed to reduce the overall height of the autonomous vehicle. For instance, while an autonomous vehicle is being controlled in an autonomous driving mode, a low clearance zone may be identified. An activation location may be determined based on the low clearance zone and a current speed of the autonomous vehicle. Once the activation location is reached by the autonomous vehicle, a motor may be caused to reposition the rooftop sensor. In addition, in some instances, after the autonomous vehicle has passed the low clearance zone, the motor may be caused to reposition the rooftop sensor again.
B60W 10/30 - Conjoint control of vehicle sub-units of different type or different function including control of auxiliary equipment, e.g. air-conditioning compressors or oil pumps
B60R 11/00 - Arrangements for holding or mounting articles, not otherwise provided for
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
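The activation-location computation in the rooftop-sensor entry above reduces to simple kinematics: repositioning must begin far enough before the low clearance zone that the retraction finishes in time. The repositioning time and safety margin below are hypothetical values, not from the disclosure.

```python
# Illustrative sketch: choosing how far before a low-clearance zone the
# rooftop sensor repositioning must be triggered, given current speed.

def activation_distance(current_speed_mps: float,
                        repositioning_time_s: float = 3.0,
                        safety_margin_m: float = 5.0) -> float:
    """Distance before the zone at which repositioning must begin (meters).

    The sensor is assumed to need repositioning_time_s to retract fully,
    plus a fixed safety margin; both values are illustrative assumptions.
    """
    return current_speed_mps * repositioning_time_s + safety_margin_m
```

At 10 m/s this sketch triggers repositioning 35 m before the zone; faster travel moves the activation location proportionally earlier.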
A system and method include scanning a light detection and ranging (LIDAR) device through a range of orientations corresponding to a scanning zone while emitting light pulses from the LIDAR device. The method also includes receiving returning light pulses corresponding to the light pulses emitted from the LIDAR device and determining initial point cloud data based on time delays between emitting the light pulses and receiving the corresponding returning light pulses and the orientations of the LIDAR device. The initial point cloud data has an initial angular resolution. The method includes identifying, based on the initial point cloud data, a reflective feature in the scanning zone and determining an enhancement region and an enhanced angular resolution for a subsequent scan to provide a higher spatial resolution in at least a portion of subsequent point cloud data from the subsequent scan corresponding to the reflective feature.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
G01S 7/48 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
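The two-pass scan in the entry above can be sketched in two steps: build initial points from pulse time delays (range = c x delay / 2), then bracket a highly reflective return with a narrower angular window for the next scan. The intensity threshold and window half-width are illustrative assumptions.

```python
# Rough sketch of adaptive lidar resolution: locate a bright (reflective)
# return in the initial scan and define an enhancement region around it.
import math

C = 299_792_458.0  # speed of light, m/s

def point_from_return(angle_rad, delay_s):
    """Convert one return's orientation and round-trip delay to an (x, y) point."""
    r = C * delay_s / 2.0
    return (r * math.cos(angle_rad), r * math.sin(angle_rad))

def enhancement_region(returns, intensity_threshold=0.8, half_width_rad=0.05):
    """returns: list of (angle_rad, delay_s, intensity).

    Picks the brightest return at or above the threshold and returns an
    angular window (lo, hi) to rescan at higher resolution, or None.
    """
    bright = [r for r in returns if r[2] >= intensity_threshold]
    if not bright:
        return None
    angle = max(bright, key=lambda r: r[2])[0]
    return (angle - half_width_rad, angle + half_width_rad)
```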
Aspects of the disclosure provide for the selection of a route for a vehicle having an autonomous driving mode. For instance, an initial location of the vehicle may be identified. This location may be used to determine a set of possible routes to a destination location. A cost for each route of the set is determined by inputting time of day information, map information, and details of that route into one or more models in order to determine whether the vehicle is likely to be stranded along that route, and assessing the cost based at least in part on that determination. One of the routes of the set of possible routes may be selected based on any determined costs. The vehicle may be controlled in the autonomous driving mode using the selected route.
Example embodiments relate to methods and systems for automated generation of radar interference reduction training data for autonomous vehicles. In an example, a computing device causes a radar unit to transmit radar signals in an environment of a vehicle. The computing device may include a model trained based on a labeled interferer dataset that represents interferer signals generated by an emitter located remote from the vehicle. The interferer signals are based on one or more radar signal parameter models. The computing device may use the model to determine whether received electromagnetic energy corresponds to transmitted radar signals or an interferer signal. Based on determining that the electromagnetic energy corresponds to the transmitted radar signals, the computing device may generate a representation of the environment of the vehicle using the electromagnetic energy.
G01S 7/02 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
42 - Scientific, technological and industrial services, research and design
Goods & Services
Research and development into autonomous vehicles; research, design, and development of computer hardware and software for use with autonomous vehicle on-board computers for monitoring and controlling motor vehicle operation; research, design, and development of computer hardware and software for autonomous vehicle coordination, navigation, calibrating, direction, and management; research, design, and development of sensors and structural parts thereof; software as a service (SaaS) services featuring computer software for use as an application programming interface (API) for use in connection with autonomous vehicles; advanced product research, design, and development in the field of artificial intelligence in connection with autonomous vehicles
47.
Methods and Systems for Modifying Power Consumption by an Autonomy System
Example embodiments relate to techniques for modifying power consumption of an autonomy system. For instance, a vehicle autonomy system may use sensor data from vehicle sensors to determine information about the surrounding environment and estimate one or more conditions expected for a threshold duration during subsequent navigation of the path by the vehicle. The autonomy system can then adjust operation of one or more of its components (sensors, compute cores, actuators) based on the one or more conditions expected for the threshold duration and power consumption data corresponding to the components. The vehicle can then be controlled based on subsequent sensor data obtained after adjusting operation of the components of the autonomy system, thereby increasing the efficiency of the autonomy system in accordance with the vehicle's surrounding environment.
B60W 50/04 - Monitoring the functioning of the control system
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
48.
Methods and Systems for Detecting and Mitigating Automotive Radar Interference
Example embodiments relate to techniques that involve detecting and mitigating automotive radar interference. Electromagnetic signals propagating in the environment can be received by a radar unit that limits the signals received to a particular angle of arrival and, through its reception antennas, to a particular polarization. Filters can be applied to the signals to remove portions that are outside an expected time range and an expected frequency range that depend on radar signal transmission parameters used by the radar unit. In addition, a model representing an expected electromagnetic signal digital representation can be used to remove portions of the signals that are indicative of spikes and plateaus associated with signal interference. A computing device can then generate an environment representation that indicates positions of surfaces relative to the vehicle using the remaining portions of the signals.
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 7/02 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
G01S 13/89 - Radar or analogous systems, specially adapted for specific applications for mapping or imaging
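The filtering stage described in the interference-mitigation entry above can be sketched as two passes: gate received samples to the expected time/frequency window implied by the transmit parameters, then suppress amplitude spikes. The median-based spike test here is a simple stand-in for the learned interference model; all parameters are illustrative.

```python
# Hedged sketch of time/frequency gating plus spike suppression for
# received radar samples. Each sample is (time_s, freq_hz, amplitude).
import statistics

def gate_and_despike(samples, t_window, f_window, spike_factor=5.0):
    """Keep samples inside the expected time and frequency windows, then
    drop samples whose amplitude spikes far above the local median.

    t_window, f_window: (low, high) ranges derived from the radar unit's
    transmission parameters (assumed known here).
    """
    gated = [s for s in samples
             if t_window[0] <= s[0] <= t_window[1]
             and f_window[0] <= s[1] <= f_window[1]]
    if not gated:
        return []
    med = statistics.median(abs(s[2]) for s in gated)
    # Spikes well above the median amplitude are treated as interference.
    return [s for s in gated if abs(s[2]) <= spike_factor * med]
```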
49.
LIGHT DETECTION AND RANGING (LIDAR) DEVICES HAVING VERTICAL-CAVITY SURFACE-EMITTING LASER (VCSEL) EMITTERS
Example embodiments relate to light detection and ranging (lidar) devices having vertical-cavity surface-emitting laser (VCSEL) emitters. An example lidar device includes an array of individually addressable VCSELs configured to emit light pulses into an environment surrounding the lidar device. The lidar device also includes a firing circuit configured to selectively fire the individually addressable VCSELs in the array. In addition, the lidar device includes a controller configured to control the firing circuit using a control signal. Further, the lidar device includes a plurality of detectors. Each detector in the plurality of detectors is configured to detect reflections of light pulses that are emitted by one or more individually addressable VCSELs in the array and reflected by one or more objects in the environment surrounding the lidar device.
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
H01S 5/183 - Surface-emitting [SE] lasers, e.g. having both horizontal and vertical cavities having only vertical cavities, e.g. vertical cavity surface-emitting lasers [VCSEL]
Aspects of the disclosure provide for evaluation of a planned trajectory for an autonomous vehicle. For instance, for each of a plurality of objects, a predicted trajectory may be received. The planned trajectory may identify locations and times that the vehicle will be at those locations. For each of the plurality of objects, a grid including a plurality of cells may be generated. Occupancy of each grid for each of the plurality of objects may be determined based on the predicted trajectories. A cell of each grid which will be occupied by the vehicle at a location and time of the planned trajectory may be identified. The planned trajectory may be evaluated based on whether any identified cell is occupied by any of the plurality of objects at the time.
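The grid-based evaluation above can be sketched concretely: each object gets a time-indexed grid marked with the cells its predicted trajectory may occupy, and the planned trajectory is flagged if the vehicle's cell at time t is marked in any object's grid. Grid geometry and cell size are illustrative assumptions.

```python
# Simplified sketch of occupancy-grid trajectory evaluation. Trajectories
# are lists of (x, y) positions, one per discrete time step.

def build_grid(predicted_positions, cell_size=1.0):
    """Map time step -> set of occupied (col, row) cells for one object."""
    grid = {}
    for t, (x, y) in enumerate(predicted_positions):
        cell = (int(x // cell_size), int(y // cell_size))
        grid.setdefault(t, set()).add(cell)
    return grid

def trajectory_is_clear(planned_positions, object_grids, cell_size=1.0):
    """True if the vehicle's planned cell is never occupied by any object
    at the same time step."""
    for t, (x, y) in enumerate(planned_positions):
        cell = (int(x // cell_size), int(y // cell_size))
        for grid in object_grids:
            if cell in grid.get(t, set()):
                return False
    return True
```

A planner could score candidate trajectories with `trajectory_is_clear` and keep only those that never coincide in space and time with a predicted object cell.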
A system is provided that includes an image sensor coupled to a vehicle, and control circuitry configured to perform operations including receiving, from the image sensor, an input stream comprising high dynamic range (HDR) image data associated with an environment of the vehicle, and processing the input stream at the vehicle by applying a global tone mapping, followed by offline image processing that can include applying a local tone mapping to the globally tone mapped images of the same input stream.
A vehicle configured to operate in an autonomous mode may operate a sensor to determine an environment of the vehicle. The sensor may be configured to obtain sensor data of a sensed portion of the environment. The sensed portion may be defined by at least one sensor parameter. Based on the environment of the vehicle, the vehicle may select at least one parameter value for the at least one sensor parameter such that the sensed portion of the environment corresponds to a region of interest. The vehicle may operate the sensor, using the selected at least one parameter value for the at least one sensor parameter, to obtain sensor data of the region of interest, and control the vehicle in the autonomous mode based on the sensor data of the region of interest.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting simulators for evaluating control software for autonomous vehicles. In one aspect, a method comprises: receiving data specifying a driving scenario in an environment; receiving an actual value of a low-level statistic measuring a corresponding property of the driving scenario; generating simulations of the driving scenario using a simulator; determining, for each simulation, a respective predicted value of the low-level statistic that measures the corresponding property of the simulation; determining, from the respective predicted values for the simulations, a likelihood assigned to the actual value of the low-level statistic by the simulations; and determining, from the likelihood, a low-level metric for the simulator and for the driving scenario that measures a realism of the simulator with respect to the corresponding property of the driving scenario.
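One way to realize the likelihood step above — the distributional choice is an assumption, not specified by the abstract — is to fit a Gaussian to the statistic's predicted values across simulations and score the real-world value under that density: a realistic simulator assigns high likelihood to what actually happened.

```python
# Hedged sketch: likelihood of the actual statistic value under a Gaussian
# fitted to the simulated values (one simple choice of density model).
import math

def gaussian_likelihood(predicted_values, actual_value):
    """Density of actual_value under a Gaussian fit to predicted_values."""
    n = len(predicted_values)
    mean = sum(predicted_values) / n
    var = sum((v - mean) ** 2 for v in predicted_values) / n
    var = max(var, 1e-12)  # guard against a degenerate (zero-variance) fit
    return (math.exp(-(actual_value - mean) ** 2 / (2.0 * var))
            / math.sqrt(2.0 * math.pi * var))
```

An actual value near the center of the simulated distribution scores a higher likelihood, hence a better realism metric, than one far in the tails.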
Aspects of the disclosure relate to testing situational awareness of a test driver tasked with monitoring the driving of a vehicle operating in an autonomous driving mode. For instance, a signal that indicates that the test driver may be distracted may be identified. Based on the signal, it may be determined that a question can be asked of the test driver. A plurality of factors relating to a driving context for the vehicle may be identified. In response to the determination, a question may be generated based on the plurality of factors. The question may be provided to the test driver. Input may be received from the test driver providing an answer to the question.
G09B 7/02 - Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by the student
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for computing a backward looking surprise metric for autonomously driven vehicles. One of the methods includes obtaining first data representing one or more previously predicted states of an agent along one or more predicted trajectories of the agent at a first time step. Second data representing one or more states of the agent at a subsequent time step is obtained. A surprise score is computed from a measure of a difference between the first data computed for the one or more predicted trajectories for the prior time step and the second data computed for the one or more predicted states for the subsequent time step.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
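Under one plausible reading of the surprise-metric entry above (the distance measure is an assumption; the patent only says "a measure of a difference"), the realized state at the later time step is compared against the closest of the earlier predicted states: an agent that matched any prediction scores low, and one that deviated from all of them scores high.

```python
# Toy sketch of a backward-looking surprise score: minimum Euclidean
# distance from the realized state to any previously predicted state.
import math

def surprise_score(predicted_states, actual_state):
    """predicted_states: list of (x, y) points from the earlier time step.
    actual_state: the (x, y) observed at the subsequent time step."""
    return min(math.dist(p, actual_state) for p in predicted_states)
```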
56.
Retroreflector Detection and Avoidance in a LIDAR Device
A light detection and ranging (LIDAR) device includes a light emitter configured to emit light pulses into a field of view and a detector configured to detect light in the field of view. The light emitter emits a first light pulse. The detector detects, during a first measurement period, at least one reflected light pulse that is indicative of reflection by a retroreflector based on a shape of a reflected light pulse, a magnitude of a reflected light pulse, and/or a time separation between two reflected light pulses. In response to detecting the at least one reflected light pulse indicative of reflection by a retroreflector, the light emitter is deactivated for one or more subsequent measurement periods. Additionally, the LIDAR device may inform one or more other LIDAR devices by transmitting to a computing device information indicative of the retroreflector being within the field of view of the light emitter.
Aspects of the disclosure relate to detecting and responding to objects in a vehicle's environment. For example, an object may be identified in a vehicle's environment, the object having a heading and location. A set of possible actions for the object may be generated using map information describing the vehicle's environment and the heading and location of the object. A set of possible future trajectories of the object may be generated based on the set of possible actions. A likelihood value of each trajectory of the set of possible future trajectories may be determined based on contextual information including a status of the detected object. A final future trajectory is determined based on the determined likelihood value for each trajectory of the set of possible future trajectories. The vehicle is then maneuvered in order to avoid the final future trajectory and the object.
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for tracking objects in an environment across time. In one aspect, a method comprises: receiving a set of current object detections, each characterizing features of a respective detected object; maintaining data, including track query feature representations, that identifies one or more object tracks, each associated with respective earlier object detections classified as characterizing the same object; and, for each object track: (i) selecting a subset of the current object detections as candidate object detections for the object track, (ii) generating a respective association score for each candidate object detection based on an input derived from the candidate object detections and the track query feature representation for the object track using a track-detection interaction neural network, and (iii) determining whether to associate any of the current object detections with the object track based on the respective association scores.
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
59.
TRAJECTORY PREDICTION BY SAMPLING SEQUENCES OF DISCRETE MOTION TOKENS
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating trajectory predictions for one or more agents in an environment. In one aspect, a method comprises: obtaining scene context data characterizing a scene in an environment at a current time point and generating a respective predicted future trajectory for each of a plurality of agents in the scene at the current time point by sampling a sequence of discrete motion tokens that defines a joint future trajectory for the plurality of agents using a trajectory prediction neural network that is conditioned on the scene context data.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Methods and systems for the use of detected objects for image processing are described. A computing device autonomously controlling a vehicle may receive images of the environment surrounding the vehicle from an image-capture device coupled to the vehicle. In order to process the images, the computing device may receive information indicating characteristics of objects in the images from one or more sources coupled to the vehicle. Examples of sources may include RADAR, LIDAR, a map, sensors, a global positioning system (GPS), or other cameras. The computing device may use the information indicating characteristics of the objects to process received images, including determining the approximate locations of objects within the images. Further, while processing the image, the computing device may use information from the sources to determine portions of the image to focus upon, which may allow the computing device to determine a control strategy based on those portions of the image.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
G01S 13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
G01S 17/02 - Systems using the reflection of electromagnetic waves other than radio waves
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
61.
AUTOMATIC GENERATING OF BLOCKAGES IN MAP INFORMATION FOR USE BY A FLEET OF AUTONOMOUS VEHICLES
Aspects of the disclosure provide methods of generating blockages in map information for use by a fleet of autonomous vehicles. For instance, information from an autonomous vehicle of the fleet may be received by one or more processors of a server computing system. It may be determined that the received information meets at least one set of rules for generating blockages. Based on that determination, blockage data identifying a specific area of a blockage may be generated. The blockage data may be sent to the autonomous vehicles of the fleet in order to enable the autonomous vehicles of the fleet to avoid the specific area of the blockage.
The present disclosure relates to optical systems and vehicles, which may incorporate lidar sensors. An example optical system includes a light-emitter device configured to emit emission light. The optical system also includes an optical element including a first surface and an opposing second surface. The first surface includes a diffusing surface configured to diffuse the emission light to form diffused light. The second surface includes a focusing surface configured to focus the diffused light to provide an intensity profile of light emitted within a field of view of the optical system.
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G02B 1/04 - Optical elements characterised by the material of which they are made; Optical coatings for optical elements made of organic materials, e.g. plastics
Examples relate to near-field radar filters that can enhance measurements near a radar unit. An example may involve receiving a first set of radar reflection signals at a radar unit coupled to a vehicle and determining a filter configured to offset near-field effects of radar reflection signals received at the radar unit. In some instances, the filter depends on an azimuth angle and a distance for surfaces in the environment causing the first set of radar reflection signals. The example may also involve receiving, at the radar unit, a second set of radar reflection signals and determining, using the filter, an azimuth angle and a distance for surfaces in the environment causing the second set of radar reflection signals. The vehicle may be controlled based in part on the azimuth angle and the distance for the surfaces causing the second set of radar reflection signals.
Aspects of the disclosure relate to enabling playing of content at an autonomous vehicle. For example, a request to transport a user on a trip may be received. The autonomous vehicle may be assigned to the trip. Whether the user has enabled a content feature may be determined. In response to determining that the user has enabled the content feature, a request for a device identifier is sent to the autonomous vehicle. The device identifier generated at the autonomous vehicle is received. The received device identifier may be sent to a content-enabling computing system including one or more processors in order to enable the user to play content from the client computing device at the autonomous vehicle during the trip.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
G06Q 10/02 - Reservations, e.g. for tickets, services or events
G06Q 50/34 - Betting or bookmaking, e.g. Internet betting
The present disclosure relates to optical transmitter modules, lidar systems, and methods of their manufacture. An example optical transmitter module includes a transparent substrate and a plurality of wires disposed along the transparent substrate. The optical transmitter module includes driver circuitry electrically coupled to at least a portion of the plurality of wires and one or more light-emitter devices electrically coupled to at least a portion of the plurality of wires. The light-emitter device(s) are configured to emit light pulses. The optical transmitter module also includes a fast axis collimation lens disposed along the transparent substrate. The fast axis collimation lens is configured to collimate the light pulses so as to provide collimated light. The optical transmitter module also includes one or more waveguide structures disposed along the transparent substrate within an optical region. The optical transmitter module also includes a lid configured to provide a sealed interior volume.
Aspects and implementations of the present disclosure relate to performance and safety improvements for autonomous trucking systems, such as reactive suspensions for maximizing aerodynamic performance and minimizing mechanical impact from road imperfections, automated placement of emergency signaling devices, and techniques of enhanced illumination of stopped and stranded vehicles.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60G 17/016 - Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, e.g. due to speed or load, the regulating means comprising electric or electronic elements characterised by their responsiveness, when the vehicle is travelling, to specific motion, a specific condition, or driver input
B60Q 1/04 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
B60Q 1/30 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating rear of vehicle, e.g. by means of reflecting surfaces
B60Q 1/34 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating change of drive direction
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60W 40/12 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to parameters of the vehicle itself
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B62D 35/00 - Vehicle bodies characterised by streamlining
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 15/931 - Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G09F 7/00 - Signs, name or number plates, letters, numerals, or symbols; Panels or boards
A system and method are provided for early detection of objects by a perception system of a vehicle, and for triggering a precautionary action by the vehicle in response without waiting for a more precise detection. The vehicle has a multi-level sensor range, wherein a first level of the sensor range lies adjacent to the outer bound of the sensor range and has a first confidence value, and a second level of the sensor range lies within the first level and has a second, higher confidence value. In situations where oncoming traffic is traveling at a high rate of speed, the vehicle responds to noisier detections, or objects perceived with a lower degree of confidence, rather than waiting for verification that may come too late.
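The two-level response logic can be illustrated with a toy decision function. The range limits, speed threshold, and response labels below are invented for illustration; the abstract specifies only that a low-confidence outer detection can trigger a precautionary action when traffic closes quickly.

```python
def choose_response(distance_m: float, closing_speed_mps: float,
                    outer_range_m: float = 200.0,
                    inner_range_m: float = 120.0) -> str:
    """Map a detection's range and closing speed to a vehicle response.
    All thresholds are hypothetical placeholders."""
    if distance_m > outer_range_m:
        return "none"                      # beyond the sensor range
    if distance_m > inner_range_m:
        # First level: noisy, low-confidence detection. Act early only
        # when the object is closing fast enough that waiting is risky.
        return "precaution" if closing_speed_mps > 25.0 else "monitor"
    return "confirmed-response"            # second, higher-confidence level
```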
Example embodiments relate to self-supervisory and automatic response techniques and systems. A computing system may use sensor data from an autonomous vehicle sensor to detect an object in the environment of the vehicle as the vehicle navigates a path. The computing system may then determine a detection distance between the object and the sensor responsive to detecting the object. The computing system may then perform a comparison between the detection distance and a baseline detection distance that depends on one or more prior detections of given objects that are in the same classification group as the object. The computing system may then adjust a control strategy for the vehicle based on the comparison.
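The comparison step in this abstract — detection distance versus a class-specific historical baseline — can be sketched as follows. The tolerance factor and strategy names are assumptions; the abstract does not state how the comparison maps to a control adjustment.

```python
def adjust_strategy(detection_distance_m: float,
                    baseline_distance_m: float,
                    tolerance: float = 0.8) -> str:
    """If objects of this classification group are being detected much
    closer than the baseline built from prior detections, the sensor may
    be underperforming, so fall back to a more conservative strategy.
    The 0.8 tolerance is a hypothetical placeholder."""
    if detection_distance_m < tolerance * baseline_distance_m:
        return "conservative"   # e.g. reduce speed, increase following gap
    return "nominal"
```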
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for intervention behavior prediction. One of the methods includes receiving data characterizing a scene that includes a first agent and a second agent in an environment. A confounder prediction input generated from the data is processed using a confounder prediction model to generate a confounder distribution over a plurality of confounder classes for a confounder variable. A plurality of predicted conditional probability distributions is generated, wherein each predicted conditional probability distribution is conditioned on: (i) a planned intervention by the second agent, and (ii) the confounder variable belonging to a corresponding confounder class. An intervention behavior prediction for the first agent is generated based on the plurality of predicted conditional probability distributions and the confounder distribution, wherein the intervention behavior prediction includes a probability distribution over a plurality of possible behaviors for the first agent in reaction to the second agent performing the planned intervention.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
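The final combination step in the intervention-prediction abstract is a marginalization over confounder classes: the behavior distribution is a mixture of the per-class conditional distributions weighted by the confounder distribution. A minimal sketch of that arithmetic (the models that produce the inputs are not shown):

```python
def intervention_prediction(cond_dists: list[dict[str, float]],
                            confounder_dist: list[float]) -> dict[str, float]:
    """P(behavior | intervention) = sum over confounder classes c of
    P(c) * P(behavior | intervention, c). cond_dists[i] is the predicted
    conditional distribution for confounder class i."""
    behaviors = cond_dists[0].keys()
    return {
        b: sum(p_c * cond_dists[i][b] for i, p_c in enumerate(confounder_dist))
        for b in behaviors
    }
```

With two equally likely confounder classes whose conditionals disagree, the mixture lands in between, as expected of a marginal.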
09 - Scientific and electric apparatus and instruments
12 - Land, air and water vehicles; parts of land vehicles
39 - Transport, packaging, storage and travel services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Transportation services, namely, business operations management relating to vehicles; transportation services, namely, tracking, locating, and monitoring of vehicles for commercial purposes; transportation logistics services, namely, arranging the transportation of goods for others; freight logistics management; monitoring, managing, and tracking of transportation of persons and delivery of goods and packages in transit, for business purposes; providing a website featuring information for business management of transportation logistics services, namely, providing business advisory services in the field of planning, coordinating, and tracking the transportation of goods, freight, people, and deliveries; providing a website featuring information for transportation logistics management services, namely, planning and scheduling shipments for users of transportation services; business data analysis; fleet management services in the nature of tracking, locating, and monitoring of fleet vehicles for commercial purposes; advertising services; promoting the goods and services of others; administering discount programs that enable customers to obtain discounts on goods and services; administration of a customer loyalty program which provides discounted ride-hail rides; providing a website featuring information about discount and rewards programs; arranging and conducting incentive rewards programs to incentivize engagement with a ride-hailing platform in order to promote that platform; charitable services, namely, organizing and conducting volunteer programs and community service projects Downloadable software for arranging, engaging, scheduling, managing, obtaining, booking, and coordinating travel, transportation, transportation services, ride-hailing, deliveries, and delivery services; downloadable software for the scheduling and dispatch of motorized vehicles; downloadable software for monitoring, managing, and tracking delivery of goods; downloadable software for 
requesting and ordering delivery services; downloadable software for planning, scheduling, controlling, monitoring, and providing information on transportation of assets and goods; downloadable software for tracking and providing information concerning pick-up and delivery of assets and goods in transit; downloadable software for accessing and providing online grocery and retail store services; downloadable software for providing and managing delivery of consumer goods, food, and groceries; downloadable real-time map software for tracking vehicles, trips, and deliveries; downloadable software for displaying transit routes; downloadable software for providing information on transportation and delivery services; downloadable software featuring information about food, grocery, and consumer products; downloadable software for users to administer, access, monitor, and manage loyalty programs and rewards; downloadable software for earning, tracking, and redeeming loyalty rewards, points, and discounts; downloadable software for issuing, setting up, distributing, and redeeming promotions, coupons, discounts, deals, vouchers, rebates, rewards, incentives, and special offers to customers; computer hardware and recorded software for the autonomous driving of motor vehicles; downloadable software in the nature of vehicle operating system software; downloadable software for autonomous vehicle operation, navigation, steering, calibration, and management; downloadable software for vehicle fleet management, namely, tracking fleet vehicles for commercial purposes; downloadable computer software for use as an application programming interface (API); recorded software and computer hardware for vehicle fleet launching, coordination, calibrating, direction, scheduling, booking, dispatching, and management; computer hardware and recorded software for use with accessing and operating vehicle cameras; recorded software utilizing artificial intelligence (AI), machine learning, and deep 
learning for data processing, personalization of autonomous vehicles and their systems, performing predictive analytics, operating autonomous vehicles, their systems, their devices, and their machinery; recorded software for data processing and contextual prediction, personalization of autonomous vehicles and their systems, performing predictive analytics in the fields of artificial intelligence (AI), machine learning, and deep learning; downloadable computer programs and downloadable software for operating autonomous vehicles, their systems, their devices, and their machinery in the fields of artificial intelligence (AI), machine learning, and deep learning; recorded software and computer hardware for operating autonomous vehicles, their systems, their devices, and their machinery; computer hardware and recorded software for operating vehicle cameras; downloadable computer software for use in operating and calibrating lidar; computer hardware for operating autonomous vehicles; navigational instruments for vehicles; laser object detectors for use on vehicles; laser device for sensing objects for use on vehicles; laser device for sensing outdoor terrain for use on vehicles; audio detectors; laser device for sensing distance to objects in the nature of laser rangefinders; electric sensors for determining position, velocity, direction, and acceleration; perimeter sensors in the nature of sensors that measure the presence of objects in the environment and the speed, trajectory, and heading of objects; environmental sensors for measuring the presence of objects in the environment and the speed of objects; vehicle sensors, namely, environmental sensors for measuring the presence of objects in the environment and the speed of objects; sensors for determining position, velocity, direction, and acceleration; vehicle safety equipment and hardware, namely, electronic signal receivers, brake controllers, cameras, and electronic sensors for monitoring vehicle functions, the 
presence of objects in the environment, and the speed, trajectory, and heading of objects; vehicle detection equipment and hardware, namely, cameras, global positioning system (GPS) receivers, and electronic sensors for monitoring vehicle functions, the presence of objects in the environment, and the speed, trajectory, and heading of objects; safety and driving assistant systems comprised of electronic sensors for determining position, velocity, direction, and acceleration of land vehicles; cameras; cameras for use with vehicles; downloadable data sets in the fields of machine perception and autonomous driving technology Self-driving transport vehicles; trucks; freight vehicles, namely, trucks and vans; shared transit land vehicles; freight vehicles in the nature of land vehicles; autonomous land vehicles and structural parts thereof Car rental; truck rental; rental of autonomous vehicles; truck transport; car transport; transport of passengers and goods by land vehicles; transport of persons; transport of goods; delivery of goods; delivery services by road; transportation and delivery services by autonomous vehicles; freight and cargo transportation services; freight transportation; supply chain logistics and reverse logistics services, namely, storage, transportation, and delivery of goods for others by land vehicle or truck; providing autonomous vehicle booking services for transportation purposes; travel arrangement, namely, arranging time-based ride-hailing; transportation services, namely, coordinating the pickup and dropoff of passengers at designated or directed locations for passenger and goods transportation purposes; transportation management services for others, namely, planning, coordinating, and tracking of people and deliveries for transportation purposes; providing transportation information; providing a website featuring information regarding autonomous car transportation and delivery services; providing a website featuring information in the field 
of transportation; providing a website featuring information regarding transportation services and bookings for transportation services; providing a website featuring information regarding delivery services and bookings for delivery services; providing a website featuring information about tracking the transportation of goods, freight, and deliveries; charitable services, namely, transportation and delivery services by road Providing online non-downloadable software for arranging, engaging, scheduling, managing, obtaining, booking, and coordinating travel, transportation, transportation services, ride-hailing, deliveries, and delivery services; providing online non-downloadable software for tracking, locating, and monitoring vehicles; providing online non-downloadable software for coordinating the transport and delivery of goods; providing online non-downloadable software for arranging, procuring, scheduling, engaging, coordinating, managing, and booking transportation and deliveries; providing online non-downloadable software for providing and managing delivery services; providing online non-downloadable software for providing and managing delivery of consumer goods, food, and groceries; providing online non-downloadable software for accessing and viewing transit information, schedules, routes, and prices; providing temporary use of online non-downloadable real-time map software for tracking vehicles and deliveries; providing a website featuring online non-downloadable software that enables users to request transportation; providing temporary use of online non-downloadable computer software for identifying trip delays and vehicle location; providing temporary use of online non-downloadable software for accessing transportation services, bookings for transportation services and dispatching motorized vehicles; providing online non-downloadable software for issuing, setting up, distributing, redeeming, and accessing promotions, coupons, discounts, deals, vouchers, 
rebates, rewards, incentives, and special offers; providing online non-downloadable software for vehicle fleet management; software as a service (SaaS) services featuring computer software for use as an application programming interface (API); providing online non-downloadable software for vehicle coordination, navigation, calibrating, direction, and management of vehicle on-board computers; software as a service (SaaS) services featuring software for vehicle coordination, navigation, calibrating, direction, and management of vehicle on-board computers; providing online non-downloadable software for analyzing the location of deliveries, as well as data related to their transportation; electronic monitoring and reporting of transportation data using computers or sensors; providing online non-downloadable software for the autonomous driving of motor vehicles; providing online non-downloadable software for autonomous vehicle navigation, steering, calibration, and management; providing online non-downloadable software for visualization, manipulation, and integration of digital graphics and images; providing online non-downloadable software for utilizing artificial intelligence (AI), machine learning, and deep learning for data processing and contextual prediction, personalization of autonomous vehicles and their systems, performing predictive analytics, operating autonomous vehicles, their systems, their devices, and their machinery; providing online non-downloadable software for use in operating and calibrating lidar; providing online non-downloadable software used for data analytics in the field of transportation; providing online non-downloadable software used for data analytics in the field of transportation fleet management; providing online non-downloadable open source software for use in data management; land and road surveying; surveying services and data collection and analysis in connection therewith; mapping services; providing online non-downloadable
software for accessing location, GPS, and motion sensor data for safety and emergency response purposes; providing online non-downloadable software for requesting and receiving emergency assistance; providing online non-downloadable software for vehicle safety monitoring and incident detection; providing a website for gaining access to data sets in the fields of machine perception and autonomous driving technology; providing information about autonomous-vehicle and machine-perception research via a website; research, design, and development in the field of artificial intelligence hardware and software; research, design, and development in the field of autonomous hardware and software technologies; research, design, and development in the field of machine motion and object perception, namely, the technological detection of the presence of people or objects and their motion prediction; research, design, and development of computer hardware and software for use with vehicle on-board computers for monitoring and controlling motor vehicle operation; installation, updating, and maintenance of computer software for use with vehicle on-board computers for monitoring and controlling motor vehicle operation; research, design, and development of computer hardware and software for vehicle coordination, navigation, calibrating, direction, and management; research, design, and development of electronic sensors and structural parts thereof; research, design, and development of lasers for sensing objects and distance to objects, lasers for sensing indoor and outdoor terrain, lasers for measuring purposes, laser measuring systems, lidar being light detection and ranging apparatus, and laser hardware equipment; advanced product research, design, and development in the field of artificial intelligence in connection with autonomous vehicles; design and development of computer hardware and software; research, design, and development of vehicle software; technological, scientific
and research services in the field of computer robotics, self-driving car software and hardware, and autonomous vehicle software and hardware; providing virtual computer systems and virtual computer environments through cloud computing for the purpose of training self-driving cars, autonomous vehicles and robots; virtual testing of the functions of self-driving cars, autonomous vehicles and robots using computer simulations; creation, development, programming and implementation of simulation software in the field of self-driving cars, autonomous vehicles and robots; creating simulation computer software programs for autonomous vehicles
Data representing a set of predicted trajectories and a planned trajectory for an autonomous vehicle is obtained. A predictability score for the planned trajectory can be determined based on a comparison of the planned trajectory to the set of predicted trajectories for the autonomous vehicle. The predictability score indicates a level of predictability of the planned trajectory. A determination can be made, based at least on the predictability score, whether to initiate travel with the autonomous vehicle along the planned trajectory. In response to determining to initiate travel with the autonomous vehicle along the planned trajectory, a control system can be directed to maneuver the autonomous vehicle along the planned trajectory.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
G06F 18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
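The predictability-scoring idea above can be sketched with a toy similarity metric. The abstract does not specify how the planned trajectory is compared to the predicted set, so the mean pointwise distance and the 0.5 decision threshold below are illustrative assumptions only.

```python
import math

def predictability_score(planned: list[tuple[float, float]],
                         predicted_set: list[list[tuple[float, float]]]) -> float:
    """Score in (0, 1]: 1.0 when the planned trajectory exactly matches
    some predicted trajectory, falling toward 0 as it diverges."""
    def mean_dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return max(1.0 / (1.0 + mean_dist(planned, pred)) for pred in predicted_set)

def should_initiate(planned, predicted_set, threshold: float = 0.5) -> bool:
    """Initiate travel only when the plan is sufficiently predictable."""
    return predictability_score(planned, predicted_set) >= threshold
```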
Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for predicting future trajectories for an agent in an environment. A system obtains scene context data characterizing the environment. The scene context data includes data that characterizes a trajectory of an agent in a vicinity of a vehicle in an environment up to a current time point. The system identifies a plurality of initial target locations in the environment. The system further generates, for each of a plurality of target locations that each corresponds to one of the initial target locations, a respective predicted likelihood score that represents a likelihood that the target location will be an intended final location for a future trajectory of the agent starting from the current time point. For each target location in a first subset of the target locations, the system generates a predicted future trajectory for the agent that is a prediction of the future trajectory of the agent given that the target location is the intended final location for the future trajectory. The system further selects, as likely future trajectories of the agent starting from the current time point, one or more of the predicted future trajectories.
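The target-conditioned pipeline above — score candidate final locations, keep a subset, then predict one trajectory per kept target — can be sketched as below. The straight-line trajectory generator is a deliberate simplification standing in for the learned model the abstract describes.

```python
def select_targets(target_scores: dict[str, float], k: int = 2) -> list[str]:
    """Keep the k target locations with the highest predicted likelihood
    of being the agent's intended final location."""
    return sorted(target_scores, key=target_scores.get, reverse=True)[:k]

def predict_trajectory(start: tuple[float, float],
                       target: tuple[float, float],
                       steps: int = 4) -> list[tuple[float, float]]:
    """Toy trajectory: linear interpolation from the current position to
    the candidate target (a learned decoder in the actual system)."""
    return [(start[0] + (target[0] - start[0]) * t / steps,
             start[1] + (target[1] - start[1]) * t / steps)
            for t in range(steps + 1)]
```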
Aspects and implementations of the present disclosure address shortcomings of existing technology by enabling autonomous vehicle simulations based on retro-reflection optical data. The subject matter of this specification can be implemented in, among other things, a method that involves initiating a simulation of an environment of an autonomous driving vehicle, the simulation including a plurality of simulated objects, each having an identification of a material type of the respective object. The method can further involve accessing simulated reflection data based on the plurality of simulated objects and retro-reflectivity data for the material types of the simulated objects, and determining, using an autonomous vehicle control system for the autonomous vehicle, a driving path relative to the simulated objects, the driving path based on the simulated reflection data.
G06F 30/20 - Design optimisation, verification or simulation
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
A method and apparatus are provided for determining whether a driving environment has changed relative to previously stored information about the driving environment. The apparatus may include an autonomous driving computer system configured to detect one or more vehicles in the driving environment, and determine corresponding trajectories for those detected vehicles. The autonomous driving computer system may then compare the determined trajectories to an expected trajectory of a hypothetical vehicle in the driving environment. Based on the comparison, the autonomous driving computer system may determine whether the driving environment has changed and/or a probability that the driving environment has changed, relative to the previously stored information about the driving environment.
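The trajectory-comparison step in this abstract can be illustrated with a toy deviation test: if observed vehicle trajectories diverge, on average, from the expected trajectory of a hypothetical vehicle following the stored map, the environment has likely changed. The deviation metric and the 2.0 m threshold are assumptions for illustration.

```python
import math

def environment_changed(observed_trajs: list[list[tuple[float, float]]],
                        expected_traj: list[tuple[float, float]],
                        deviation_threshold: float = 2.0) -> bool:
    """Compare observed trajectories against the expected (map-derived)
    trajectory; flag a change when the average deviation is large."""
    def deviation(traj):
        return sum(math.dist(p, q) for p, q in zip(traj, expected_traj)) / len(traj)
    mean_dev = sum(deviation(t) for t in observed_trajs) / len(observed_trajs)
    return mean_dev > deviation_threshold
```

A probabilistic variant, as the abstract suggests, could return `mean_dev` mapped through a calibrated likelihood instead of a hard boolean.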
This technology relates to a system for cooling sensor components. The cooling system may include a sensor which has a sensor housing, a motor, a main vent, and a side vent. Internal sensor components may be positioned within the sensor housing. The motor may be configured to rotate the sensor housing around an axis. The rotation of the sensor housing may pull air into an interior portion of the sensor housing through the main vent, and the air pulled into the interior portion of the sensor housing may be exhausted out of the interior portion of the sensor housing through the side vent.
A method includes applying, by a switching circuit, pulses of an input voltage to an input of an inductor. The method includes charging, in accordance with an off state of a switch, a charge storage device through the inductor using the pulses of the input voltage such that a circuit node of the charge storage device develops a charge voltage that is greater than the input voltage. The method includes discharging, in accordance with an on state of the switch, the charge storage device such that a first portion of the charge voltage is applied to a light emitter and a second portion of the charge voltage is applied to a parasitic inductance. The method includes controlling, by a controller, a timing of the pulses of the input voltage applied by the switching circuit based on the parasitic inductance from a previous charging cycle of the charge storage device, so as to control the charge voltage.
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/14 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein a voltage or current pulse is initiated and terminated in accordance with the pulse transmission and echo reception respectively, e.g. using counters
H03K 3/017 - Adjustment of width or dutycycle of pulses
H03K 3/53 - Generators characterised by the type of circuit or by the means used for producing pulses by the use of an energy-accumulating element discharged through the load by a switching device controlled by an external signal and not incorporating positive feedback
Example embodiments relate to a method for cut-in identification and classification. An example embodiment includes obtaining operational data about one or more vehicles; based on the operational data, identifying the presence of one or more cut-ins within the operational data; extracting, from the operational data, cut-in data that depicts one or more of the cut-ins identified within the operational data; and, based on the extracted cut-in data, training a model for controlling an autonomous vehicle. Identifying the presence of a given cut-in includes: determining that at least one vertex of a bounding box surrounding an additional vehicle was located more than a threshold distance within a lane being navigated by a given vehicle; and determining that the ability of the given vehicle to maintain its course and speed was impeded by the presence of that additional vehicle within the lane.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
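The bounding-box criterion in the cut-in abstract can be sketched with a simplified one-dimensional lane model: a cut-in candidate is flagged when any box vertex intrudes more than a threshold distance inside the lane boundaries. The lane representation and the 0.5 m default are illustrative assumptions; the impedance check on course and speed is omitted.

```python
def is_cut_in(bbox_vertices: list[tuple[float, float]],
              lane_left_x: float, lane_right_x: float,
              threshold: float = 0.5) -> bool:
    """Flag when any bounding-box vertex (x, y) lies more than `threshold`
    meters inside the lane's lateral boundaries (toy 1-D lane model)."""
    for x, _ in bbox_vertices:
        if lane_left_x + threshold < x < lane_right_x - threshold:
            return True
    return False
```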
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for selecting actions for an agent at a specific real-world location using historical data generated at the same real-world location. One of the methods includes determining a current geolocation of an agent within an environment; obtaining historical data for geolocations in a vicinity of the current geolocation of the agent from a database that maintains historical data for a plurality of geolocations within the environment, the historical data for each geolocation comprising observations generated at least in part from sensor readings of the geolocation captured by vehicles navigating through the environment; generating an embedding of the obtained historical data; and providing the embedding as an input to a policy decision-making system that selects actions to be performed by the agent.
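The embedding step above can be illustrated with the simplest possible pooling: an element-wise mean over fixed-length feature vectors for the nearby geolocations. In practice the embedding would come from a learned encoder; this stand-in only shows the shape of the data flow into the policy system.

```python
def embed_history(observations: list[list[float]]) -> list[float]:
    """Toy pooled embedding of historical observations for geolocations
    near the agent: element-wise mean of equal-length feature vectors."""
    dim = len(observations[0])
    return [sum(o[i] for o in observations) / len(observations)
            for i in range(dim)]
```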
The disclosure relates to assessing operation of a camera. In one instance, a volume of space corresponding to a first vehicle in an environment of a second vehicle may be identified using sensor data generated by a LIDAR system of the second vehicle. An image captured by a camera of the second vehicle may be identified. The camera may have a field of view overlapping that of the LIDAR system at a time when the sensor data was generated. An area of the image corresponding to the volume of space may be identified and processed in order to identify a vehicle light. The operation of the camera may be assessed based on the processing.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
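The camera-assessment idea can be sketched as: project the LIDAR-derived volume into image coordinates, crop that area, and check it for a bright region consistent with a vehicle light. A real system would project through the camera's intrinsic and extrinsic calibration; the trivial linear mapping and brightness test below are stand-in assumptions:

```python
def volume_to_image_area(volume, scale=10):
    """Map a 2D footprint (x0, y0, x1, y1 in metres) to pixel bounds.
    Stand-in for a full 3D-to-image projection via camera calibration."""
    x0, y0, x1, y1 = volume
    return (int(x0 * scale), int(y0 * scale), int(x1 * scale), int(y1 * scale))

def detect_vehicle_light(image, area, brightness_threshold=200):
    """Return True if any pixel in the cropped area exceeds the threshold."""
    u0, v0, u1, v1 = area
    return any(image[v][u] >= brightness_threshold
               for v in range(v0, v1) for u in range(u0, u1))

def assess_camera(image, volume):
    """Camera passes the check if a light appears where LIDAR sees a car."""
    area = volume_to_image_area(volume)
    return "ok" if detect_vehicle_light(image, area) else "suspect"

# 20x20 grayscale image with one bright pixel inside the expected area.
image = [[0] * 20 for _ in range(20)]
image[5][5] = 255
result = assess_camera(image, volume=(0.0, 0.0, 1.0, 1.0))
```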
Example embodiments relate to methods of increasing a temperature of a computer module to start the computer at environmental temperatures below a threshold temperature. An example method includes receiving, at one or more computing components thermally coupled to a main computer via a liquid-cooled plate, a set of program instructions. The method can also include running the set of program instructions on at least one computing component. Running the set of program instructions on the computing component can generate heat that flows to the main computer via the liquid-cooled plate. The method can additionally include detecting, from at least one thermal sensor coupled to the liquid-cooled plate, a temperature reading indicative of a temperature of the main computer. The method can further include determining that the temperature reading has reached a predetermined temperature threshold and based on the temperature reading reaching the predetermined temperature threshold, powering on the main computer.
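The warm-up sequence described above amounts to a simple control loop: run a compute workload on components that share the liquid-cooled plate, poll the thermal sensor, and power on the main computer once the threshold is reached. A minimal sketch with hypothetical callback names and a simulated plate:

```python
def warm_up(read_temperature, run_workload, power_on, threshold_c=5.0,
            max_cycles=100):
    """Run workload cycles until the sensed temperature reaches the
    threshold, then power on the main computer. Returns True on success."""
    for _ in range(max_cycles):
        if read_temperature() >= threshold_c:
            power_on()
            return True
        run_workload()  # heat generated here flows via the cold plate
    return False

# Simulated hardware: each workload cycle raises the plate temperature.
state = {"temp_c": -20.0, "powered": False}
ok = warm_up(
    read_temperature=lambda: state["temp_c"],
    run_workload=lambda: state.__setitem__("temp_c", state["temp_c"] + 2.0),
    power_on=lambda: state.__setitem__("powered", True),
)
```

The `max_cycles` guard stands in for whatever timeout a production system would use if the plate never reaches the threshold.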
Aspects of the disclosure relate to providing sensor data on a display of a vehicle. For instance, data points generated by a lidar sensor may be received. The data points may be representative of one or more objects in an external environment of the vehicle. A scene including a representation of the vehicle from a perspective of a virtual camera, a first virtual object corresponding to at least one of the one or more objects, and a second virtual object corresponding to at least one object identified from pre-stored map information may be generated. Supplemental points corresponding to a surface of the at least one object identified from the pre-stored map information may be generated. A pulse including at least some of the data points generated by the sensor and the supplemental points may be generated. The scene may be displayed with the pulse on the display.
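The display pulse described above combines live lidar returns with supplemental points sampled from map geometry, so the visualization covers surfaces the sensor only partially sees. A rough sketch under assumed data structures (2D points and axis-aligned bounding boxes):

```python
def sample_surface(bbox, step=1.0):
    """Sample supplemental points along one edge of a map object's
    2D bounding box (x0, y0, x1, y1)."""
    x0, y0, x1, _ = bbox
    n = int((x1 - x0) / step) + 1
    return [(x0 + i * step, y0) for i in range(n)]

def build_pulse(lidar_points, map_bboxes):
    """Pulse = sensor-detected points plus supplemental map-derived points."""
    pulse = list(lidar_points)
    for bbox in map_bboxes:
        pulse.extend(sample_surface(bbox))
    return pulse

# Two live lidar returns plus a map object spanning x = 10..12.
pulse = build_pulse([(0.0, 5.0), (0.2, 5.1)], [(10.0, 0.0, 12.0, 2.0)])
```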
A system includes an image sensor having a plurality of pixels that form a plurality of regions of interest (ROIs), image processing resources, and a scheduler configured to perform operations including determining a priority level for a particular ROI of the plurality of ROIs based on a feature detected by one or more of the image processing resources within initial image data associated with the particular ROI. The operations also include selecting, based on the feature detected within the initial image data, a particular image processing resource of the image processing resources by which subsequent image data generated by the particular ROI is to be processed. The operations further include inserting, based on the priority level, the subsequent image data into a processing queue of the particular image processing resource to schedule the subsequent image data for processing by the particular image processing resource.
H04N 25/443 - Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
G06V 10/22 - Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
H04N 23/61 - Control of cameras or camera modules based on recognised objects
H04N 23/80 - Camera processing pipelines; Components thereof
H04N 25/79 - Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
H04N 25/13 - Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
H04N 25/133 - Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing panchromatic light, e.g. filters passing white light
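The scheduler described in this entry can be sketched as a per-resource priority queue: a feature detected in an ROI's initial image data determines both the priority of that ROI's subsequent frames and which resource processes them. The feature-to-priority and feature-to-resource mappings below are illustrative assumptions:

```python
import heapq
import itertools

# Hypothetical mappings; lower number = higher priority.
PRIORITY = {"traffic_light": 0, "pedestrian": 1, "background": 9}
RESOURCE = {"traffic_light": "detector", "pedestrian": "detector",
            "background": "low_power"}

class RoiScheduler:
    def __init__(self):
        self.queues = {}               # resource name -> priority heap
        self._seq = itertools.count()  # tie-breaker for equal priorities

    def schedule(self, roi_id, feature, frame):
        """Queue an ROI's subsequent frame on the resource chosen by the
        detected feature, ordered by the feature's priority level."""
        priority = PRIORITY[feature]
        resource = RESOURCE[feature]
        heap = self.queues.setdefault(resource, [])
        heapq.heappush(heap, (priority, next(self._seq), roi_id, frame))
        return resource

    def next_item(self, resource):
        """Pop the highest-priority queued frame for a resource."""
        return heapq.heappop(self.queues[resource])

sched = RoiScheduler()
sched.schedule("roi_a", "pedestrian", frame="f1")
sched.schedule("roi_b", "traffic_light", frame="f2")
# Although roi_a was queued first, roi_b's traffic-light priority wins.
priority, _, roi, frame = sched.next_item("detector")
```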
Aspects of the disclosure provide for controlling an autonomous vehicle to respond to queuing behaviors at pickup or drop-off locations. As an example, a request to pick up or drop off a passenger at a location may be received. The location may be determined to likely have a queue for picking up and dropping off passengers. Based on sensor data received from a perception system, whether a queue exists at the location may be determined. Once it is determined that a queue exists, it may be determined whether to join the queue to avoid inconveniencing other road users. Based on the determination to join the queue, the vehicle may be controlled to join the queue.
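The queue-handling decision above can be sketched as: treat stopped vehicles lined up in the pickup lane as a queue, and if one exists, target the end of the queue rather than the pickup point itself. The geometry and thresholds below are illustrative assumptions:

```python
def detect_queue(vehicles, pickup, lane_width=3.0, min_length=2):
    """Vehicles are (x, y, is_stopped) tuples from the perception system.
    A queue is at least `min_length` stopped vehicles in the pickup lane,
    ordered by distance from the pickup point."""
    px, py = pickup
    in_lane = [v for v in vehicles if abs(v[1] - py) <= lane_width and v[2]]
    in_lane.sort(key=lambda v: abs(v[0] - px))
    return in_lane if len(in_lane) >= min_length else []

def plan_stop(vehicles, pickup, gap=8.0):
    """Return the target stop point: behind the queue's tail if a queue
    exists, otherwise the requested pickup location."""
    queue = detect_queue(vehicles, pickup)
    if queue:
        tail_x = queue[-1][0]
        return (tail_x + gap, pickup[1])
    return pickup

# Two stopped vehicles in the lane -> join the queue behind the second.
target = plan_stop([(5.0, 0.0, True), (13.0, 0.5, True)], pickup=(0.0, 0.0))
```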
This technology relates to a display mounted messaging system. The display mounted messaging system may include a light emitting diode (LED) display attached to a housing of a sensor. The housing of the sensor may rotate. The display mounted messaging system may also include an LED controller which is configured to selectively activate and deactivate at least one LED in the LED display, to provide a message in the direction of an intended recipient.
G09F 13/04 - Signs, boards, or panels, illuminated from behind the insignia
G09F 9/33 - Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements being semiconductor devices, e.g. diodes
G09F 13/22 - Illuminated signs; Luminous advertising with luminescent surfaces or parts electroluminescent
G09F 13/30 - Illuminated signs; Luminous advertising with moving light sources, e.g. rotating luminous tubes
G09F 21/04 - Mobile visual advertising by land vehicles
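Because the display in this entry is mounted on a rotating sensor housing, directing a message at an intended recipient reduces to activating the LEDs only while the display faces the recipient's bearing. A minimal sketch of that assumed gating behavior:

```python
def angular_difference(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def led_state(display_heading, recipient_bearing, tolerance=30.0):
    """Return True (LEDs active) only while the rotating display faces
    the intended recipient within the angular tolerance."""
    return angular_difference(display_heading, recipient_bearing) <= tolerance

# As the housing sweeps a full rotation in 45-degree steps, the message
# is shown only on the frame facing the recipient at bearing 90 degrees.
frames = [led_state(h, recipient_bearing=90.0) for h in range(0, 360, 45)]
```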
98.
Single Antenna with Dual Circular Polarizations and Quad Feeds for Millimeter Wave Applications
Example embodiments relate to a substrate integrated waveguide (SIW) with dual circular polarizations. An example SIW may include a dielectric substrate and a first metallic layer coupled to a top surface of the dielectric substrate with a through-hole extending through the dielectric substrate and the first metallic layer. The SIW also includes a dielectric layer coupled to a top surface of the first metallic layer. A second metallic layer is coupled to a top surface of the dielectric layer. The second metallic layer includes a non-conductive opening, a plurality of feeds with a first end in the non-conductive opening and a second end including a single-ended termination, and an impedance transformer. The SIW also includes a third metallic layer coupled to a bottom of the dielectric substrate, and a set of metallic via-holes proximate the non-conductive opening and coupling the second metallic layer to the third metallic layer.
H01P 3/16 - Dielectric waveguides, i.e. without a longitudinal conductor
H01Q 1/50 - Structural association of antennas with earthing switches, lead-in devices or lightning protectors
H01Q 21/24 - Combinations of antenna units polarised in different directions for transmitting or receiving circularly and elliptically polarised waves or waves linearly polarised in any direction