LRR of Fig. 3D) may be arranged at different locations within a vehicle to provide notifications, alerts and control options. Information may be dynamically and automatically switched between these displays, as well as a rider's own personal communication device(s) (506 of Fig. 5). What information to present on each screen can depend on factors including how many riders are in the vehicle, their seating within the vehicle (1002 of Fig. 10), how their attention is focused (420 of Fig. 4B) and/or display location and size. Certain information may be mirrored or otherwise duplicated among multiple screens (880 of Fig. 8E) while other information can be presented asymmetrically on different screens (820 of Fig. 8B). Presented information may include a "monologue" from the vehicle explaining why a driving action is taken or not taken, alerts about important conditions (660 of Fig. 6D), buttons to control certain functionality of the vehicle (760 of Fig. 7D), or other information that may be of interest to the rider.
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60K 35/00 - Arrangement or adaptations of instruments
B60K 37/06 - Arrangement of fittings on dashboard of controls, e.g. control knobs
2.
OPTIMIZED MULTICHANNEL OPTICAL SYSTEM FOR LIDAR SENSORS
The subject matter of this specification can be implemented in, among other things, systems and methods of optical sensing that utilize optimized processing of multiple sensing channels for efficient and reliable scanning of environments. The optical sensing system includes multiple optical communication lines that include coupling portions configured to facilitate efficient collection of various received beams. The optical sensing system further includes multiple light detectors configured to process collected beams and produce data representative of a velocity of an object that generated the received beam and/or a distance to that object.
A method for controlling an autonomous vehicle (100) includes using one or more computing devices (320, 330) to transmit a request for a trip. The trip is from a pickup location to a destination location. The method also includes determining the autonomous vehicle for the trip is within a predetermined distance from the pickup location, providing a set of component controls to receive user input at a user interface (324) after the determining. The set of component controls includes interactive controls for identifying or accessing the autonomous vehicle. A first user input is received at the user interface for one or more of the set of component controls, and control instructions for the autonomous vehicle based on the first user input are transmitted.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60W 50/08 - Interaction between the driver and the control system
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
An example method and system for filtering point cloud data includes obtaining point cloud data from a LIDAR device. The point cloud data may include at least a first pulse-length range and a second pulse-length range. The first range may include one or more first-length pulses and the second range may include one or more second-length pulses. The method may further include filtering the point cloud data by determining respective magnitudes of each of the one or more first-length pulses and each of the one or more second-length pulses, comparing the magnitudes of the first-length pulses to a first threshold, comparing the magnitudes of the second-length pulses to a second threshold, and removing any pulses having a magnitude less than the respective thresholds. The method may further include determining, based on the filtered point cloud data, objects in an environment around the LIDAR.
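The two-threshold filtering step lends itself to a compact vectorized form. Below is a minimal NumPy sketch of that idea; the function name, the length-split boundary, and the threshold values are illustrative assumptions, not taken from the abstract.

```python
import numpy as np

def filter_point_cloud(magnitudes, pulse_lengths, length_split,
                       thresh_first, thresh_second):
    """Keep only returns whose magnitude meets the threshold assigned to
    their pulse-length range (illustrative two-range scheme)."""
    in_first_range = pulse_lengths < length_split
    # Each pulse is compared against the threshold of its own length range.
    thresholds = np.where(in_first_range, thresh_first, thresh_second)
    return magnitudes >= thresholds

# Illustrative usage with synthetic returns.
rng = np.random.default_rng(0)
mags = rng.uniform(0.0, 1.0, 1000)
lengths = rng.uniform(2.0, 10.0, 1000)   # e.g. pulse lengths in ns
keep = filter_point_cloud(mags, lengths, length_split=5.0,
                          thresh_first=0.2, thresh_second=0.4)
print(f"kept {keep.sum()} of {keep.size} returns")
```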
A multi-dimensional trajectory planning system is disclosed that includes planning and actuator modules. The planning module executes the planning application to: determine a first dimensionality including first dimensions for a first stage, where each of the first dimensions is active, and where the first dimensions include two or more dimensions; determine a second dimensionality including second dimensions for a second stage, where the second dimensions include the first dimensions or a subset of the first dimensions, and where the second stage has a lower level of dimensionality than the first stage; based on map data and sensor data, estimate first possible future states of the first dimensions for the first stage, and estimate second possible future states of the second dimensions for the second stage based on the first possible future states; and select a trajectory plan based on the second possible future states.
G05D 1/02 - Control of position or course in two dimensions
B60W 30/095 - Predicting travel path or likelihood of collision
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
6.
OPTICAL SENSOR FOR MIRROR ZERO ANGLE IN A SCANNING LIDAR
The present disclosure relates to systems and methods that provide an accurate angle measurement of a rotatable mirror. An example method includes receiving, from a detector device, a reflected light signal. The reflected light signal is indicative of primary reflection light and secondary reflection light. The primary reflection light corresponds to a first portion of emission light that reflects directly from the reflective surface of the rotatable mirror toward the detector device. The secondary reflection light corresponds to a second portion of emission light that: 1) reflects from the reflective surface of the rotatable mirror toward a secondary mirror surface; 2) reflects from the secondary mirror surface toward the reflective surface of the rotatable mirror; and 3) reflects from the reflective surface of the rotatable mirror toward the detector device. The method also includes determining, based on the reflected light signal, the rotational angle of the rotatable mirror.
G01B 11/27 - Measuring arrangements characterised by the use of optical techniques for testing the alignment of axes
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G02B 7/182 - Mountings, adjusting means, or light-tight connections, for optical elements for mirrors
H01S 5/183 - Surface-emitting [SE] lasers, e.g. having both horizontal and vertical cavities having only vertical cavities, e.g. vertical cavity surface-emitting lasers [VCSEL]
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting scene flow. One of the methods includes obtaining a current point cloud representing an observed scene at a current time point; obtaining object label data that identifies a first three-dimensional region in the observed scene; determining, for each current three-dimensional point that is within the first three-dimensional region and using the object label data, a respective preceding position of the current three-dimensional point at a preceding time point in a reference frame of the sensor at the current time point; and generating, using the preceding positions, a scene flow label for the current point cloud that comprises a respective ground truth motion vector for each of a plurality of the current three-dimensional points.
H04N 19/139 - Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
G06T 7/70 - Determining position or orientation of objects or cameras
G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
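The core of the labeling step above is a frame-to-frame rigid transform: a point observed inside the labeled box is held fixed in the box frame, carried back to the box's pose at the preceding time, and re-expressed in the sensor frame at the current time. A minimal NumPy sketch under assumed 4x4 homogeneous-transform conventions; all names are hypothetical.

```python
import numpy as np

def preceding_positions(points_sensor_t, T_sensor_to_world_t,
                        T_box_to_world_t, T_box_to_world_prev):
    """points_sensor_t: (N, 3) current points inside the labeled box, in the
    sensor frame at the current time. All transforms are 4x4 homogeneous."""
    def apply(T, pts):
        return pts @ T[:3, :3].T + T[:3, 3]

    world_t = apply(T_sensor_to_world_t, points_sensor_t)
    # Hold each point fixed in the box frame, then move the box back to its
    # pose at the preceding time.
    in_box = apply(np.linalg.inv(T_box_to_world_t), world_t)
    world_prev = apply(T_box_to_world_prev, in_box)
    # Re-express in the sensor frame at the CURRENT time, which is the
    # reference frame the abstract specifies.
    return apply(np.linalg.inv(T_sensor_to_world_t), world_prev)

def scene_flow_labels(points_sensor_t, prev_sensor_t, dt):
    # Ground truth motion vector per point: displacement over the interval.
    return (points_sensor_t - prev_sensor_t) / dt

# Illustrative usage: a box that moved +1 m in world x between frames.
I = np.eye(4)
B_prev = np.eye(4); B_prev[0, 3] = -1.0   # box pose one frame earlier
pts = np.array([[5.0, 0.0, 0.0]])
prev = preceding_positions(pts, I, I, B_prev)
print(scene_flow_labels(pts, prev, dt=0.1))  # -> [[10. 0. 0.]]
```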
Aspects of the disclosure provide for generating a visualization of a three-dimensional (3D) world view from the perspective of a camera of a vehicle 100. For example, images of a scene captured by a camera of the vehicle and 3D content for the scene may be received (810). A virtual camera model for the camera of the vehicle may be identified (820). A set of matrices may be generated using the virtual camera model (830). The set of matrices may be applied to the 3D content to create a 3D world view (840). The visualization may be generated using the 3D world view as an overlay with the image, and the visualization provides a real-world image from the perspective of the camera of the vehicle with one or more graphical overlays of the 3D content (850).
Systems and methods are disclosed to identify a presence of a volumetric medium in an environment associated with a LIDAR system. In some implementations, the LIDAR system may emit a light pulse into the environment, receive a return light pulse corresponding to reflection of the emitted light pulse by a surface in the environment, and determine a pulse width of the received light pulse. The LIDAR system may compare the determined pulse width with a pulse width of light reflected from a road surface to determine a pulse elongation difference value. The LIDAR system may identify the surface as debris on or near the road surface based, at least in part, on the determined pulse elongation difference value.
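The comparison described above reduces to a single subtraction against a reference pulse width. A small sketch, with an assumed nanosecond scale and an illustrative elongation threshold:

```python
def classify_return(measured_width_ns, road_width_ns, debris_threshold_ns=1.5):
    """Compare a return's pulse width to the nominal road-surface pulse width.
    Volumetric media and raised debris tend to elongate the return pulse;
    the threshold value here is an illustrative assumption."""
    elongation = measured_width_ns - road_width_ns
    if elongation > debris_threshold_ns:
        return "debris_or_volumetric", elongation
    return "road_surface", elongation

print(classify_return(measured_width_ns=7.2, road_width_ns=5.0))
```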
Aspects of the disclosure relate to timing pickups of passengers by autonomous vehicles. For instance, while an autonomous vehicle 100 is maneuvering itself to a pickup location 690 for picking up a passenger, an estimated time of arrival for the passenger to reach the pickup location may be identified. The estimated time of arrival may be used to plan a route 980 to the pickup location. The vehicle may be maneuvered to the pickup location using the route.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Example embodiments relate to substrate integrated waveguide (SIW) transitions. An example SIW may include a dielectric substrate having a top surface and a bottom surface and a first metallic layer portion coupled to the top surface of the dielectric substrate that includes a single-ended termination, an impedance transformer, and a metallic rectangular patch located within an open portion in the first metallic layer portion such that the open portion forms a non-conductive loop around the metallic rectangular patch. The SIW also includes a second metallic layer portion coupled to the bottom surface of the dielectric substrate and metallic via-holes electrically coupling the first metallic layer to the second metallic layer. The SIW may be implemented in a radar unit to couple antennas to a printed circuit board (PCB). In some examples, the SIW may be implemented with only a non-conductive opening that lacks the metallic rectangular patch.
Example embodiments relate to low elevation side lobe antennas with fan-shaped beams. An example radar unit may include a radiating plate having a first side and a second side with an illuminator, a waveguide horn, a waveguide opening, and a radiating sleeve extending into the first side of the radiating plate. The waveguide opening is positioned on the first end of the first side and the radiating sleeve is positioned on the second end of the first side. The radar unit also includes a metallic cover coupled to the first side of the radiating plate such that the metallic cover and the radiating plate form waveguide structures. The waveguide horn is configured to receive, from an external source, electromagnetic energy provided through the waveguide opening via a first waveguide and provide a portion of the electromagnetic energy to the illuminator via a second waveguide such that the portion of the electromagnetic energy radiates off the illuminator and through the radiating sleeve into an environment of the radar unit as one or more radar signals.
H01Q 1/32 - Adaptation for use in or on road or rail vehicles
H01Q 1/38 - Structural form of radiating elements, e.g. cone, spiral, umbrella formed by a conductive layer on an insulating support
G01S 7/03 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group - Details of HF subsystems specially adapted therefor, e.g. common to transmitter and receiver
Example embodiments relate to GNSS time synchronization in redundant systems. A redundant system configured with two subsystems may initially synchronize clocks from both subsystems to GNSS time from a GNSS receiver. The synchronization of the first subsystem's clock may involve using a first communication link that enables communication between the first subsystem and the GNSS receiver while the synchronization of the second subsystem's clock may involve using both the first communication link and a second communication link that enables communication between the subsystems. The redundant system may then synchronize the first subsystem's clock to the second subsystem's clock while the second subsystem's clock is still synchronized to GNSS time from the GNSS receiver based on timepulses traversing a pair of wires that connect the subsystems and the GNSS receiver.
The technology employs a variable motion control envelope that enables an on-board computing system (200) of a self-driving vehicle (100) to estimate future vehicle driving behavior along an upcoming path, in order to maintain a desired amount of control during autonomous driving. Factors including intrinsic vehicle properties (610-616), extrinsic environmental influences (608) and road friction information (604) are evaluated. Such factors can be evaluated (618-622) to derive an available acceleration model (624), which defines an envelope of maximum longitudinal and lateral accelerations for the vehicle. This model, which may identify dynamically varying acceleration limits (720) that can be affected by road conditions and road configurations, may be used by the on-board control system (e.g., a planner module (323) of the processing system) to control driving operations of the vehicle in an autonomous driving mode (628).
B60W 40/12 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to parameters of the vehicle itself
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
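One common way to realize such an envelope of maximum longitudinal and lateral accelerations is a friction-budget model with an elliptical combined-acceleration constraint. The sketch below is a plausible reading of the abstract above, not the patented model; every parameter value is an illustrative assumption.

```python
def acceleration_limits(mu_road, grade_percent=0.0,
                        a_lon_vehicle_max=4.0, a_lat_vehicle_max=5.0, g=9.81):
    """Derive (max longitudinal, max lateral) acceleration in m/s^2 from a
    friction budget (mu * g), a road-grade correction, and intrinsic vehicle
    limits. All parameter values are illustrative assumptions."""
    friction_budget = mu_road * g
    grade_loss = g * grade_percent / 100.0   # climbing consumes some budget
    a_lon = max(min(a_lon_vehicle_max, friction_budget - grade_loss), 0.0)
    a_lat = max(min(a_lat_vehicle_max, friction_budget), 0.0)
    return a_lon, a_lat

def within_envelope(a_lon, a_lat, a_lon_max, a_lat_max):
    """Elliptical combined-acceleration constraint the planner must satisfy."""
    return (a_lon / a_lon_max) ** 2 + (a_lat / a_lat_max) ** 2 <= 1.0

lon_max, lat_max = acceleration_limits(mu_road=0.4)   # e.g. wet asphalt
print(lon_max, lat_max, within_envelope(1.5, 2.0, lon_max, lat_max))
```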
15.
SYSTEMS, APPARATUS, AND METHODS FOR RETRIEVING IMAGE DATA OF IMAGE FRAMES
Described examples relate to an apparatus comprising a memory for storing a sequence of image frames and at least one processor. The at least one processor may be configured to receive a first image frame of a sequence of image frames from an image capture device and select a first portion of a first image frame. The at least one processor may also be configured to obtain alignment information and determine a first portion and a second portion of a second image frame based on the alignment information. Further, the at least one processor may be configured to determine a bounding region within the second image frame and fetch image data corresponding to the bounding region of the second image frame from memory. In some examples, the first image frame may comprise a base image and the second image frame may comprise an alternative image frame. Further, the first image frame may comprise any one of the image frames of the sequence of image frames.
H04N 21/231 - Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers or prioritizing data for deletion
H04N 21/274 - Storing end-user specific content or additional data in response to end-user request
H04N 21/232 - Content retrieval operation within server, e.g. reading video streams from disk arrays
H04N 5/77 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
16.
SYSTEMS, APPARATUS, AND METHODS FOR ENHANCED IMAGE CAPTURE
Described examples relate to an apparatus comprising one or more image sensors coupled to a vehicle and at least one processor. The at least one processor may be configured to capture, in a burst sequence using the one or more image sensors, multiple frames of an image of a scene, the multiple frames having respective, relative offsets of the image across the multiple frames and perform super-resolution computations using the captured multiple frames of the image of the scene. The at least one processor may also be configured to accumulate, based on the super-resolution computations, color planes and combine, using the one or more processors, the accumulated color planes to create a super-resolution image of the scene.
Described examples relate to an apparatus comprising a first sensor configured to scan an area of interest during a first time period and a second sensor configured to capture a plurality of images of a field of view. The apparatus may include at least one controller configured to receive the plurality of images captured by the second sensor, compare the timestamp information associated with at least one image of the plurality of images to at least one time period of the first time period, and select a base image from the plurality of images based on the comparison.
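The selection logic amounts to comparing image timestamps against the scan window of the first sensor. A minimal sketch; picking the frame nearest the scan midpoint is an assumed criterion, not necessarily the patented one.

```python
def select_base_image(image_timestamps, scan_start, scan_end):
    """Pick the index of the frame best aligned with the first sensor's
    scan period (midpoint-distance criterion, assumed for illustration)."""
    scan_mid = (scan_start + scan_end) / 2.0
    return min(range(len(image_timestamps)),
               key=lambda i: abs(image_timestamps[i] - scan_mid))

# Frames at 40 ms spacing; the first sensor's scan spans 40-100 ms.
print(select_base_image([0.01, 0.05, 0.09, 0.13], scan_start=0.04, scan_end=0.10))
```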
The subject matter of this specification can be implemented in, among other things, a system that includes a light source to produce a first beam and one or more first optical elements to impart an orbital angular momentum (OAM) to at least some of a plurality of output beams obtained from the first beam. The system further includes an output optical device to direct the output beams towards a target object and a plurality of photodetectors to generate signals representative of a difference between an input phase information, carried by one or more beams reflected from the target object, and an output phase information, carried by one of local copies of the output beams. The system further includes a processing device to determine, using the difference between the input phase information and the output phase information, one or more orthogonal components of a velocity of the target object.
G01P 3/36 - Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
G01P 3/68 - Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
A system includes an image sensor having a plurality of pixels that form a plurality of regions of interest (ROIs), and configured to operate at a frame rate higher than a threshold rate. The system also includes an image processing resource. The system further includes control circuitry configured to perform operations that include obtaining, from the image sensor, a full-resolution image of an environment. The full-resolution image contains each respective ROI of the plurality of ROIs. The operations also include selecting a particular ROI based on the full-resolution image, and detecting an object of interest in the particular ROI. The operations include determining a mode of operation by which subsequent image data generated by the particular ROI is to be processed. The operations further include processing, based on the mode of operation and the frame rate, the image data comprising a plurality of ROI images of the object of interest.
H04N 5/343 - Extracting pixel data from an image sensor by controlling scanning circuits, e.g. by modifying the number of pixels having been sampled or to be sampled by switching between different modes of operation using different resolutions or aspect ratios, e.g. between still and video mode or between interlaced and non-interlaced mode
The present disclosure relates to systems, vehicles, and methods for detecting optical defects in an optical path of a camera system. An example system may include an image sensor configured to provide images of a field of view via an optical path that extends through an optical window. The system also includes at least one phase-detection device and a controller. The controller is configured to execute instructions stored in the memory so as to carry out various operations, including receiving, from the image sensor, first pixel information indicative of an image of the field of view. The operations additionally include receiving, from the at least one phase-detection device, second pixel information indicative of a portion of the field of view. The operations yet further include determining, based on the first pixel information and the second pixel information, at least one optical defect associated with the optical path.
Computing devices, systems, and methods described in various embodiments herein may relate to a light detection and ranging (lidar) system. An example computing device could include a controller having at least one processor and at least one memory. The at least one processor is configured to execute program instructions stored in the at least one memory so as to carry out operations. The operations include receiving information identifying an environmental condition surrounding a vehicle, the environmental condition being at least one of fog, mist, snow, dust, or rain. The operations also include determining a range of interest within a field of view of the lidar system based on the received information. The operations also include adjusting at least one of: a return light detection time period, sampling rate, or filtering threshold, for at least a portion of the field of view based on the determined range of interest.
The subject matter of this specification can be implemented in, among other things, systems and methods of optical sensing that utilize time and frequency multiplexing of sensing signals. Described are, among other things, a light source subsystem to produce a first beam having a first frequency and a second beam having a second frequency, a modulator to impart a modulation to the second beam, and an optical interface subsystem to receive a third beam caused by interaction of the first beam with an object and a fourth beam caused by interaction of the second beam with the object. Also described are one or more circuits to determine, based on a first phase information carried by the third beam, a velocity of the object, and then determine, based on a second phase information carried by the third beam and the first phase information, a distance to the object.
G01S 17/50 - Systems of measurement based on relative movement of target
G01S 7/4861 - Circuits for detection, sampling, integration or read-out
G01S 7/4865 - Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
G01S 17/34 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
G01S 17/93 - Lidar systems, specially adapted for specific applications for anti-collision purposes
23.
COUPLED LASERS FOR COHERENT DISTANCE AND VELOCITY MEASUREMENTS
The subject matter of this specification can be implemented in, among other things, systems and methods that enable lidar channel multiplexing using optical locking of separate lasers while simultaneously imparting different frequency offsets to beams output by different lasers. As a result, received beams reflected from various objects, which are present in an outside environment, can have Doppler-shifted frequencies that do not overlap, facilitating concurrent identification of distances to and velocities of multiple objects.
G01S 7/4913 - Circuits for detection, sampling, integration or read-out
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
24.
IMPACT RESISTANT HEATED WINDOW MOUNT FOR THERMAL CAMERA
The present disclosure relates to optical systems, vehicles, and methods for providing improved mechanical performance of a camera and corresponding optical elements. An example optical system includes an outer housing and an inner support member. The optical system also includes an optical window coupled to the outer housing and the inner support member. The optical window is configured to be temperature-controllable. The optical system also includes a camera coupled to the inner support member. The camera is optically coupled to the optical window. Additionally, the outer housing, the optical window, and the camera are configured to be impact resistant.
Examples described relate to systems, methods, and apparatus for transmitting image data. The apparatus may comprise a memory buffer configured to store data elements of an image frame and generate a signal indicating that each of the data elements of the image frame have been written to the memory buffer, a processor configured to initiate a flush operation for reading out at least one data element of the image frame from the memory buffer and to output the at least one data element at a first rate, a rate adjustment unit configured to receive the at least one data element from the processor at the first rate and to output the at least one data element at a second rate, and a multiplexer configured to receive the at least one data element from the processor at the first rate and configured to receive the at least one data element from the rate adjustment unit at the second rate. The multiplexer may select the at least one data element at the second rate for transmitting on a bus in response to receiving the signal from the memory buffer.
H04N 21/231 - Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers or prioritizing data for deletion
H04N 21/274 - Storing end-user specific content or additional data in response to end-user request
H04N 21/238 - Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
H04N 5/77 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
H04N 5/907 - Television signal recording using static stores, e.g. storage tubes or semiconductor memories
26.
SYSTEMS, APPARATUS, AND METHODS FOR REORDERING IMAGE DATA
Described examples relate to an apparatus for rearranging or reordering image data of a stream. The apparatus may include control circuitry coupled to a memory array. The control circuitry may be configured to receive the image data of the stream that includes a first image comprising first data elements organized in a row-wise format or a column-wise format and a second image comprising second data elements organized in a row-wise format or a column-wise format. The control circuitry may also be configured to write the first data elements to the memory array in a first order according to a first addressing sequence, read the first data elements from the memory array in a second order according to a second addressing sequence, and write the second data elements in the memory array according to the second addressing sequence.
H04N 21/231 - Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers or prioritizing data for deletion
H04N 21/433 - Content storage operation, e.g. storage operation in response to a pause request or caching operations
H04N 5/77 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
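The alternating addressing sequences allow a stream of frames to be reordered with a single buffer: each incoming frame is written into the addresses just freed by reading the previous frame, so the write sequence of frame k+1 equals the read sequence of frame k. A NumPy sketch of that scheme for a row-wise to column-wise reordering; the implementation details are illustrative assumptions.

```python
import numpy as np

def stream_reorder(frames, rows, cols):
    """Single-buffer streaming reorder of row-major frames into column-major
    output. The write sequence of each frame is the read sequence of the
    previous one, so addressing sequences cycle from frame to frame."""
    n = rows * cols
    # t[j] = input index of the j-th element when emitting column-wise.
    t = np.arange(n).reshape(rows, cols).T.ravel()
    memory = np.empty(n, dtype=frames[0].dtype)
    addr = np.arange(n)            # addr[i]: slot holding element i
    for frame in frames:
        memory[addr] = frame.ravel()   # write using this frame's sequence
        read_seq = addr[t]             # emit elements in column-wise order
        yield memory[read_seq].reshape(cols, rows)
        addr = read_seq                # next frame reuses the freed slots

frames = [np.arange(6).reshape(2, 3), 10 + np.arange(6).reshape(2, 3),
          20 + np.arange(6).reshape(2, 3)]
for out in stream_reorder(frames, rows=2, cols=3):
    print(out)   # each output is the transpose of its input frame
```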
The subject matter of this specification can be implemented in, among other things, a system that includes a light source to produce a first beam, a diffraction optical element (DOE) to generate, based on the first beam configured to have a first phase information, one or more second beams. The system further includes a DOE control module to configure the DOE, for each of a plurality of times, into a respective one of a plurality of DOE configurations, and cause each of the one or more second beams to have a phase information that is different from a phase information of the first beam, wherein the phase information of each of the one or more second beams is determined by a time sequence of the plurality of DOE configurations.
A light detection and ranging (lidar) device may be coupled to a vehicle and configured to scan a surrounding environment to determine ranges to one or more objects in the surrounding environment of the vehicle. The lidar device may generate data that can be used to form a range image, which includes or is based on range data determined for the one or more objects. The lidar device may also generate data that can be used to form a corresponding background image, which includes background light intensity data that the lidar device measures during the scan. The background image or background image data may be used to add range data to the range image, correct range data in the range image, and/or evaluate the quality of the range data in the range image. In this way, the background image or background image data can be used to generate an enhanced range image that includes range data that is more comprehensive and/or more reliable than the range data included in the original range image.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
G01S 17/18 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
The technology involves communicating the reachability status associated with an autonomous vehicle (100) to a user such as a rider within the vehicle, a person awaiting pickup, or a customer that scheduled a package delivery. Reachability information (512, 514, 1106) about pickup and/or drop off locations is presentable via an app (510) on a user device, which helps set expectations with customers about where the vehicle is most likely to be able to perform a pickup and/or drop off. This may include indicating how much variance there may be based on current congestion, parking or idling regulations, or weather conditions. The reachability information may be presented via one or more visualization tools (602, 608, 612, 622) to indicate the uncertainty and/or likely final location. Presenting such contextual information may be done based on real time information, and the presentation may be updated as needed. Historical information about the location may also be used to lower the level of uncertainty.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
30.
PROCESSING SPARSE TOP-DOWN INPUT REPRESENTATIONS OF AN ENVIRONMENT USING NEURAL NETWORKS
Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for generating a prediction that characterizes an environment. The system obtains an input including data characterizing observed trajectories of one or more agents and data characterizing one or more map features identified in a map of the environment. The system generates, from the input, an encoder input that comprises representations for each of a plurality of points in a top-down representation of the environment. The system processes the encoder input using a point cloud encoder neural network to generate a global feature map of the environment, and processes a prediction input including the global feature map using a predictor neural network to generate a prediction output characterizing the environment.
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for estimating a 3-D pose of an object of interest from image and point cloud data. In one aspect, a method includes obtaining an image of an environment; obtaining a point cloud of a three-dimensional region of the environment; generating a fused representation of the image and the point cloud; and processing the fused representation using a pose estimation neural network and in accordance with current values of a plurality of pose estimation network parameters to generate a pose estimation network output that specifies, for each of multiple keypoints, a respective estimated position in the three-dimensional region of the environment.
Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for performing object detection. The system obtains a respective range image corresponding to each point cloud in a set of point clouds captured by one or more sensors. The system processes each range image using a segmentation neural network to generate range image features and a segmentation output. The system generates a feature representation of the set of point clouds from only the feature representations of the foreground points. The system processes the feature representation of the set of point clouds using a prediction neural network to generate a prediction characterizing the set of point clouds.
Aspects and implementations of the present disclosure address shortcomings of the existing technology by enabling routing of an autonomous vehicle (AV) by identifying routes from a first location to a second location, identifying a target efficiency value of autonomous driving along a respective route, determining, in view of historical data for the respective route and using one or more randomized conditions, a confidence level associated with the target efficiency value, selecting, based on the target efficiency values and the associated confidence levels, a preferred route, and causing the AV to select the preferred route for travel from the first location to the second location.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
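The route-selection step can be pictured as maximizing target efficiency subject to a confidence floor. A toy sketch; the field names, the floor value, and the fallback rule are illustrative assumptions.

```python
def select_route(routes, min_confidence=0.8):
    """routes: list of dicts with 'name', 'efficiency', 'confidence'.
    Prefer the highest target-efficiency route whose confidence level
    clears a floor; falls back to all routes if none qualify."""
    viable = [r for r in routes if r["confidence"] >= min_confidence]
    candidates = viable if viable else routes
    return max(candidates, key=lambda r: r["efficiency"])

routes = [
    {"name": "highway", "efficiency": 0.92, "confidence": 0.85},
    {"name": "surface streets", "efficiency": 0.97, "confidence": 0.55},
]
print(select_route(routes)["name"])  # -> highway
```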
34.
RADAR INTERFERENCE REDUCTION TECHNIQUES FOR AUTONOMOUS VEHICLES
Example embodiments relate to methods and systems for implementing radar electronic support measure operations. A vehicle's processing unit may receive information relating to electromagnetic energy radiating in an environment of the vehicle that is detected using a vehicle radar system. The electromagnetic energy originated from one or more external emitters, such as radar signals transmitted by other vehicles. The processing unit may determine a spectrum occupancy representation that indicates spectral regions occupied by the electromagnetic energy and subsequently adjust operation of the vehicle radar system based on the spectrum occupancy representation to reduce or mitigate interference with the external emitters in the vehicle's environment. In some examples, the vehicle radar system may be switched to a passive receive-only mode to measure the electromagnetic energy radiating in the environment from other emitters.
G01S 7/02 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 3/46 - Systems for determining direction or deviation from predetermined direction using antennas spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
G01S 7/00 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , ,
G01S 13/44 - Monopulse radar, i.e. simultaneous lobing
G01S 13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
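In the passive receive-only mode, a spectrum occupancy representation can be built from a power spectrum of the received samples, after which the radar can look for a clear band wide enough for its waveform. A rough sketch under assumed real-valued baseband samples and an illustrative power threshold:

```python
import numpy as np

def spectrum_occupancy(samples, sample_rate_hz, occupancy_db=10.0):
    """Mark spectral bins whose power exceeds a threshold as occupied.
    Assumes real-valued baseband samples; the threshold is illustrative."""
    windowed = samples * np.hanning(len(samples))
    power_db = 10 * np.log10(np.abs(np.fft.rfft(windowed)) ** 2 + 1e-12)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    return freqs, power_db > occupancy_db

def pick_clear_band(freqs, occupied, bandwidth_hz):
    """Start frequency of the first unoccupied span wide enough for the
    radar waveform, or None if no such span exists."""
    start = None
    for f, busy in zip(freqs, occupied):
        if busy:
            start = None
        elif start is None:
            start = f
        elif f - start >= bandwidth_hz:
            return start
    return None

# Illustrative usage: one interferer at 200 kHz in a 1 MHz-wide band.
fs = 1.0e6
t = np.arange(4096) / fs
freqs, busy = spectrum_occupancy(np.sin(2 * np.pi * 2.0e5 * t), fs)
print(pick_clear_band(freqs, busy, bandwidth_hz=5.0e4))  # -> 0.0
```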
35.
DETECTING RETROREFLECTORS IN NIR IMAGES TO CONTROL LIDAR SCAN
A system includes a near-infrared (NIR) illuminator, an NIR image sensor, a light detection and ranging (LIDAR) device, and control circuitry configured to perform operations. The operations include causing the NIR illuminator to illuminate a portion of an environment, and obtaining, from the NIR image sensor, NIR image data representing the portion of the environment illuminated by the NIR illuminator. The operations also include detecting a retroreflector within the NIR image data and, based on detecting the retroreflector within the NIR image, determining a position of the retroreflector within the environment. The operations further include, based on the position of the retroreflector within the environment, adjusting at least one parameter of the LIDAR device in connection with scanning the retroreflector.
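A sketch of the two stages the abstract describes: finding near-saturated pixels in the NIR image (retroreflectors return the co-located illumination almost entirely) and mapping a detection to an adjusted lidar parameter set. The pixel-to-azimuth mapping and the power reduction factor are hypothetical.

```python
import numpy as np

def find_retroreflectors(nir_image, saturation=0.95):
    """Retroreflectors bounce the co-located NIR illumination straight back,
    so they show up as near-saturated pixels (threshold assumed)."""
    ys, xs = np.nonzero(nir_image >= saturation)
    return list(zip(xs.tolist(), ys.tolist()))

def lidar_adjustment_for_pixel(x, image_width, horizontal_fov_deg):
    """Map an image column to a lidar azimuth and reduce emit power there to
    limit blooming/crosstalk when the beam sweeps across the reflector.
    Hypothetical mapping and reduction factor, for illustration only."""
    azimuth_deg = (x / image_width - 0.5) * horizontal_fov_deg
    return {"azimuth_deg": azimuth_deg, "power_scale": 0.25}

# Illustrative usage on a synthetic image with one bright sign.
img = np.zeros((4, 8)); img[2, 6] = 1.0
for x, y in find_retroreflectors(img):
    print(lidar_adjustment_for_pixel(x, image_width=8, horizontal_fov_deg=60.0))
```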
Example embodiments relate to light detection and ranging (lidar) devices having vertical-cavity surface-emitting laser (VCSEL) emitters. An example lidar device includes an array of individually addressable VCSELs configured to emit light pulses into an environment surrounding the lidar device. The lidar device also includes a firing circuit configured to selectively fire the individually addressable VCSELs in the array. In addition, the lidar device includes a controller configured to control the firing circuit using a control signal. Further, the lidar device includes a plurality of detectors. Each detector in the plurality of detectors is configured to detect reflections of light pulses that are emitted by one or more individually addressable VCSELs in the array and reflected by one or more objects in the environment surrounding the lidar device.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
H01S 5/183 - Surface-emitting [SE] lasers, e.g. having both horizontal and vertical cavities having only vertical cavities, e.g. vertical cavity surface-emitting lasers [VCSEL]
37.
CLASSIFICATION OF OBJECTS BASED ON MOTION PATTERNS FOR AUTONOMOUS VEHICLE APPLICATIONS
Aspects and implementations of the present disclosure address shortcomings of the existing technology by enabling motion pattern-assisted object classification of objects in an environment of an autonomous vehicle (AV) by obtaining, from a sensing system of the AV, a plurality of return points, each return point comprising one or more velocity values and one or more coordinates of a reflecting region that reflects a signal emitted by the sensing system, identifying an association of the plurality of return points with an object in an environment of the AV, identifying, in view of the one or more velocity values of at least some of the plurality of return points, a type of the object or a type of a motion of the object, and causing a driving path of the AV to be determined in view of the identified type of the object.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
The technology employs a holistic approach to passenger pickups and other wayfinding situations. This includes identifying where passengers (606) are relative to the vehicle (602) and/or the pickup location (604). Information synthesis from different sensors (238), agent behavior prediction models (240), and real-time situational awareness are employed to identify the likelihood that the passenger to be picked up is at a given location at a particular point in time, with sufficient confidence. The system can provide adaptive navigation by helping passengers understand their distance and direction to the vehicle, for instance using various cues via an app on the person's device. Rider support tools (804) may be provided, which enable a remote agent to interact with a customer via that person's device, such as using the camera on the device to provide wayfinding support to enable the person to find their vehicle. Rider support may also use sensor information from the vehicle when providing wayfinding support.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
39.
POINT CLOUD SEGMENTATION USING A COHERENT LIDAR FOR AUTONOMOUS VEHICLE APPLICATIONS
Aspects and implementations of the present disclosure address shortcomings of the existing technology by enabling Doppler-assisted segmentation of points in a point cloud for efficient object identification and tracking in autonomous vehicle (AV) applications, by: obtaining, by a sensing system of the AV, a plurality of return points comprising one or more velocity values and one or more coordinates of a reflecting region that reflects a signal emitted by the sensing system, the one or more velocity values and the one or more coordinates obtained for the same instance of time, identifying that a set of the return points is associated with an object in an environment, and causing a driving path of the AV to be determined in view of the object.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
40.
VELOCITY ESTIMATION AND OBJECT TRACKING FOR AUTONOMOUS VEHICLE APPLICATIONS
Aspects and implementations of the present disclosure address shortcomings of the existing technology by enabling velocity estimation for efficient object identification and tracking in autonomous vehicle (AV) applications, including: obtaining, by a sensing system of the AV, a plurality of return points, each return point having a velocity value and coordinates of a reflecting region that reflects a signal emitted by the sensing system, identifying an association of the velocity values and the coordinates of return points with a motion of a physical object, the motion being a combination of a translational motion and a rotational motion of a rigid body, and causing a driving path of the AV to be determined in view of the motion of the physical object.
G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G01S 17/34 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 7/491 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group - Details of non-pulse systems
G01S 17/66 - Tracking systems using electromagnetic waves other than radio waves
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G05D 1/02 - Control of position or course in two dimensions
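Per-point Doppler (radial) velocities constrain a rigid body's translational velocity v and angular velocity w through a model that is linear in both unknowns, v_r = n . (v + w x r) = n . v + (r x n) . w, so the combined motion can be fit by least squares. A minimal sketch; frames, sign conventions, and the synthetic check are assumptions, and the fit is only well conditioned when the returns span a wide angular sector.

```python
import numpy as np

def estimate_rigid_motion(points, radial_velocities, center):
    """Least-squares fit of translational velocity v and angular velocity w
    from per-point radial (Doppler) velocities, with n the unit line-of-sight
    direction from the sensor and r the offset from the chosen center."""
    n = points / np.linalg.norm(points, axis=1, keepdims=True)
    r = points - center
    A = np.hstack([n, np.cross(r, n)])             # (N, 6) design matrix
    x, *_ = np.linalg.lstsq(A, radial_velocities, rcond=None)
    return x[:3], x[3:]                             # v in m/s, w in rad/s

# Synthetic check: a body ~10 m away translating at 5 m/s while yawing 0.3 rad/s.
rng = np.random.default_rng(1)
center = np.array([10.0, 0.0, 0.0])
pts = center + rng.uniform(-4, 4, (100, 3))
v_true, w_true = np.array([5.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.3])
n_true = pts / np.linalg.norm(pts, axis=1, keepdims=True)
vr = np.einsum("ij,ij->i", n_true, v_true + np.cross(w_true, pts - center))
v_est, w_est = estimate_rigid_motion(pts, vr, center)
print(np.round(v_est, 2), np.round(w_est, 2))
```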
41.
CAMERA MODULE WITH IR LEDS FOR UNIFORM ILLUMINATION
A camera module includes a housing with an opening and a portion that surrounds the opening, wherein the portion of the housing is transparent to near-infrared (NIR) light. A fisheye lens is disposed within the opening such that a portion of the fisheye lens protrudes through the opening. An image sensor is disposed within the housing and optically coupled to the fisheye lens. The image sensor is sensitive to visible light and NIR light. A plurality of NIR light emitters is disposed within the housing. The NIR light emitters are configured to emit NIR light through the NIR-transparent portion of the housing. The NIR-transparent portion of the housing may include a light-diffusing structure, such as a pattern of microlenses formed on an inner surface of the NIR-transparent portion of the housing, to spread out the NIR light emitted by the NIR light emitters.
One example system comprises an active sensor that includes a transmitter and a receiver, a first camera that detects external light originating from one or more external light sources to generate first image data, a second camera that detects external light originating from one or more external light sources to generate second image data, and a controller. The controller is configured to perform operations comprising determining a first distance estimate to a first object based on a comparison of the first image data and the second image data, determining a second distance estimate to the first object based on active sensor data, comparing the first distance estimate and the second distance estimate, and determining a third distance estimate to a second object based on the first image data, the second image data, and the comparison of the first and second distance estimates.
A light detection and ranging (LIDAR) device includes a first light emitter, a second light emitter, a first light detector, and a second light detector, wherein the first light emitter is configured to emit light pulses in a first direction and the second light emitter is configured to emit light pulses in a second direction. During a scan of the LIDAR device, the first direction intersects an object at a first time and the second direction intersects the object at a second time. A relative speed of the object can be determined based on a first range to the object when the first direction intersects the object and a second range to the object when the second direction intersects the object.
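The last sentence is essentially a finite difference of range over the time between the two beam crossings. A tiny sketch that ignores the small angular offset between the two beam directions:

```python
def relative_speed(range_1_m, time_1_s, range_2_m, time_2_s):
    """Approximate closing speed from two range samples taken when two
    differently aimed emitters intersect the same object during one scan.
    Negative result = object approaching (sign convention assumed)."""
    return (range_2_m - range_1_m) / (time_2_s - time_1_s)

# Object drifts from 50.00 m to 49.90 m across 5 ms of a scan.
print(relative_speed(50.00, 0.000, 49.90, 0.005))  # -> -20.0 m/s
```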
The subject matter of this specification relates to a light detection and ranging (LiDAR) device that comprises, in some implementations, a pulsed-laser source configured to generate a pulsed optical signal, a continuous wave (CW) laser source configured to generate a CW optical signal, one or more optical amplifier circuits configured to amplify at least the pulsed optical signal, a combiner configured to combine the pulsed optical signal and the CW optical signal into a hybrid transmission signal, and at least one photodetector configured to receive a reflection signal produced by reflection of the hybrid transmission signal by a target.
G01S 17/87 - Combinations of systems using electromagnetic waves other than radio waves
G01S 17/32 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
H01S 3/094 - Processes or apparatus for excitation, e.g. pumping using optical pumping by coherent light
Aspects of the disclosure relate to cleaning rotating sensors having a sensor housing (310) with a sensor input surface (350). For instance, a first signal indicating that there is a contaminant (1310) on the sensor input surface may be received. In response to receiving the first signal, a second signal may be sent in order to cause one or more transducers (452) to generate waves in order to attempt to remove the contaminant from the sensor input surface.
Aspects of the disclosure provide for displaying notifications on a display of an autonomous vehicle 100. In one instance, a distance from the vehicle to a destination of the vehicle or a passenger may be determined. When the distance is between a first distance and a second distance, a first notification 650 may be displayed on the display. The second distance may be less than the first distance. When the distance is less than the second distance, a second notification 660 may be displayed on the display. The second notification provides additional information not provided by the first notification.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for the generation and use of a surfel map with semantic labels. One of the methods includes receiving a surfel map that includes a plurality of surfels, wherein each surfel has associated data that includes one or more semantic labels; obtaining sensor data for one or more locations in the environment, the sensor data having been captured by one or more sensors of a first vehicle; determining one or more surfels corresponding to the one or more locations of the obtained sensor data; identifying one or more semantic labels for the one or more surfels corresponding to the one or more locations of the obtained sensor data; and performing, for each surfel corresponding to the one or more locations of the obtained sensor data, a label-specific detection process for the surfel.
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for determining estimated ground truth object keypoint labels for sensor readings of objects. In one aspect, a method comprises obtaining a plurality of sets of label data for a sensor reading of an object; obtaining respective quality control data corresponding to each of the plurality of sets of label data, the respective quality control data comprising: data indicating whether the labeled location of the first object keypoint in the corresponding set of label data is accurate; and determining an estimated ground truth location for the first object keypoint in the sensor reading from (i) the labeled locations that were indicated as accurate by the corresponding quality control data and (ii) not from the labeled locations that were indicated as not accurate by the corresponding quality control data.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
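The aggregation step above filters labels by their quality-control verdict before estimating a ground truth position. A toy sketch; averaging the accepted labels is one plausible aggregation choice, not necessarily the patented one.

```python
import numpy as np

def estimate_keypoint(labels, qc_accurate):
    """Aggregate several labelers' positions for one keypoint, using only
    labels marked accurate by quality control.

    labels: (K, 3) labeled positions; qc_accurate: (K,) booleans.
    Returns None when quality control rejected every label."""
    accepted = np.asarray(labels)[np.asarray(qc_accurate)]
    return accepted.mean(axis=0) if len(accepted) else None

labels = [[1.02, 0.48, 0.0], [0.98, 0.52, 0.0], [3.50, 2.00, 0.0]]
print(estimate_keypoint(labels, qc_accurate=[True, True, False]))  # ~[1.0 0.5 0.0]
```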
49.
HIGH FIDELITY SIMULATIONS FOR AUTONOMOUS VEHICLES BASED ON RETRO-REFLECTION METROLOGY
Aspects and implementations of the present disclosure address shortcomings of existing technology by enabling autonomous vehicle simulations based on retro-reflection optical data. The subject matter of this specification can be implemented in, among other things, a method that involves initiating a simulation of an environment of an autonomous driving vehicle, the simulation including a plurality of simulated objects, each having an identification of a material type of the respective object. The method can further involve accessing simulated reflection data based on the plurality of simulated objects and retro-reflectivity data for the material types of the simulated objects, and determining, using an autonomous vehicle control system for the autonomous vehicle, a driving path relative to the simulated objects, the driving path based on the simulated reflection data.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
50.
OBJECT-CENTRIC THREE-DIMENSIONAL AUTO LABELING OF POINT CLOUD DATA
Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for performing three-dimensional auto-labeling on sensor data. The system obtains a sensor data segment that includes a temporal sequence of three-dimensional point clouds generated from sensor readings of an environment by one or more sensors. The system identifies, from the sensor data segment, (i) a plurality of object tracks that each corresponds to a different object in the environment and (ii) for each object track, respective initial three-dimensional regions in each of one or more of the point clouds in which the corresponding object appears. The system generates, for each object track, extracted object track data that includes at least the points in the respective initial three-dimensional regions for the object track. The system further generates, for each object track and from the extracted object track data for the object track, an auto labeling output that defines respective refined three-dimensional regions in each of the one or more point clouds.
A system includes multiple microphone arrays positioned at different locations on a roof of an autonomous vehicle. Each microphone array includes two or more microphones. Internal clocks of the microphone arrays are synchronized by a processor and used to generate timestamps indicating when the microphones capture a sound. Based on the timestamps, the processor is configured to localize a source of the sound.
H04R 1/40 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
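One way to realize the timestamp-based localization in the abstract above is time-difference-of-arrival (TDOA) matching. The sketch below uses a brute-force grid search rather than a closed-form multilateration solve, and all names, geometry, and constants are illustrative assumptions.

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C

def localize(mic_positions, timestamps, grid_step=0.5, search_radius=20.0):
    """mic_positions: list of (x, y) roof coordinates of the microphones.
    timestamps: synchronized arrival times (s) of the same sound at each mic."""
    best, best_err = None, float("inf")
    steps = int(search_radius / grid_step)
    for i, j in itertools.product(range(-steps, steps + 1), repeat=2):
        cand = (i * grid_step, j * grid_step)
        # Compare predicted pairwise arrival-time differences to measured ones.
        err = 0.0
        for (pa, ta), (pb, tb) in itertools.combinations(
                zip(mic_positions, timestamps), 2):
            predicted = (math.dist(cand, pa) - math.dist(cand, pb)) / SPEED_OF_SOUND
            err += (predicted - (ta - tb)) ** 2
        if err < best_err:
            best, best_err = cand, err
    return best

mics = [(0.0, 0.0), (2.0, 0.0), (0.0, 1.5), (2.0, 1.5)]  # roof corners (m)
true_src = (8.0, 4.0)
ts = [math.dist(true_src, m) / SPEED_OF_SOUND for m in mics]
print(localize(mics, ts))  # grid cell nearest the true source
```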
A system includes a microphone unit coupled to a roof of an autonomous vehicle. The microphone unit includes a microphone board having a first opening. The microphone unit also includes a first microphone positioned over the first opening and coupled to the microphone board. The microphone unit further includes an accelerometer. The system also includes a processor coupled to the microphone unit.
H04R 3/04 - Circuits for transducers for correcting frequency response
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
The technology employs a contrasting color scheme on different surfaces (402, 404, 406, 408, 410, 412, 414) for sensor housing assemblies mounted on exterior parts of a vehicle that is configured to operate in an autonomous driving mode. Lighter and darker colors may be chosen on different surfaces according to a thermal budget for a given sensor housing assembly, due to the different types of sensors arranged along particular surfaces, or to provide color contrast for different regions of the assembly. For instance, differing colors such as black (404)/white (406) or blue (466, 503)/white (402), and different finishes such as matte (410) or glossy (414), may be selected to enhance certain attributes or to minimize issues associated with a sensor housing assembly.
B60R 11/02 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
B60R 21/0134 - Electrical circuits for triggering safety arrangements in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60R 11/00 - Arrangements for holding or mounting articles, not otherwise provided for
54.
AGENT TRAJECTORY PREDICTION USING TARGET LOCATIONS
Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for predicting future trajectories for an agent in an environment. A system obtains scene context data characterizing the environment. The scene context data includes data that characterizes a trajectory of an agent in a vicinity of a vehicle in an environment up to a current time point. The system identifies a plurality of initial target locations in the environment. The system further generates, for each of a plurality of target locations that each corresponds to one of the initial target locations, a respective predicted likelihood score that represents a likelihood that the target location will be an intended final location for a future trajectory of the agent starting from the current time point. For each target location in a first subset of the target locations, the system generates a predicted future trajectory for the agent that is a prediction of the future trajectory of the agent given that the target location is the intended final location for the future trajectory. The system further selects, as likely future trajectories of the agent starting from the current time point, one or more of the predicted future trajectories.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
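The two-stage structure of the method above (score candidate targets, then decode one trajectory per retained target) can be sketched compactly. The scoring, decoding, and proposal functions below are stand-ins for the learned components, and keeping the top-k targets is an illustrative selection rule.

```python
def predict_agent_trajectories(scene_context, score_target, decode_trajectory,
                               propose_targets, top_k=6):
    targets = propose_targets(scene_context)          # initial target locations
    scored = [(score_target(scene_context, t), t) for t in targets]
    scored.sort(key=lambda st: st[0], reverse=True)   # likelihood of being the
    subset = [t for _, t in scored[:top_k]]           # intended final location
    # One predicted future trajectory per retained target, conditioned on it.
    return [decode_trajectory(scene_context, t) for t in subset]

trajectories = predict_agent_trajectories(
    scene_context={"agent": "cyclist"},
    score_target=lambda ctx, t: -abs(t[0]) - abs(t[1]),  # toy likelihood score
    decode_trajectory=lambda ctx, t: [(0, 0), t],        # straight line to target
    propose_targets=lambda ctx: [(5, 0), (0, 5), (20, 20)],
    top_k=2)
print(trajectories)  # two candidates toward the two best-scored targets
```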
55.
PROCESSING PERSPECTIVE VIEW RANGE IMAGES USING NEURAL NETWORKS
Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for processing a perspective view range image generated from sensor measurements of an environment. The perspective view range image includes a plurality of pixels arranged in a two-dimensional grid and including, for each pixel, (i) features of one or more sensor measurements at a location in the environment corresponding to the pixel and (ii) geometry information comprising range features characterizing a range of the location in the environment corresponding to the pixel relative to the one or more sensors. The system processes the perspective view range image using a first neural network to generate an output feature representation. The first neural network comprises a first perspective point-set aggregation layer comprising a geometry-dependent kernel.
A sensor module comprising a housing defining an internal cavity, the housing including an aperture, at least one microphone positioned in the internal cavity spaced from the aperture, a first barrier proximate the aperture, and a second barrier positioned between the at least one microphone and the first barrier.
H04R 1/28 - Transducer mountings or enclosures designed for specific frequency response; Transducer enclosures modified by provision of mechanical or acoustic impedances, e.g. resonator, damping means
57.
DETECTING TRAFFIC SIGNALING STATES WITH NEURAL NETWORKS
Machine-learning models are described for detecting the signaling state of a traffic signaling unit. A system can obtain an image of the traffic signaling unit, and select a model of the traffic signaling unit that identifies a position of each traffic lighting element on the unit. First and second neural network inputs are processed with a neural network to generate an estimated signaling state of the traffic signaling unit. The first neural network input can represent the image of the traffic signaling unit, and the second neural network input can represent the model of the traffic signaling unit. Using the estimated signaling state of the traffic signaling unit, the system can inform a driving decision of a vehicle.
The present disclosure relates to transmitter modules, vehicles, and methods associated with lidar sensors. An example transmitter module could include a light-emitter die and a plurality of light-emitter devices coupled to the light-emitter die. Each light-emitter of the plurality of light-emitter devices is configured to emit light from a respective emitter surface. The transmitter module also includes a cylindrical lens optically coupled to the plurality of light-emitter devices and arranged along an axis. The light-emitter die is disposed such that the respective emitter surfaces of the plurality of light-emitter devices form a non-zero yaw angle with respect to the axis.
Example embodiments relate to selective deactivation of light emitters for interference mitigation in light detection and ranging (lidar) devices. An example method includes deactivating one or more light emitters within a lidar device during a firing cycle. The method also includes identifying whether interference is influencing measurements made by the lidar device. Identifying whether interference is influencing measurements made by the lidar device includes determining, for each light detector of the lidar device that is associated with the one or more light emitters deactivated during the firing cycle, whether a light signal was detected during the firing cycle.
The technology uses ground truth data and a deep learning model (1008) to classify road wetness and/or to perform a regression analysis on road wetness based on a set of input information. Such information includes on-board (802) and/or off-board signals (804) obtained from one or more sources including on-board perception sensors, other on-board modules, external weather measurements, external weather services, etc. The ground truth includes measurements of water film thickness and/or ice coverage on road surfaces. The ground truth and the on-board and off-board signals are used to build the model. The constructed model can be deployed in autonomous vehicles for classifying/regressing (1016) the road wetness with on-board and/or off-board signals as the input, without referring to the ground truth. The model can be applied in a variety of ways to enhance autonomous vehicle operation, for instance by altering current driving actions, modifying planned routes or trajectories, activating on-board cleaning systems, etc.
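The train/deploy split described above (ground truth used only to build the model, live signals alone at inference) can be sketched minimally. The single-signal threshold model and the 0.1 mm wet-film cutoff below are invented stand-ins for the deep learning model.

```python
def train_wetness_model(signal_values, film_thickness_mm, wet_film_mm=0.1):
    """Fit the signal threshold that best separates wet from dry examples,
    using ground-truth water film thickness only at training time."""
    labels = [film > wet_film_mm for film in film_thickness_mm]
    best_thresh, best_errors = 0.0, len(labels) + 1
    for thresh in signal_values:
        errors = sum((s >= thresh) != wet for s, wet in zip(signal_values, labels))
        if errors < best_errors:
            best_thresh, best_errors = thresh, errors
    return best_thresh

def classify_road_wetness(signal_value, model_threshold):
    # Deployed path: no ground truth needed, only the live signal.
    return "wet" if signal_value >= model_threshold else "dry"

thresh = train_wetness_model([0.1, 0.2, 0.8, 0.9], [0.0, 0.05, 0.4, 0.6])
print(classify_road_wetness(0.85, thresh))  # 'wet'
```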
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using a surfel map to generate a prediction for a state of an environment. One of the methods includes obtaining surfel data comprising a plurality of surfels, wherein each surfel corresponds to a respective different location in an environment, and each surfel has associated data that comprises an uncertainty measure; obtaining sensor data for one or more locations in the environment, the sensor data having been captured by one or more sensors of a first vehicle; determining one or more particular surfels corresponding to respective locations of the obtained sensor data; and combining the surfel data and the sensor data to generate a respective object prediction for each of the one or more locations of the obtained sensor data.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
G05D 1/02 - Control of position or course in two dimensions
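A minimal sketch of one plausible way to combine a surfel's associated data with live sensor evidence, assuming the surfel's uncertainty measure is expressed as a prior probability and the sensor supplies a likelihood; the Bayesian-style update below is an assumption, not the patent's formula.

```python
def fuse(surfel_prior, sensor_likelihood):
    """Combine map prior and live sensor evidence into an object prediction."""
    num = surfel_prior * sensor_likelihood
    den = num + (1.0 - surfel_prior) * (1.0 - sensor_likelihood)
    return num / den if den else surfel_prior

# A high-uncertainty surfel (prior 0.5) defers to the sensor, while a
# confident surfel (prior 0.95) dominates weak sensor evidence.
print(fuse(0.5, 0.9))   # ~0.90
print(fuse(0.95, 0.4))  # ~0.93
```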
62.
ARBITRATING FRICTION AND REGENERATIVE BRAKING FOR AUTONOMOUS VEHICLES
Aspects of the disclosure provide for a method of controlling a vehicle (100) in an autonomous driving mode. For instance, the method may include receiving, by one or more processors of a brake controller (230) of the vehicle, a braking profile for a trajectory for the vehicle to follow into the future. The brake controller may determine whether to use one or both of regenerative and friction braking based on the braking profile. The vehicle is controlled according to the braking profile based on the determination of whether to use one or both of regenerative and friction braking.
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
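One plausible arbitration rule consistent with the abstract above is to serve the braking profile with regenerative braking up to a torque limit and cover the remainder with friction braking. The limit and the battery-full guard below are invented for illustration.

```python
def arbitrate_braking(requested_decel, regen_limit=2.0, battery_full=False):
    """requested_decel: deceleration requested by the braking profile, m/s^2."""
    if battery_full:  # regen is unavailable when the battery cannot store energy
        return {"regen": 0.0, "friction": requested_decel}
    regen = min(requested_decel, regen_limit)
    return {"regen": regen, "friction": requested_decel - regen}

print(arbitrate_braking(1.5))                      # regenerative only
print(arbitrate_braking(3.5))                      # blended
print(arbitrate_braking(1.5, battery_full=True))   # friction only
```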
63.
DETERMINING PUDDLE SEVERITY FOR AUTONOMOUS VEHICLES
Aspects of the disclosure provide methods for controlling a first vehicle (100) having an autonomous driving mode. In one instance, sensor data generated by one or more sensors of the first vehicle may be received. A splash and characteristics of the splash may be detected from the sensor data using a classifier. A severity of a puddle (680), (682) may be determined based on the characteristics of the splash and a speed of a second vehicle (670), (672) that caused the splash. The first vehicle may be controlled based on the severity. In another instance, a location of a puddle relative to a tire (870) of a second vehicle is estimated using sensor data generated by one or more sensors of the first vehicle. A severity of the puddle may be determined based on the estimated location. The first vehicle may be controlled based on the severity.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
64.
CROSSTALK REDUCTION FOR LIGHT DETECTION AND RANGING (LIDAR) DEVICES USING WAVELENGTH LOCKING
Example embodiments relate to crosstalk reduction for light detection and ranging (lidar) devices using wavelength locking. An example embodiment includes a lidar device. The lidar device includes a first light emitter configured to emit a first light signal and a second light emitter configured to emit a second light signal. The lidar device also includes a first light guide and a second light guide. In addition, the lidar device includes a first light detector and a second light detector. Further, the lidar device includes a first wavelength-locking mechanism configured to use a portion of the first light signal to maintain a wavelength of the first light signal and a second wavelength-locking mechanism configured to use a portion of the second light signal to maintain a wavelength of the second light signal. The wavelengths of the first light signal and the second light signal are different.
An article including an optically transparent, superomniphobic coating that is durable and relatively easy to keep clean is disclosed. In one aspect, the present disclosure provides an article comprising a substrate and a graded layer, the graded layer having a first side disposed adjacent the substrate, the first side comprising 45-85 wt.% silicon oxide in a first glass phase and 10-40 wt.% boron oxide in a second glass phase, and, opposite the first side, a second side comprising at least 45 wt.% silicon oxide, no more than 5 wt.% boron oxide, and 10-50 wt.% aerogel, the aerogel present in the graded layer as a plurality of distinct domains.
C03C 17/34 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating with at least two coatings having different compositions
C03C 17/02 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating with glass
C03C 17/00 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating
C03C 8/02 - Frit compositions, i.e. in a powdered or comminuted form
C03C 8/14 - Glass frit mixtures having non-frit additions, e.g. opacifiers, colorants, mill additions
C03C 3/089 - Glass compositions containing silica with 40% to 90% silica by weight containing boron
A method for preparing an optically transparent, superomniphobic coating on a substrate, such as an optical substrate, is disclosed. The method includes providing a glass layer disposed on a substrate, the glass layer having a first side adjacent the substrate and an opposed second side, the glass layer comprising 45-85 wt.% silicon oxide in a first glass phase and 10-40 wt.% boron oxide in a second glass phase, such that a glass layer has a composition in a spinodal decomposition region. The method further includes heating the second side of the glass layer to form a phase-separated portion of the layer, the phase-separated portion comprising an interpenetrating network of silicon oxide domains and boron oxide domains, and removing at least a portion of the boron oxide domains from the phase-separated portion to provide a graded layer disposed on the substrate. The graded layer has a first side disposed adjacent the substrate, the first side comprising 45-85 wt.% silicon oxide and 10-40 wt.% boron oxide, and opposite the first side, a porous second side comprising at least 45 wt.% silicon oxide and no more than 5 wt.% boron oxide.
C03C 17/02 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating with glass
C03C 17/34 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating with at least two coatings having different compositions
C03C 17/00 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating
C03C 17/22 - Surface treatment of glass, e.g. of devitrified glass, not in the form of fibres or filaments, by coating with other inorganic material
C03C 8/00 - Enamels; Glazes; Fusion seal compositions being frit compositions having non-frit additions
C03C 3/089 - Glass compositions containing silica with 40% to 90% silica by weight containing boron
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting occupancies of agents. One of the methods includes obtaining scene data characterizing a current scene in an environment; and processing a neural network input comprising the scene data using a neural network to generate a neural network output, wherein: the neural network output comprises respective occupancy outputs corresponding to a plurality of agent types at one or more future time points; the occupancy output for each agent type at a first future time point comprises respective occupancy probabilities for a plurality of locations in the environment; and in the occupancy output for each agent type at the first future time point, the respective occupancy probability for each location characterizes a likelihood that an agent of the agent type will occupy the location at the first future time point.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
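The shape of the occupancy output described above can be made concrete with a small tensor sketch; the agent types, grid size, and number of future time points are illustrative assumptions.

```python
import numpy as np

AGENT_TYPES = ["vehicle", "pedestrian", "cyclist"]
FUTURE_STEPS = 4          # future time points
GRID_H, GRID_W = 64, 64   # discretized environment locations

# occupancy[t, a, y, x] = probability that an agent of type a will occupy
# location (y, x) at future time point t (random values stand in for the
# neural network output here).
occupancy = np.random.rand(FUTURE_STEPS, len(AGENT_TYPES), GRID_H, GRID_W)

def occupancy_probability(t, agent_type, y, x):
    return occupancy[t, AGENT_TYPES.index(agent_type), y, x]

print(occupancy_probability(0, "pedestrian", 10, 12))
```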
68.
FATIGUE MONITORING SYSTEM FOR DRIVERS TASKED WITH MONITORING A VEHICLE OPERATING IN AN AUTONOMOUS DRIVING MODE
Aspects of the disclosure relate to models for estimating the likelihood of fatigue in test drivers. In some instances, training data including videos of the test drivers while such test drivers are tasked with monitoring driving of a vehicle (100) operating in an autonomous driving mode may be identified. The training data also includes driver drowsiness values generated from one or more human operators observing the videos. The training inputs and outputs may be used to train the model such that when a new video of a first test driver is input into the model, the model will output an estimate of a likelihood of fatigue for that test driver.
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G05D 1/02 - Control of position or course in two dimensions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
69.
TESTING SITUATIONAL AWARENESS OF DRIVERS TASKED WITH MONITORING A VEHICLE OPERATING IN AN AUTONOMOUS DRIVING MODE
Aspects of the disclosure relate to testing situational awareness of a test driver tasked with monitoring the driving of a vehicle (100) operating in an autonomous driving mode. For instance, a signal that indicates that the test driver may be distracted may be identified. Based on the signal, it may be determined that a question can be asked of the test driver. A plurality of factors relating to a driving context for the vehicle may be identified. Based on the determination, a question may be generated based on the plurality of factors. The question may be provided to the test driver. Input may be received from the test driver providing an answer to the question.
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60R 21/0134 - Electrical circuits for triggering safety arrangements in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
70.
MONITORING HEAD MOVEMENTS OF DRIVERS TASKED WITH MONITORING A VEHICLE OPERATING IN AN AUTONOMOUS DRIVING MODE
Aspects of the disclosure relate to analyzing head movements in a test driver tasked with monitoring the driving of a vehicle (100) operating in an autonomous driving mode. For instance, a sensor may be used to capture sensor data of a test driver's (730) head (740) for a period of time. The sensor data may be analyzed to determine whether the test driver's head moved sufficiently to suggest that the test driver is engaged in monitoring the driving of the vehicle. Based on the determination of whether the test driver's head moved sufficiently, an intervention response may be initiated.
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60R 21/0134 - Electrical circuits for triggering safety arrangements in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle
An example method involves identifying one or more potential route segments that collectively connect at least two geographical points, receiving spatiotemporal weather information that predicts future weather conditions along each of the potential segments, and, for each potential segment, evaluating a partial cost function that comprises a summation of a set of segment-weighted cost factors, where at least one segment-weighted cost factor comprises an adverse weather risk factor based on the future weather conditions along the potential segment. The method also involves selecting, based on a minimization of a total cost function, a set of selected segments and corresponding segment target speeds for the vehicle to utilize while traversing between the at least two geographical points so as to avoid adverse weather conditions, the total cost function being the sum of partial cost functions associated with a set of segments that collectively connect the at least two geographical points.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
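The cost structure described above (a partial cost per segment formed as a sum of segment-weighted factors, minimized over candidate routes) can be sketched as follows; the factor names, weights, and exhaustive search over pre-enumerated routes are illustrative simplifications of a real route planner.

```python
def partial_cost(segment, weights):
    factors = {
        "travel_time": segment["length_m"] / segment["target_speed"],
        "weather_risk": segment["weather_risk"],  # from spatiotemporal forecast
    }
    return sum(weights[name] * value for name, value in factors.items())

def best_route(candidate_routes, weights):
    """candidate_routes: lists of segments that each connect the two points."""
    return min(candidate_routes,
               key=lambda route: sum(partial_cost(s, weights) for s in route))

weights = {"travel_time": 1.0, "weather_risk": 500.0}
fast_but_stormy = [{"length_m": 5000, "target_speed": 20.0, "weather_risk": 0.8}]
longer_but_dry = [{"length_m": 7000, "target_speed": 15.0, "weather_risk": 0.1}]
print(best_route([fast_but_stormy, longer_but_dry], weights) is longer_but_dry)  # True
```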
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a machine learning model to perform a machine learning task by processing input data to the model. For example, the input data can include image, video, or point cloud data, and the task can be a perception task such as a classification or detection task. In one aspect, the method includes receiving training data including a plurality of training inputs; receiving a plurality of data augmentation policy parameters that define different transformation operations for transforming training inputs before the training inputs are used to train the machine learning model; maintaining a plurality of candidate machine learning models; for each of the plurality of candidate machine learning models: repeatedly determining an augmented batch of training data; training the candidate machine learning model using the augmented batch of the training data; and updating the maintained data.
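A hedged sketch of the augmented-batch training loop described above, with a stochastic transformation policy applied per example; the policy format and the model-update callable are assumptions, not the document's API.

```python
import random

def augment(batch, policy):
    """policy: list of (transform, probability) pairs, applied stochastically
    to each training example before it is used for training."""
    out = []
    for example in batch:
        for transform, prob in policy:
            if random.random() < prob:
                example = transform(example)
        out.append(example)
    return out

def train_candidates(candidates, data, policies, steps, sample_batch, update):
    """Each candidate model trains on batches transformed by its own policy."""
    for model, policy in zip(candidates, policies):
        for _ in range(steps):
            augmented = augment(sample_batch(data), policy)
            update(model, augmented)  # one training step on the augmented batch
    return candidates
```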
Example embodiments relate to calibration and localization of a light detection and ranging (lidar) device using a previously calibrated and localized lidar device. An example embodiment includes a method. The method includes receiving, by a computing device associated with a beneficiary vehicle, a first point cloud captured by a first lidar device of a first unit. The first point cloud includes points representing the beneficiary vehicle. The method also includes receiving, by the computing device, pose information indicative of a pose of the first unit. In addition, the method includes capturing, using a second lidar device of the beneficiary vehicle, a second point cloud. Further, the method includes receiving, by the computing device, a third point cloud representing the first unit. Yet further, the method includes calibrating and localizing, by the computing device, the second lidar device.
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
Computing devices, systems, and methods described in various embodiments herein may relate to light detection and ranging (LIDAR or lidar) systems. An example computing device could include a controller having at least one processor and at least one memory. The at least one processor is configured to execute program instructions stored in the at least one memory so as to carry out operations. The operations include receiving information indicative of transmit light emitted from a lidar system along a light-emission axis. The operations also include determining, based on the received information, a maximum instrumented distance. The maximum instrumented distance includes a known unobstructed region defined by a ray segment extending between the lidar system and a point along the light-emission axis.
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/08 - Systems determining position data of a target for measuring distance only
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
Aspects of the disclosure relate to systems for cleaning a sensor (280, 282, 284, 300). For example, the sensor may include a housing (310) as well as internal sensor components (320) housed within the housing. The housing may include a sensor input surface (330) through which signals may pass. The system may also include a motor (340) configured to rotate the internal sensor components relative to a mount (350) as well as the mount to which the motor is fixed. The system may also include a wiper (380) including a wiper blade (382). The wiper may be attached to the mount such that rotating the housing causes the wiper to contact the sensor input surface in order to clean the sensor.
B60S 1/60 - Cleaning windscreens, windows, or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens for signalling devices, e.g. reflectors
B60S 1/56 - Cleaning windscreens, windows, or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for selecting actions for an agent at a specific real-world location using historical data generated at the same real-world location. One of the methods includes determining a current geolocation of an agent within an environment; obtaining historical data for geolocations in a vicinity of the current geolocation of the agent from a database that maintains historical data for a plurality of geolocations within the environment, the historical data for each geolocation comprising observations generated at least in part from sensor readings of the geolocation captured by vehicles navigating through the environment; generating an embedding of the obtained historical data; and providing the embedding as an input to a policy decision-making system that selects actions to be performed by the agent.
Example embodiments relate to slanted radomes for protecting radar units. An example radar system may include a radar unit that includes at least one antenna having a radiation pattern. The radar unit is configured to transmit a radar signal based on the radiation pattern and receive radar signals. In addition, the radar system further includes a radome located in a direction of transmission of the radiation pattern. Particularly, the radome is aligned at an angle relative to a plane of the at least one antenna such that reflections of the transmitted radar signal caused by the radome are directed towards at least one of a null of the radiation pattern and an absorption component.
G01S 7/03 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00 - Details of HF subsystems specially adapted therefor, e.g. common to transmitter and receiver
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
H01Q 1/42 - Housings not intimately mechanically associated with radiating elements, e.g. radome
H01Q 1/32 - Adaptation for use in or on road or rail vehicles
The technology relates to an exterior sensor system for a vehicle configured to operate in an autonomous driving mode. The technology includes a close-in sensing (CIS) camera system to address blind spots around the vehicle. The CIS system is used to detect objects within a few meters of the vehicle. Based on object classification, the system is able to make real-time driving decisions. Classification is enhanced by employing cameras in conjunction with lidar sensors (800, 1804). The specific arrangement of multiple sensors in a single sensor housing is also important to object detection and classification. Thus, the positioning of the sensors and support components is selected to avoid occlusion and to otherwise prevent interference between the various sensor housing elements.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
Example embodiments relate to identification of proxy calibration targets for a fleet of sensors. An example method includes collecting, using a sensor coupled to a vehicle, data about one or more objects within an environment of the vehicle. The sensor has been calibrated using a ground-truth calibration target. The method also includes identifying, based on the collected data, at least one candidate object, from among the one or more objects, to be used as a proxy calibration target for other sensors coupled to vehicles within a fleet of vehicles. Further, the method includes providing, by the vehicle, data about the candidate object for use by one or more vehicles within the fleet of vehicles.
Example embodiments described herein involve techniques for orthogonal Doppler coding for a radar system. An example method may involve causing, by a computing system coupled to a vehicle, a radar unit to transmit a plurality of radar signals into an environment of the vehicle using a two-dimensional (2D) transmission antenna array, wherein the radar unit is configured to use time division multiple access (TDMA) to isolate transmit channels along a horizontal direction of the 2D transmission antenna array and Doppler coding to isolate transmit channels along a vertical direction of the 2D transmission antenna array. The method may further involve receiving, by the computing system and from the radar unit, radar reflections corresponding to the plurality of radar signals, determining information representative of the environment based on the radar reflections, and providing control instructions to the vehicle based on the information representative of the environment.
G01S 13/53 - Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves based upon the phase or frequency shift resulting from movement of objects, with reference to the transmitted signals, e.g. coherent MTi performing filtering on a single spectral line and associated with one or more range gates with a phase detector or a frequency mixer to extract the Doppler information, e.g. pulse Doppler radar
G01S 13/89 - Radar or analogous systems, specially adapted for specific applications for mapping or imaging
G01S 7/28 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00 - Details of pulse systems
G01S 13/28 - Systems for measuring distance only using transmission of interrupted, pulse modulated waves wherein the transmitted pulses use a frequency- or phase-modulated carrier wave with time compression of received pulses
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
The technology relates to an exterior sensor system (232) for a vehicle configured to operate in an autonomous driving mode. The technology includes a close-in sensing (CIS) camera system (1404) to address blind spots around the vehicle. The CIS system is used to detect objects within a few meters of the vehicle (1808). Based on object classification, the system is able to make real-time driving decisions (1810, 1812). Classification is enhanced by employing cameras in conjunction with lidar sensors (700). The specific arrangement of multiple sensors in a single sensor housing is also important to object detection and classification. Thus, the positioning of the sensors and support components is selected to avoid occlusion and to otherwise prevent interference between the various sensor housing elements (712, 1414).
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60R 21/0134 - Electrical circuits for triggering safety arrangements in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
H04N 5/335 - Transforming light or analogous information into electric information using solid-state image sensors [SSIS]
A method includes applying, by a switching circuit, pulses of an input voltage to an input of an inductor. The method includes charging, in accordance with an off state of a switch, a charge storage device at a circuit node through the inductor using the pulses of the input voltage such that the circuit node develops a charge voltage that is greater than the input voltage. The method includes discharging, in accordance with an on state of the switch, the charge storage device such that a first portion of the charge voltage is applied to a light emitter and a second portion of the charge voltage is applied to a parasitic inductance. The method includes controlling, by a controller, a timing of the pulses of the input voltage applied by the switching circuit based on the parasitic inductance from a previous charging cycle of the charge storage device, so as to control the charge voltage.
H03K 17/969 - Switches controlled by moving an element forming part of the switch using opto-electronic devices having a plurality of control members, e.g. keyboard
H03K 17/0412 - Modifications for accelerating switching without feedback from the output circuit to the control circuit by measures taken in the control circuit
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
83.
SYSTEMS AND METHODS FOR CONTACT IMMERSION LITHOGRAPHY
The present application relates to contact immersion lithography exposure units and methods of their use. An example contact exposure unit includes a container configured to contain a fluid material and a substrate disposed within the container. The substrate has a first surface and a second surface, and the substrate includes a photoresist material on at least the first surface. The contact exposure unit includes a photomask disposed within the container. The photomask is optically coupled to the photoresist material by way of a gap comprising the fluid material. The contact exposure unit also includes an inflatable balloon configured to be controllably inflated so as to apply a desired force to the second surface of the substrate to controllably adjust the gap between the photomask and the photoresist material.
Disclosed are systems and methods that can be used for adjusting the field of view of one or more sensors of an autonomous vehicle. In the systems and methods, each sensor of the one or more sensors is configured to operate in accordance with a field of view volume up to a maximum field of view volume. The systems and methods include determining an operating environment of an autonomous vehicle. The systems and methods also include based on the determined operating environment of the autonomous vehicle, adjusting a field of view volume of at least one sensor of the one or more sensors from a first field of view volume to an adjusted field of view volume different from the first field of view volume. Additionally, the systems and methods include controlling the autonomous vehicle to operate using the at least one sensor having the adjusted field of view volume.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
85.
CONDITIONAL BEHAVIOR PREDICTION FOR AUTONOMOUS VEHICLES
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for conditional behavior prediction for agents in an environment. Conditional behavior predictions are made for agents navigating through the same environment as an autonomous vehicle that are conditioned on a planned future trajectory for the autonomous vehicle, e.g., as generated by a planning system of the autonomous vehicle.
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
G05D 1/02 - Control of position or course in two dimensions
The present disclosure relates to systems and methods for occlusion detection. One example method involves a light detection and ranging (LIDAR) device scanning at least a portion of an external structure within a field-of-view (FOV) of the LIDAR device. The LIDAR device is physically coupled to the external structure. The scanning comprises transmitting light pulses toward the external structure through an optical window, and receiving reflected light pulses through the optical window. The reflected light pulses comprise reflections of the transmitted light pulses returning back to the LIDAR device from the external structure. The method also involves detecting presence of an occlusion that at least partially occludes the LIDAR device from scanning the FOV, based at least on the scan of the portion of the external structure.
The present disclosure relates to devices, lidar systems, and vehicles that include optical redirectors. An example lidar system includes a transmitter and a receiver. The transmitter includes at least one light-emitter device configured to transmit emission light into an environment of the lidar system. The receiver is configured to detect return light from the environment and includes a plurality of apertures, a plurality of photodetectors, and a plurality of optical redirector elements. Each optical redirector element is configured to optically couple a respective portion of return light from a respective aperture to at least one photodetector of the plurality of photodetectors.
The present disclosure relates to systems and methods that utilize an internal optical path in lidar applications. An example method includes causing at least one light-emitter device to emit a plurality of light pulses toward a rotatable mirror. The rotatable mirror (i) reflects at least a first light pulse of the plurality of light pulses into an external environment; and (ii) reflects at least a second light pulse of the plurality of light pulses into an internal optical path. The method also includes receiving, by a photodetector, (i) a reflected light pulse comprising a reflection of the first light pulse caused by an object in the external environment; and (ii) the second light pulse received via the internal optical path. The internal optical path is defined at least in part by one or more internal reflectors that reflect the second light pulse toward the rotatable mirror such that the rotatable mirror reflects the second light pulse toward the photodetector.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
G01S 7/4865 - Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
Example embodiments relate to microlensing for real-time sensing of stray light. An example device includes an image sensor that includes a plurality of light-sensitive pixels. The device also includes a first lens positioned over a first subset of light-sensitive pixels selected from the plurality of light-sensitive pixels. Further, the device includes a controller. The controller is configured to determine a first angle of incidence of a first light signal detected by the first subset of light-sensitive pixels. The controller is also configured to, based on the determined first angle of incidence, determine an amount of stray light incident on the image sensor.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
An example method includes receiving, from one or more sensors associated with an autonomous vehicle, sensor data associated with a target object in an environment of the vehicle during a first environmental condition, where at least one sensor of the one or more sensors is configurable to be associated with one of a plurality of operating field of view volumes. The method also includes, based on the sensor data, determining at least one parameter associated with the target object. The method also includes determining a degradation in the at least one parameter between the sensor data and past sensor data, where the past sensor data is associated with the target object in the environment during a second environmental condition different from the first, and, based on the degradation, adjusting the operating field of view volume of the at least one sensor to a different one of the operating field of view volumes.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 30/08 - Predicting or avoiding probable or impending collision
G05D 1/02 - Control of position or course in two dimensions
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
The present disclosure relates to systems and devices having a rotatable mirror assembly. An example system includes a housing and a rotatable mirror assembly. The rotatable mirror assembly includes a plurality of reflective surfaces, a shaft defining a rotational axis, and a mirror body coupling the plurality of reflective surfaces to the shaft. The mirror body includes a plurality of flexible support members. The rotatable mirror assembly also includes a coupling bracket configured to removably couple the rotatable mirror assembly to the housing. The system also includes a transmitter configured to emit emission light into an environment of the system after interacting with at least one reflective surface of the plurality of reflective surfaces. The system additionally includes a receiver configured to detect return light from the environment after interacting with the at least one reflective surface of the plurality of reflective surfaces.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
G01D 5/12 - Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means
The present disclosure relates to systems and methods for occlusion detection. An example system includes a primary reflective surface and a rotatable mirror configured to rotate about a rotational axis. The rotatable mirror includes a plurality of secondary reflective surfaces. The system also includes an optical element and a camera that is configured to capture at least one image of the optical element by way of the primary reflective surface and at least one secondary reflective surface of the rotatable mirror.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
93.
ANTENNA STRUCTURE FOR REDUCING BEAM SQUINT AND SIDELOBES
An example radar system includes a transmission array and a reception array, each aligned as a linear array. The radar system also includes a transmitter configured to cause transmission of radar signals having a center frequency by the transmission array. The radar system also includes a receiver configured to receive radar signals having the center frequency that are received by the reception array. The radar system also includes a processor configured to process received radar signals from the receiver, and adjust the center frequency from a first center frequency to a second center frequency. The adjusting of the center frequency from the first center frequency to the second center frequency causes the frequency-dependent transmission radiation pattern of the transmission array to tilt in a first direction and the frequency-dependent reception radiation pattern of the reception array to tilt in an opposite direction from the first direction.
G01S 7/03 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00 - Details of HF subsystems specially adapted therefor, e.g. common to transmitter and receiver
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating vehicle intent predictions using a neural network. One of the methods includes obtaining an input characterizing one or more vehicles in an environment; generating, from the input, features of each of the vehicles; and for each of the vehicles: processing the features of the vehicle using each of a plurality of intent-specific neural networks, wherein each of the intent-specific neural networks corresponds to a respective intent from a set of intents, and wherein each intent-specific neural network is configured to process the features of the vehicle to generate an output for the corresponding intent.
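The per-intent fan-out described above can be sketched compactly: the same per-vehicle features are processed by one head per intent. The intent set and the toy linear heads below are invented placeholders for the trained intent-specific neural networks.

```python
INTENTS = ["go_straight", "turn_left", "turn_right", "stop"]

def make_intent_head(seed):
    # Stand-in for a trained intent-specific neural network.
    return lambda features: sum(f * ((seed + i) % 3 + 1)
                                for i, f in enumerate(features))

INTENT_HEADS = {intent: make_intent_head(i) for i, intent in enumerate(INTENTS)}

def predict_intents(vehicle_features):
    """Run every intent-specific head on the same per-vehicle features."""
    return {intent: head(vehicle_features)
            for intent, head in INTENT_HEADS.items()}

print(predict_intents([0.2, 0.7, 0.1]))  # one output per intent in the set
```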
A method and a radar system are provided in the present disclosure. The radar system includes a radar unit having an antenna array configured to transmit and receive radar signals, and a memory configured to store radar calibration parameters and radar channel parameters corresponding to the radar unit. The method provides for operation of the radar system. The radar system also includes a radar processor. The radar processor is configured to cause transmission of radar signals by the antenna array based on the radar channel parameters. The radar processor is also configured to process received radar signals based on the radar calibration parameters. The radar system further includes a central vehicle controller configured to operate a vehicle based on the processed radar signals.
G01S 7/00 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G05D 1/02 - Control of position or course in two dimensions
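A hedged sketch of how the stored per-unit parameters in the abstract above might flow through processing: channel parameters drive transmission, calibration parameters correct the received samples. The field names and the simple per-channel complex-gain correction are assumptions, not the disclosed scheme.

from dataclasses import dataclass
import numpy as np

@dataclass
class RadarChannelParams:
    center_freq_hz: float
    chirp_bandwidth_hz: float

@dataclass
class RadarCalibrationParams:
    # One complex gain/phase correction per receive channel.
    channel_gains: np.ndarray

def process_received(samples: np.ndarray,
                     cal: RadarCalibrationParams) -> np.ndarray:
    """Apply per-channel calibration to raw samples of shape
    (n_channels, n_samples) before downstream detection."""
    return samples * cal.channel_gains[:, None]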
The present disclosure relates to devices, light detection and ranging (lidar) systems, and vehicles involving solid-state, single photon detectors. An example device includes a substrate defining a primary plane and a plurality of photodetector cells disposed along the primary plane. The plurality of photodetector cells includes at least one large-area cell and at least one small-area cell. The large-area cell has a first area and the small-area cell has a second area, the first area being greater than the second area. The device also includes readout circuitry coupled to the plurality of photodetector cells. The readout circuitry is configured to provide an output signal based on incident light detected by the plurality of photodetector cells.
H01L 31/107 - Devices sensitive to infrared, visible or ultraviolet radiation characterised by only one potential barrier or surface barrier the potential barrier working in avalanche mode, e.g. avalanche photodiode
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
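One plausible readout policy for the mixed cell sizes described above, written as a sketch: use the sensitive large-area cell until it saturates, then fall back to the small-area cell scaled by the area ratio. The saturation threshold and the dynamic-range motive are assumptions, not statements about the disclosed circuit.

def combined_output(large_cell_counts: float,
                    small_cell_counts: float,
                    area_ratio: float = 10.0,
                    large_cell_saturation: float = 1000.0) -> float:
    if large_cell_counts < large_cell_saturation:
        return large_cell_counts
    # Small cell sees ~1/area_ratio of the photons; rescale its reading.
    return small_cell_counts * area_ratio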
An improved, efficient method for mapping world points from an environment (e.g., points generated by a LIDAR sensor of an autonomous vehicle) to locations (e.g., pixels) within rolling-shutter images taken of the environment is provided. This improved method allows for accurate localization of the world point in a rolling-shutter image via an iterative process that converges in very few iterations. The method poses the localization process as an iterative process for determining the time, within the rolling-shutter exposure period of the image, at which the world point was imaged by the camera. The method reduces the number of times the world point is projected into the normalized space of the camera image, often converging in three or fewer iterations.
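A sketch of the fixed-point iteration the abstract describes: guess the capture time, project the world point using the camera pose at that time, read off the image row, and update the time to that row's readout time. The pose-interpolation, projection, and row-timing helpers are hypothetical stand-ins supplied by the caller.

def localize_in_rolling_shutter(world_point,
                                pose_at,          # time -> camera pose
                                project,          # (pose, point) -> (row, col)
                                row_time,         # row index -> readout time
                                t0: float,
                                max_iters: int = 5,
                                tol: float = 1e-6):
    """Return the (row, col) pixel at which `world_point` is imaged."""
    t = t0
    for _ in range(max_iters):    # typically converges in <= 3 iterations
        row, col = project(pose_at(t), world_point)
        t_new = row_time(row)
        if abs(t_new - t) < tol:
            break
        t = t_new
    return row, col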
Example embodiments relate to pulse energy plans for light detection and ranging (lidar) devices based on areas of interest and thermal budgets. An example lidar device includes a plurality of light emitters configured to emit light pulses into an environment in a plurality of different emission directions. The lidar device also includes circuitry configured to power the plurality of light emitters. Further, the lidar device includes a plurality of detectors configured to detect reflections of light pulses emitted by the plurality of light emitters. In addition, the lidar device includes a controller configured to (i) determine a pulse energy plan based on one or more regions of interest in the environment and a thermal budget and (ii) control the circuitry based on the pulse energy plan. The pulse energy plan specifies a pulse energy level for each light pulse emitted by each light emitter in the plurality of light emitters.
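A sketch of one way a controller could turn regions of interest plus a thermal budget into per-pulse energy levels, as the abstract outlines. The boost factor and the proportional rescaling are illustrative assumptions rather than the claimed plan.

import numpy as np

def pulse_energy_plan(emission_directions: np.ndarray,   # (n,) angles
                      roi_mask: np.ndarray,              # (n,) bool
                      base_energy: float,
                      roi_boost: float,
                      thermal_budget: float) -> np.ndarray:
    energies = np.full(len(emission_directions), base_energy)
    energies[roi_mask] *= roi_boost          # spend more in regions of interest
    total = energies.sum()
    if total > thermal_budget:               # never exceed the thermal budget
        energies *= thermal_budget / total
    return energies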
An example circuit includes a light detector and a biasing capacitor having (i) a first terminal that applies to the light detector an output voltage that can either bias or debias the light detector and (ii) a second terminal for controlling the output voltage. The circuit includes a first transistor connected to the second terminal of the biasing capacitor and configured to drive the output voltage to a first voltage level above a biasing threshold of the light detector, thereby biasing the light detector. The circuit includes a second transistor connected to the second terminal of the biasing capacitor and configured to drive the output voltage to a second voltage level below the biasing threshold of the light detector, thereby debiasing the light detector. The second voltage level is a non-zero voltage that corresponds to a charge level of the biasing capacitor.
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
H02M 3/07 - Conversion of dc power input into dc power output without intermediate conversion into ac by static converters using resistors or capacitors, e.g. potential divider using capacitors charged and discharged alternately by semiconductor devices with control electrode
A light detection and ranging (LiDAR) device includes a light emitter configured to emit light pulses into a field of view and a detector configured to detect light in the field of view. The light emitter emits a first light pulse. The detector detects, during a first measurement period, at least one reflected light pulse that is indicative of reflection by a retroreflector based on a shape of a reflected light pulse, a magnitude of a reflected light pulse, and/or a time separation between two reflected light pulses. In response to detecting the at least one reflected light pulse indicative of reflection by a retroreflector, the light emitter is deactivated for one or more subsequent measurement periods. Additionally, the LiDAR device may inform one or more other LiDAR devices by transmitting to a computing device information indicative of the retroreflector being within the field of view of the light emitter.
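A hedged sketch of the retroreflector test and emitter shut-off described in the abstract above. The thresholds, the dataclass fields, and the shut-off count are assumptions chosen only to make the control flow concrete.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ReturnPulse:
    magnitude: float                      # peak amplitude
    width_ns: float                       # proxy for the pulse shape
    double_return_gap_ns: Optional[float] # separation of paired returns, if any

def looks_like_retroreflector(p: ReturnPulse,
                              mag_thresh: float = 0.9,
                              width_thresh_ns: float = 2.0,
                              gap_thresh_ns: float = 5.0) -> bool:
    # Any of the three cues from the abstract can flag a retroreflector:
    # magnitude, pulse shape, or a closely spaced double return.
    if p.magnitude > mag_thresh:
        return True
    if p.width_ns < width_thresh_ns and p.magnitude > 0.5 * mag_thresh:
        return True
    return (p.double_return_gap_ns is not None
            and p.double_return_gap_ns < gap_thresh_ns)

SKIP_PERIODS = 3   # assumed number of measurement periods to hold off

def next_action(pulse: ReturnPulse) -> int:
    """Return how many subsequent measurement periods to deactivate the
    emitter (0 means keep firing normally)."""
    return SKIP_PERIODS if looks_like_retroreflector(pulse) else 0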