Uatc, LLC

United States of America

1-100 of 597 results for Uatc, LLC
Query: Patent (United States - USPTO)
Date
New (last 4 weeks) 1
2024 March 3
2024 February 4
2024 January 7
2023 December 4
IPC Class
G05D 1/02 - Control of position or course in two dimensions 262
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot 258
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles 69
G01C 21/34 - Route searching; Route guidance 68
G06N 20/00 - Machine learning 68
Status
Pending 113
Registered / In Force 484

1.

Systems and Methods for Latent Distribution Modeling for Scene-Consistent Motion Forecasting

      
Application Number 18519976
Status Pending
Filing Date 2023-11-27
First Publication Date 2024-03-21
Owner UATC, LLC (USA)
Inventor
  • Casas, Sergio
  • Gulino, Cole Christian
  • Suo, Shun Da
  • Luo, Katie Z.
  • Liao, Renjie
  • Urtasun, Raquel

Abstract

A computer-implemented method for determining scene-consistent motion forecasts from sensor data can include obtaining scene data including one or more actor features. The computer-implemented method can include providing the scene data to a latent prior model, the latent prior model configured to generate scene latent data in response to receipt of scene data, the scene latent data including one or more latent variables. The computer-implemented method can include obtaining the scene latent data from the latent prior model. The computer-implemented method can include sampling latent sample data from the scene latent data. The computer-implemented method can include providing the latent sample data to a decoder model, the decoder model configured to decode the latent sample data into a motion forecast including one or more predicted trajectories of the one or more actor features. The computer-implemented method can include receiving the motion forecast including one or more predicted trajectories of the one or more actor features from the decoder model.
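The staged pipeline the abstract describes (latent prior model → latent sample → decoder → per-actor trajectories) can be sketched as toy Python. Every function and constant below is a hypothetical stand-in, not the patented models; the point is only the data flow in which one shared scene-latent sample conditions all actors' forecasts:

```python
import random

random.seed(0)

def latent_prior_model(scene_features):
    # Hypothetical stand-in for the latent prior model: map each actor
    # feature to the parameters of a Gaussian latent variable.
    return [(f * 0.1, 0.5) for f in scene_features]  # (mean, std) pairs

def sample_latents(scene_latents):
    # Draw one latent sample per (mean, std) pair.
    return [random.gauss(mu, sigma) for mu, sigma in scene_latents]

def decoder_model(latent_sample, horizon=3):
    # Decode the latent sample into one trajectory per actor:
    # a list of (x, y) waypoints over the forecast horizon.
    return [[(t * (1.0 + z), t * z) for t in range(1, horizon + 1)]
            for z in latent_sample]

scene_features = [2.0, 5.0]  # one toy feature per observed actor
latents = sample_latents(latent_prior_model(scene_features))
forecast = decoder_model(latents)
print(len(forecast), len(forecast[0]))  # 2 actors, 3 waypoints each
```

Because all actors decode from the same latent sample, drawing several samples yields several internally consistent scene futures rather than independent per-actor guesses.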

IPC Classes

  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/02 - Control of position or course in two dimensions
  • G06F 18/2137 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on criteria of topology preservation, e.g. multidimensional scaling or self-organising maps
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 30/19 - Recognition using electronic means
  • G08G 1/16 - Anti-collision systems

2.

Sparse Convolutional Neural Networks

      
Application Number 18513119
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Ren, Mengye
  • Pokrovsky, Andrei
  • Yang, Bin

Abstract

The present disclosure provides systems and methods that apply neural networks such as, for example, convolutional neural networks, to sparse imagery in an improved manner. For example, the systems and methods of the present disclosure can be included in or otherwise leveraged by an autonomous vehicle. In one example, a computing system can extract one or more relevant portions from imagery, where the relevant portions are less than an entirety of the imagery. The computing system can provide the relevant portions of the imagery to a machine-learned convolutional neural network and receive at least one prediction from the machine-learned convolutional neural network based at least in part on the one or more relevant portions of the imagery. Thus, the computing system can skip performing convolutions over regions of the imagery where the imagery is sparse and/or regions of the imagery that are not relevant to the prediction being sought.
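The skip idea in the abstract can be illustrated with a 1-D toy analogue: convolve only over windows that contain at least one nonzero sample and skip the rest. This is a minimal sketch of the concept, not the disclosed 2-D implementation:

```python
def sparse_conv1d(signal, kernel):
    # Apply a 1-D convolution, but skip windows that are entirely zero
    # (the "sparse" regions where computation would be wasted).
    half = len(kernel) // 2
    out = [0.0] * len(signal)
    for i in range(half, len(signal) - half):
        window = signal[i - half:i + half + 1]
        if not any(window):          # sparse region: skip the convolution
            continue
        out[i] = sum(w * k for w, k in zip(window, kernel))
    return out

signal = [0, 0, 0, 4.0, 0, 0, 0, 0, 2.0, 0]   # mostly empty, like sparse imagery
result = sparse_conv1d(signal, [0.25, 0.5, 0.25])
print(result)
```

On this input, only the windows touching the two nonzero samples are computed; the all-zero windows are never multiplied out, which is the source of the savings on sparse LiDAR-style imagery.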

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

3.

Systems and Methods for Providing a Vehicle Service Via a Transportation Network for Autonomous Vehicles

      
Application Number 18468061
Status Pending
Filing Date 2023-09-15
First Publication Date 2024-03-07
Owner UATC, LLC (USA)
Inventor
  • Woodrow, Alden James
  • Miller, Robert Evan

Abstract

Systems and methods for providing a vehicle service are provided. In one example embodiment, a computer-implemented method includes receiving data indicative of a service request to provide a vehicle service for an entity with respect to one or more cargo items designated for autonomous transport. The method includes obtaining a first cargo item among the one or more cargo items, from a representative of the entity at a dedicated first transfer hub proximate to a first location associated with the first cargo item. The method includes controlling a first autonomous vehicle to transport the first cargo item from the first transfer hub to a dedicated second transfer hub proximate to a second location associated with the first cargo item. The method includes providing the first cargo item to a representative of the entity at the second transfer hub, to provide the vehicle service.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G01C 21/34 - Route searching; Route guidance
  • G05D 1/02 - Control of position or course in two dimensions
  • G06Q 10/047 - Optimisation of routes or paths, e.g. travelling salesman problem
  • G06Q 10/08 - Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
  • G06Q 50/28 - Logistics, e.g. warehousing, loading, distribution or shipping
  • G06Q 50/30 - Transportation; Communications

4.

Vehicle Management System

      
Application Number 18496307
Status Pending
Filing Date 2023-10-27
First Publication Date 2024-02-29
Owner UATC, LLC (USA)
Inventor
  • Poeppel, Scott
  • Letwin, Nicholas G.
  • Kelly, Sean J.

Abstract

Systems, methods, and vehicles for taking a vehicle out-of-service are provided. In one example embodiment, a method includes obtaining, by one or more computing devices on-board an autonomous vehicle, data indicative of one or more parameters associated with the autonomous vehicle. The autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service. The method includes determining, by the computing devices, an existence of a fault associated with the autonomous vehicle based at least in part on the one or more parameters associated with the autonomous vehicle. The method includes determining, by the computing devices, one or more actions to be performed by the autonomous vehicle based at least in part on the existence of the fault. The method includes performing, by the computing devices, one or more of the actions to take the autonomous vehicle out-of-service based at least in part on the fault.

IPC Classes

  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
  • G06Q 10/047 - Optimisation of routes or paths, e.g. travelling salesman problem
  • G06Q 10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
  • G06Q 10/20 - Administration of product repair or maintenance
  • G07C 5/00 - Registering or indicating the working of vehicles

5.

Autonomous Vehicle Compatible Robot

      
Application Number 18495480
Status Pending
Filing Date 2023-10-26
First Publication Date 2024-02-22
Owner UATC, LLC (USA)
Inventor Donnelly, Richard Brian

Abstract

An autonomous robot is provided. In one example embodiment, an autonomous robot can include a main body including one or more compartments. The one or more compartments can be configured to provide support for transporting an item. The autonomous robot can include a mobility assembly affixed to the main body and a sensor configured to obtain sensor data associated with a surrounding environment of the autonomous robot. The autonomous robot can include a computing system configured to plan a motion of the autonomous robot based at least in part on the sensor data. The computing system can be operably connected to the mobility assembly for controlling a motion of the autonomous robot. The autonomous robot can include a coupling assembly configured to temporarily secure the autonomous robot to an autonomous vehicle. The autonomous robot can include a power system and a ventilation system that can interface with the autonomous vehicle.

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

6.

Power and Thermal Management Systems and Methods for Autonomous Vehicles

      
Application Number 18501618
Status Pending
Filing Date 2023-11-03
First Publication Date 2024-02-22
Owner UATC, LLC (USA)
Inventor
  • Rice, David Patrick
  • Boehmke, Scott Klaus

Abstract

Systems and methods for power and thermal management of autonomous vehicles are provided. In one example embodiment, a computing system includes processor(s) and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the processor(s) cause the computing system to perform operations. The operations include obtaining data associated with an autonomous vehicle. The operations include identifying one or more vehicle parameters associated with the autonomous vehicle based at least in part on the data associated with the autonomous vehicle. The operations include determining a modification to one or more operating characteristics of one or more systems onboard the autonomous vehicle based at least in part on the one or more vehicle parameters. The operations include controlling a temperature of at least a portion of the autonomous vehicle via implementation of the modification of the operating characteristic(s) of the system(s) onboard the autonomous vehicle.

IPC Classes

  • G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
  • B60H 1/00 - Heating, cooling or ventilating devices
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

7.

Systems and Methods for Training Probabilistic Object Motion Prediction Models Using Non-Differentiable Prior Knowledge

      
Application Number 18495434
Status Pending
Filing Date 2023-10-26
First Publication Date 2024-02-15
Owner UATC, LLC (USA)
Inventor
  • Casas, Sergio
  • Gulino, Cole Christian
  • Suo, Shun Da
  • Urtasun, Raquel

Abstract

The present disclosure provides systems and methods for training probabilistic object motion prediction models using non-differentiable representations of prior knowledge. As one example, object motion prediction models can be used by autonomous vehicles to probabilistically predict the future location(s) of observed objects (e.g., other vehicles, bicyclists, pedestrians, etc.). For example, such models can output a probability distribution that provides a distribution of probabilities for the future location(s) of each object at one or more future times. Aspects of the present disclosure enable these models to be trained using non-differentiable prior knowledge about motion of objects within the autonomous vehicle's environment such as, for example, prior knowledge about lane or road geometry or topology and/or traffic information such as current traffic control states (e.g., traffic light status).
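One standard way to train a probabilistic model against a reward that has no gradient is a score-function (REINFORCE) estimator: sample from the model, score each sample with the non-differentiable prior, and weight the gradient of the log-density by that score. The lane-membership check and all numbers below are hypothetical illustrations of the general technique, not the disclosed training method:

```python
import random

random.seed(0)

def prior_score(traj_end, lane_center=0.0, lane_half_width=1.5):
    # Non-differentiable prior knowledge: 1 if the predicted endpoint
    # stays inside the lane, 0 otherwise. No gradient flows through it.
    return 1.0 if abs(traj_end - lane_center) <= lane_half_width else 0.0

def reinforce_grad(mu, sigma, n_samples=1000):
    # Estimate d/dmu E[prior_score] by weighting the Gaussian
    # log-density gradient with the non-differentiable score.
    grad = 0.0
    for _ in range(n_samples):
        x = random.gauss(mu, sigma)
        grad += prior_score(x) * (x - mu) / sigma**2  # d/dmu log N(x; mu, sigma)
    return grad / n_samples

# A predictive distribution centered outside the lane (mu=2.0) gets a
# negative gradient estimate, i.e. it is pushed back toward the lane.
g = reinforce_grad(mu=2.0, sigma=1.0)
print(g)
```

Only in-lane samples contribute, and all of them lie below the mean here, so the estimate is reliably negative: gradient ascent on the expected score moves the distribution toward the lane.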

IPC Classes

  • G06N 20/00 - Machine learning
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
  • G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks

8.

Autonomous Vehicle Safe Stop

      
Application Number 18490940
Status Pending
Filing Date 2023-10-20
First Publication Date 2024-02-08
Owner UATC, LLC (USA)
Inventor
  • Kazemi, Moslem
  • Bardapurkar, Sameer

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices for operating an autonomous vehicle are provided. For example, the disclosed technology can include receiving state data that includes information associated with states of an autonomous vehicle and an environment external to the autonomous vehicle. Responsive to the state data satisfying vehicle stoppage criteria, vehicle stoppage conditions can be determined to have occurred. A severity level of the vehicle stoppage conditions can be selected from a plurality of available severity levels respectively associated with a plurality of different sets of constraints. A motion plan can be generated based on the state data. The motion plan can include information associated with locations for the autonomous vehicle to traverse at time intervals corresponding to the locations. Further, the locations can include a current location of the autonomous vehicle and a destination location at which the autonomous vehicle stops traveling.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60T 7/22 - Brake-action initiating means for initiation not subject to will of driver or passenger initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 30/095 - Predicting travel path or likelihood of collision

9.

Vehicle Control System

      
Application Number 18474972
Status Pending
Filing Date 2023-09-26
First Publication Date 2024-01-25
Owner UATC, LLC (USA)
Inventor
  • Jones, Morgan D.
  • Dacko, Michael John
  • Kirby, Brian Thomas

Abstract

Systems and methods for controlling a failover response of an autonomous vehicle are provided. In one example embodiment, a method includes determining, by one or more computing devices on-board an autonomous vehicle, an operational mode of the autonomous vehicle. The autonomous vehicle is configured to operate in at least a first operational mode in which a human driver is present in the autonomous vehicle and a second operational mode in which the human driver is not present in the autonomous vehicle. The method includes detecting a triggering event associated with the autonomous vehicle. The method includes determining actions to be performed by the autonomous vehicle in response to the triggering event based at least in part on the operational mode. The method includes providing one or more control signals to one or more of the systems on-board the autonomous vehicle to perform the one or more actions in response to the triggering event.

IPC Classes

  • B60W 50/08 - Interaction between the driver and the control system
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 50/029 - Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

10.

Fault-Tolerant Control of an Autonomous Vehicle with Multiple Control Lanes

      
Application Number 18481683
Status Pending
Filing Date 2023-10-05
First Publication Date 2024-01-25
Owner UATC, LLC (USA)
Inventor
  • Greenfield, Aaron L.
  • Yanakiev, Diana
  • Tschanz, Frederic
  • Tytler, Charles J.

Abstract

In one example embodiment, a computer-implemented method includes receiving data representing a motion plan of the autonomous vehicle via a plurality of control lanes configured to implement the motion plan to control a motion of the autonomous vehicle, the plurality of control lanes including at least a first control lane and a second control lane, and controlling the first control lane to implement the motion plan. The method includes detecting one or more faults associated with implementation of the motion plan by the first control lane or the second control lane, or in generation of the motion plan, and in response to one or more faults, controlling the first control lane or the second control lane to adjust the motion of the autonomous vehicle based at least in part on one or more fault reaction parameters associated with the one or more faults.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 50/029 - Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
  • B60W 50/023 - Avoiding failures by using redundant parts
  • G06F 11/20 - Error detection or correction of the data by redundancy in hardware using active fault-masking, e.g. by switching out faulty elements or by switching in spare elements
  • G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

11.

Sensor housing for a vehicle

      
Application Number 29730410
Grant Number D1011248
Status In Force
Filing Date 2020-04-03
First Publication Date 2024-01-16
Grant Date 2024-01-16
Owner UATC, LLC (USA)
Inventor
  • Haban, Philipp
  • D'Eramo, Christopher Matthew

12.

Systems and Methods for Generating Motion Forecast Data for a Plurality of Actors with Respect to an Autonomous Vehicle

      
Application Number 18240771
Status Pending
Filing Date 2023-08-31
First Publication Date 2024-01-11
Owner UATC, LLC (USA)
Inventor
  • Li, Lingyun
  • Yang, Bin
  • Zeng, Wenyuan
  • Liang, Ming
  • Ren, Mengye
  • Segal, Sean
  • Urtasun Sotil, Raquel

Abstract

A computing system can input first relative location embedding data into an interaction transformer model and receive, as an output of the interaction transformer model, motion forecast data for actors relative to a vehicle. The computing system can input the motion forecast data into a prediction model to receive respective trajectories for the actors for a current time step and respective projected trajectories for the actors for a subsequent time step. The computing system can generate second relative location embedding data based on the respective projected trajectories from the second time step. The computing system can produce second motion forecast data using the interaction transformer model based on the second relative location embedding. The computing system can determine second respective trajectories for the actors using the prediction model based on the second forecast data.
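The two-pass loop the abstract walks through (interaction model → prediction model → re-embed projected positions → forecast again) can be sketched with scalar stand-ins. The averaging "interaction" and constant-velocity "prediction" below are hypothetical toys, not the disclosed transformer or prediction model; they only show the iteration structure:

```python
def interaction_step(embeddings):
    # Toy stand-in for the interaction transformer: pull each actor's
    # feature toward the mean of all actors (pairwise interaction).
    mean = sum(embeddings) / len(embeddings)
    return [0.5 * e + 0.5 * mean for e in embeddings]

def prediction_step(features, horizon=3):
    # Toy stand-in for the prediction model: constant-velocity rollout
    # of each actor's feature over the forecast horizon.
    return [[f * t for t in range(1, horizon + 1)] for f in features]

def embed(trajectories):
    # Re-embed from the projected first-step positions.
    return [traj[0] for traj in trajectories]

actors = [1.0, 3.0]
first = prediction_step(interaction_step(actors))     # first pass
second = prediction_step(interaction_step(embed(first)))  # refined pass
print(second)
```

The second pass forecasts from positions that already reflect the first pass's interactions, which is the refinement the abstract describes.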

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • G06N 3/044 - Recurrent networks, e.g. Hopfield networks

13.

Autonomous Vehicle Sensor Cleaning System

      
Application Number 18473779
Status Pending
Filing Date 2023-09-25
First Publication Date 2024-01-11
Owner UATC, LLC (USA)
Inventor
  • Rice, Wesly Mason
  • Wittenstein, Nikolaus
  • Jin, Zhizhuo
  • Smith, Paul Kevin
  • Kennelly, Sean Joseph
  • Brueckner, Peter
  • Rice, David Patrick

Abstract

The present disclosure provides a sensor cleaning system that cleans one or more sensors of an autonomous vehicle. Each sensor can have one or more corresponding sensor cleaning units that are configured to clean such sensor using a fluid (e.g., a gas or a liquid). Thus, the sensor cleaning system can include both a gas cleaning system and a liquid cleaning system. According to one aspect, the sensor cleaning system can provide individualized cleaning of the autonomous vehicle sensors. According to another aspect, a liquid cleaning system can be pressurized or otherwise powered by the gas cleaning system or other gas system.

IPC Classes

  • B60S 1/48 - Liquid supply therefor
  • B60S 1/52 - Arrangement of nozzles
  • B60S 1/54 - Cleaning windscreens, windows, or optical devices using gas, e.g. hot air
  • B60S 1/56 - Cleaning windscreens, windows, or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens

14.

Automatically Adjustable Partition Wall for an Autonomous Vehicle

      
Application Number 18467239
Status Pending
Filing Date 2023-09-14
First Publication Date 2024-01-04
Owner UATC, LLC (USA)
Inventor
  • Park, Min Kyu
  • D'Eramo, Christopher Matthew
  • Stegall, Neil

Abstract

Systems and methods for automatically adjusting the interior cabin of an autonomous vehicle are provided. In one example embodiment, an autonomous vehicle can include a main body including a floor and a ceiling that at least partially define an interior cabin of the autonomous vehicle. The autonomous vehicle can include a partition wall that is movable within the interior cabin of the autonomous vehicle. The partition wall can extend between the floor to the ceiling of the main body. The autonomous vehicle can include a computing system configured to receive data indicative of one or more service assignments associated with the autonomous vehicle and to adjust a position of the partition wall within the interior cabin based at least in part on the one or more service assignments.

IPC Classes

  • B60R 13/08 - Insulating elements, e.g. for sound insulation
  • B60N 2/30 - Non-dismountable seats storable in a non-use position, e.g. foldable spare seats
  • B60N 2/02 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
  • G01C 21/34 - Route searching; Route guidance
  • G01G 19/12 - Weighing apparatus or methods adapted for special purposes not provided for in groups for incorporation in vehicles having electrical weight-sensitive devices
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G06Q 10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
  • G06Q 30/0283 - Price estimation or determination
  • G06Q 50/28 - Logistics, e.g. warehousing, loading, distribution or shipping
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled

15.

Systems and methods for interactive prediction and planning

      
Application Number 17516260
Grant Number 11858536
Status In Force
Filing Date 2021-11-01
First Publication Date 2024-01-02
Grant Date 2024-01-02
Owner UATC, LLC (USA)
Inventor
  • Liu, Jerry Junkai
  • Zeng, Wenyuan
  • Urtasun, Raquel
  • Yumer, Mehmet Ersin

Abstract

Example aspects of the present disclosure describe determining, using a machine-learned model framework, a motion trajectory for an autonomous platform. The motion trajectory can be determined based at least in part on a plurality of costs based at least in part on a distribution of probabilities determined conditioned on the motion trajectory.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06N 20/00 - Machine learning

16.

Autonomous Vehicle Collision Mitigation Systems and Methods

      
Application Number 18464822
Status Pending
Filing Date 2023-09-11
First Publication Date 2023-12-28
Owner UATC, LLC (USA)
Inventor
  • Wood, Matthew Shaw
  • Leach, William M.
  • Poeppel, Scott C.
  • Letwin, Nicholas G.
  • Zych, Noah

Abstract

Systems and methods for controlling an autonomous vehicle are provided. In one example embodiment, a computer-implemented method includes obtaining, from an autonomy system, data indicative of a planned trajectory of the autonomous vehicle through a surrounding environment. The method includes determining a region of interest in the surrounding environment based at least in part on the planned trajectory. The method includes controlling one or more first sensors to obtain data indicative of the region of interest. The method includes identifying one or more objects in the region of interest, based at least in part on the data obtained by the one or more first sensors. The method includes controlling the autonomous vehicle based at least in part on the one or more objects identified in the region of interest.

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B62D 15/02 - Steering position indicators
  • G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 15/931 - Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

17.

Multi-Task Machine-Learned Models for Object Intention Determination in Autonomous Driving

      
Application Number 18465128
Status Pending
Filing Date 2023-09-11
First Publication Date 2023-12-28
Owner UATC, LLC (USA)
Inventor
  • Casas, Sergio
  • Urtasun, Raquel
  • Luo, Wenjie

Abstract

Generally, the disclosed systems and methods utilize multi-task machine-learned models for object intention determination in autonomous driving applications. For example, a computing system can receive sensor data obtained relative to an autonomous vehicle and map data associated with a surrounding geographic environment of the autonomous vehicle. The sensor data and map data can be provided as input to a machine-learned intent model. The computing system can receive a jointly determined prediction from the machine-learned intent model for multiple outputs including at least one detection output indicative of one or more objects detected within the surrounding environment of the autonomous vehicle, a first corresponding forecasting output descriptive of a trajectory indicative of an expected path of the one or more objects towards a goal location, and/or a second corresponding forecasting output descriptive of a discrete behavior intention determined from a predefined group of possible behavior intentions.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 20/00 - Machine learning
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

18.

Systems and Methods for Generating Synthetic Sensor Data via Machine Learning

      
Application Number 18466286
Status Pending
Filing Date 2023-09-13
First Publication Date 2023-12-28
Owner UATC, LLC (USA)
Inventor
  • Manivasagam, Sivabalan
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Wong, Kelvin Ka Wing
  • Zeng, Wenyuan
  • Urtasun, Raquel

Abstract

The present disclosure provides systems and methods that combine physics-based systems with machine learning to generate synthetic LiDAR data that accurately mimics a real-world LiDAR sensor system. In particular, aspects of the present disclosure combine physics-based rendering with machine-learned models such as deep neural networks to simulate both the geometry and intensity of the LiDAR sensor. As one example, a physics-based ray casting approach can be used on a three-dimensional map of an environment to generate an initial three-dimensional point cloud that mimics LiDAR data. According to an aspect of the present disclosure, a machine-learned model can predict one or more dropout probabilities for one or more of the points in the initial three-dimensional point cloud, thereby generating an adjusted three-dimensional point cloud which more realistically simulates real-world LiDAR data.
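The two-stage simulation in the abstract (physics-based ray casting to produce an initial point cloud, then a learned model predicting per-point dropout) can be sketched as follows. The ray caster and the intensity-based dropout rule are hypothetical stand-ins for the physics engine and the machine-learned model:

```python
import random

random.seed(1)

def raycast_points(n):
    # Stand-in for physics-based ray casting over a 3-D map: produce an
    # initial point cloud of (x, y, z, simulated_intensity) hits.
    return [(i * 0.5, 0.0, 1.0, random.random()) for i in range(n)]

def dropout_probability(point):
    # Stand-in for the machine-learned model: low-intensity returns are
    # more likely to be dropped, mimicking real LiDAR non-returns.
    intensity = point[3]
    return 0.9 * (1.0 - intensity)

def adjust_cloud(points):
    # Keep each point with probability 1 - p_dropout.
    return [p for p in points if random.random() >= dropout_probability(p)]

initial = raycast_points(200)
adjusted = adjust_cloud(initial)
print(len(initial), len(adjusted))
```

The adjusted cloud is a thinned version of the raycast cloud, which is what makes the synthetic data look less idealized than raw ray casting alone.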

IPC Classes

  • G06F 11/263 - Generation of test inputs, e.g. test vectors, patterns or sequences
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06F 17/18 - Complex mathematical operations for evaluating statistical data
  • G06N 20/00 - Machine learning
  • G06T 15/06 - Ray-tracing
  • G06N 3/047 - Probabilistic or stochastic networks

19.

Systems and Methods for Vehicle Message Signing

      
Application Number 18462202
Status Pending
Filing Date 2023-09-06
First Publication Date 2023-12-21
Owner UATC, LLC (USA)
Inventor
  • Sorensen, Michael David
  • Wood, Matthew Charles Ellis
  • Harris, Matthew James

Abstract

Systems and methods for vehicle message signing are provided. A method includes obtaining, by a vehicle computing system of an autonomous vehicle, a computing system state associated with the vehicle computing system and a message from at least one remote process running on a computing device remote from the vehicle computing system. The message is associated with an intended recipient process running on the vehicle computing system. The method includes determining an originating sender for the message. The originating sender is indicative of a remote process that generated the message. The method includes determining a routing action for the message based on a comparison of the originating sender and the computing system state. The routing action includes at least one of a discarding action or a forwarding action to the intended recipient process. The method includes performing the routing action for the message.
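The routing decision the abstract describes (compare the originating sender against the system state, then forward or discard) reduces to a small policy check. All state names and sender names below are hypothetical examples, not identifiers from the disclosure:

```python
# Hypothetical policy: which remote senders are acceptable in each
# vehicle computing system state.
ALLOWED_SENDERS = {
    "driving": {"fleet_dispatch", "map_service"},
    "maintenance": {"fleet_dispatch", "map_service", "diagnostics_console"},
}

def route_message(system_state, originating_sender):
    # Forward only if this sender is permitted in the current state;
    # otherwise discard before it reaches the intended recipient process.
    if originating_sender in ALLOWED_SENDERS.get(system_state, set()):
        return "forward"
    return "discard"

print(route_message("driving", "diagnostics_console"))      # discard
print(route_message("maintenance", "diagnostics_console"))  # forward
```

Tying the decision to system state (not just sender identity) is what lets the same sender be valid in one vehicle mode and rejected in another.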

IPC Classes

  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • H04W 4/44 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]

20.

Multi-Model Switching on a Collision Mitigation System

      
Application Number 18327379
Status Pending
Filing Date 2023-06-01
First Publication Date 2023-11-16
Owner UATC, LLC (USA)
Inventor
  • Wood, Matthew Shaw
  • Leach, William M.
  • Poeppel, Scott C.
  • Letwin, Nicholas G.
  • Zych, Noah

Abstract

Systems and methods for controlling an autonomous vehicle are provided. In one example embodiment, a computer-implemented method includes receiving data indicative of an operating mode of the vehicle, wherein the vehicle is configured to operate in a plurality of operating modes. The method includes determining one or more response characteristics of the vehicle based at least in part on the operating mode of the vehicle, each response characteristic indicating how the vehicle responds to a potential collision. The method includes controlling the vehicle based at least in part on the one or more response characteristics.
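
The mode-dependent behavior in this abstract amounts to a lookup from the vehicle's operating mode to a set of collision-response characteristics. The sketch below illustrates that idea; the mode names and characteristic fields are hypothetical, not taken from the patent.

```python
# Hypothetical mapping from operating mode to collision-response
# characteristics (mode names and values are illustrative only).
RESPONSE_BY_MODE = {
    "autonomous": {"brake": "full", "alert_driver": False},
    "manual":     {"brake": "none", "alert_driver": True},
    "supervised": {"brake": "assist", "alert_driver": True},
}


def response_characteristics(mode: str) -> dict:
    """Look up how the vehicle should respond to a potential collision
    given its current operating mode."""
    return RESPONSE_BY_MODE[mode]
```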

IPC Classes

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G08G 1/16 - Anti-collision systems
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 10/184 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems with wheel brakes
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/08 - Predicting or avoiding probable or impending collision
  • G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 50/08 - Interaction between the driver and the control system
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention

21.

End-To-End Interpretable Motion Planner for Autonomous Vehicles

      
Application Number 18358443
Status Pending
Filing Date 2023-07-25
First Publication Date 2023-11-16
Owner UATC, LLC (USA)
Inventor
  • Zeng, Wenyuan
  • Luo, Wenjie
  • Sadat, Abbas
  • Yang, Bin
  • Urtasun, Raquel

Abstract

Systems and methods for generating motion plans including target trajectories for autonomous vehicles are provided. An autonomous vehicle may include or access a machine-learned motion planning model including a backbone network configured to generate a cost volume including data indicative of a cost associated with future locations of the autonomous vehicle. The cost volume can be generated from raw sensor data as part of motion planning for the autonomous vehicle. The backbone network can generate intermediate representations associated with object detections and object predictions. The motion planning model can include a trajectory generator configured to evaluate one or more potential trajectories for the autonomous vehicle and to select a target trajectory based at least in part on the cost volume generated by the backbone network.
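
The trajectory-generator step can be pictured as scoring candidate trajectories against a spatio-temporal cost volume and selecting the cheapest. The grid layout and trajectory encoding below are illustrative assumptions, not the patent's implementation.

```python
import numpy as np


def select_trajectory(cost_volume: np.ndarray, candidates: list) -> int:
    """cost_volume: a (T, H, W) grid of costs over future time steps and
    spatial cells; each candidate is a (T, 2) integer array of (row, col)
    cells occupied at each time step. Returns the index of the
    minimum-total-cost candidate trajectory."""
    t = np.arange(cost_volume.shape[0])
    totals = [cost_volume[t, traj[:, 0], traj[:, 1]].sum()
              for traj in candidates]
    return int(np.argmin(totals))
```

For instance, a candidate passing through high-cost cells (e.g. near a predicted object) accumulates a larger total and loses to one that stays in low-cost cells.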

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G01C 21/32 - Structuring or formatting of map data
  • G01C 21/34 - Route searching; Route guidance

22.

Adjustable Beam Pattern for LIDAR Sensor

      
Application Number 18351075
Status Pending
Filing Date 2023-07-12
First Publication Date 2023-11-09
Owner UATC, LLC (USA)
Inventor Boehmke, Scott

Abstract

A LIDAR sensor for an autonomous vehicle (AV) can include one or more lasers outputting one or more laser beams, one or more non-mechanical optical components to (i) receive the one or more laser beams, (ii) configure a field of view of the LIDAR sensor, and (iii) output modulated frequencies from the one or more laser beams, and one or more photodetectors to detect return signals based on the outputted modulated frequencies from the one or more laser beams.

IPC Classes

  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G02B 26/12 - Scanning systems using multifaceted mirrors

23.

Jointly Learnable Behavior and Trajectory Planning for Autonomous Vehicles

      
Application Number 18355188
Status Pending
Filing Date 2023-07-19
First Publication Date 2023-11-09
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Lin, Yen-Chen
  • Pokrovsky, Andrei
  • Ren, Mengye
  • Sadat, Abbas
  • Yumer, Ersin

Abstract

Systems and methods for generating motion plans for autonomous vehicles are provided. An autonomous vehicle can include a machine-learned motion planning system including one or more machine-learned models configured to generate target trajectories for the autonomous vehicle. The model(s) include a behavioral planning stage configured to receive situational data based at least in part on the one or more outputs of the set of sensors and to generate behavioral planning data based at least in part on the situational data and a unified cost function. The model(s) include a trajectory planning stage configured to receive the behavioral planning data from the behavioral planning stage and to generate target trajectory data for the autonomous vehicle based at least in part on the behavioral planning data and the unified cost function.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions

24.

Systems and Methods for Actor Motion Forecasting within a Surrounding Environment of an Autonomous Vehicle

      
Application Number 18346518
Status Pending
Filing Date 2023-07-03
First Publication Date 2023-11-02
Owner UATC, LLC (USA)
Inventor
  • Zeng, Wenyuan
  • Liao, Renjie
  • Urtasun, Raquel
  • Liang, Ming

Abstract

Systems and methods are provided for forecasting the motion of actors within a surrounding environment of an autonomous platform. For example, a computing system of an autonomous platform can use machine-learned model(s) to generate actor-specific graphs with past motions of actors and the local map topology. The computing system can project the actor-specific graphs of all actors to a global graph. The global graph can allow the computing system to determine which actors may interact with one another by propagating information over the global graph. The computing system can distribute the interactions determined using the global graph to the individual actor-specific graphs. The computing system can then predict a motion trajectory for an actor based on the associated actor-specific graph, which captures the actor-to-actor interactions and actor-to-map relations.

IPC Classes

25.

Systems and Methods for Generating Synthetic Light Detection and Ranging Data via Machine Learning

      
Application Number 18345431
Status Pending
Filing Date 2023-06-30
First Publication Date 2023-11-02
Owner UATC, LLC (USA)
Inventor
  • Manivasagam, Sivabalan
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Urtasun, Raquel

Abstract

The present disclosure provides systems and methods that combine physics-based systems with machine learning to generate synthetic LiDAR data that accurately mimics a real-world LiDAR sensor system. In particular, aspects of the present disclosure combine physics-based rendering with machine-learned models such as deep neural networks to simulate both the geometry and intensity of the LiDAR sensor. As one example, a physics-based ray casting approach can be used on a three-dimensional map of an environment to generate an initial three-dimensional point cloud that mimics LiDAR data. According to an aspect of the present disclosure, a machine-learned geometry model can predict one or more adjusted depths for one or more of the points in the initial three-dimensional point cloud, thereby generating an adjusted three-dimensional point cloud which more realistically simulates real-world LiDAR data.
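
The adjustment step can be illustrated as moving each ray-cast point along its ray by a per-point predicted depth correction. Here the machine-learned geometry model is replaced by a stand-in callable, and the sensor is assumed to sit at the origin; both are assumptions for illustration only.

```python
import numpy as np


def adjust_point_cloud(points: np.ndarray, predicted_delta) -> np.ndarray:
    """points: an (N, 3) array of initial ray-cast hits, with the sensor
    at the origin. predicted_delta: a callable returning a per-point
    depth correction (stand-in for the learned geometry model). Each
    point is moved along its ray by the predicted depth offset."""
    depths = np.linalg.norm(points, axis=1, keepdims=True)
    directions = points / depths
    new_depths = depths + np.asarray(predicted_delta(points)).reshape(-1, 1)
    return directions * new_depths
```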

IPC Classes

  • G06T 17/05 - Geographic models
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06T 15/06 - Ray-tracing
  • G06N 20/00 - Machine learning
  • G05D 1/02 - Control of position or course in two dimensions

26.

Autonomous Vehicle Motion Control Systems and Methods

      
Application Number 18308160
Status Pending
Filing Date 2023-04-27
First Publication Date 2023-10-26
Owner UATC, LLC (USA)
Inventor
  • Gochev, Kalin Vasilev
  • Phillips, Michael Lee
  • Bradley, David Mcallister
  • Emi, Bradley Nicholas

Abstract

Systems and methods for controlling the motion of an autonomous vehicle are provided. In one example embodiment, a computer-implemented method includes obtaining data associated with an object within a surrounding environment of an autonomous vehicle. The data associated with the object is indicative of a predicted motion trajectory of the object. The method includes determining a vehicle action sequence based at least in part on the predicted motion trajectory of the object. The vehicle action sequence is indicative of a plurality of vehicle actions for the autonomous vehicle at a plurality of respective time steps associated with the predicted motion trajectory. The method includes determining a motion plan for the autonomous vehicle based at least in part on the vehicle action sequence. The method includes causing the autonomous vehicle to initiate motion control in accordance with at least a portion of the motion plan.

IPC Classes

  • B60W 30/095 - Predicting travel path or likelihood of collision
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G08G 1/16 - Anti-collision systems
  • B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit

27.

Multi-Channel Light Detection and Ranging (LIDAR) Unit Having a Telecentric Lens Assembly and Single Circuit Board for Emitters and Detectors

      
Application Number 18313777
Status Pending
Filing Date 2023-05-08
First Publication Date 2023-10-12
Owner UATC, LLC (USA)
Inventor
  • Haslim, James Allen
  • Borden, Michael Bryan
  • Sing, Daniel Thomas

Abstract

A LIDAR unit includes a housing defining a cavity. The LIDAR unit further includes a plurality of emitters disposed on a circuit board within the cavity. Each of the emitters emits a laser beam along a transmit path. The LIDAR unit further includes a first telecentric lens assembly positioned within the cavity and along the transmit path such that the laser beam emitted from each of the plurality of emitters passes through the first telecentric lens assembly. The LIDAR unit further includes a second telecentric lens assembly positioned within the cavity and along a receive path such that a plurality of reflected laser beams entering the cavity pass through the second telecentric lens assembly. The first telecentric lens assembly and the second telecentric lens assembly each include a field flattening lens and at least one other lens.

IPC Classes

  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 17/06 - Systems determining position data of a target
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • G05D 1/02 - Control of position or course in two dimensions

28.

Autonomous Vehicle Lane Boundary Detection Systems and Methods

      
Application Number 18313794
Status Pending
Filing Date 2023-05-08
First Publication Date 2023-08-31
Owner UATC, LLC (USA)
Inventor
  • Bai, Min
  • Mattyus, Gellert Sandor
  • Homayounfar, Namdar
  • Wang, Shenlong
  • Lakshmikanth, Shrinidhi Kowshika
  • Urtasun, Raquel

Abstract

Systems and methods for facilitating communication with autonomous vehicles are provided. In one example embodiment, a computing system can obtain a first type of sensor data (e.g., camera image data) associated with a surrounding environment of an autonomous vehicle and/or a second type of sensor data (e.g., LIDAR data) associated with the surrounding environment of the autonomous vehicle. The computing system can generate overhead image data indicative of at least a portion of the surrounding environment of the autonomous vehicle based at least in part on the first and/or second types of sensor data. The computing system can determine one or more lane boundaries within the surrounding environment of the autonomous vehicle based at least in part on the overhead image data indicative of at least the portion of the surrounding environment of the autonomous vehicle and a machine-learned lane boundary detection model.

IPC Classes

  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation

29.

Automated Vehicle Seats

      
Application Number 18192348
Status Pending
Filing Date 2023-03-29
First Publication Date 2023-08-17
Owner UATC, LLC (USA)
Inventor
  • D'Eramo, Christopher Matthew
  • Chan, Nicolas Bryan
  • Urbanski, Janna

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with the operation of a vehicle are provided. For example, a vehicle computing system can receive occupancy data that includes information associated with occupancy of a vehicle that includes seats. One or more states of the vehicle can be determined. The states of the vehicle can include a disposition of any object that is within the vehicle. Further, a configuration of the seats in the vehicle can be determined based on the occupancy data and the states of the vehicle. The configuration can include a disposition of the seats inside the vehicle. Furthermore, at least one of the seats can be adjusted based on the configuration that was determined.

IPC Classes

  • B60N 2/01 - Arrangement of seats relative to one another
  • B60N 2/02 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
  • B60N 3/00 - Arrangements or adaptations of other passenger fittings, not otherwise provided for
  • B60N 2/06 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable the whole seat being movable slidable
  • B60N 2/00 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
  • A47C 3/04 - Stackable chairs or nesting chairs
  • B60N 2/14 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable the whole seat being movable rotatable, e.g. to permit easy access

30.

Object association for autonomous vehicles

      
Application Number 18297937
Grant Number 11934962
Status In Force
Filing Date 2023-04-10
First Publication Date 2023-08-17
Grant Date 2024-03-19
Owner UATC, LLC (USA)
Inventor
  • Vallespi-Gonzalez, Carlos
  • Sen, Abhishek
  • Gautam, Shivam

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices for associating objects are provided. For example, the disclosed technology can receive sensor data associated with the detection of objects over time. An association dataset can be generated and can include information associated with object detections of the objects at a most recent time interval and object tracks of the objects at time intervals in the past. A subset of the association dataset including the object detections that satisfy some association subset criteria can be determined. Association scores for the object detections in the subset of the association dataset can be determined. Further, the object detections can be associated with the object tracks based on the association scores for each of the object detections in the subset of the association dataset that satisfy some association criteria.
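
The scoring-and-association step can be sketched as a greedy matching of detections to tracks by association score, with a minimum-score criterion. The score function and threshold below are illustrative stand-ins for the learned components, not the patent's method.

```python
def associate(detections, tracks, score, min_score=0.5):
    """Greedily match each detection to the best-scoring unmatched track.
    `score(det, trk)` is a stand-in for the learned association score;
    pairs scoring below `min_score` are left unassociated. Returns a
    dict mapping detection index to track index."""
    pairs = sorted(
        ((score(d, t), di, ti)
         for di, d in enumerate(detections)
         for ti, t in enumerate(tracks)),
        reverse=True)
    matched_d, matched_t, out = set(), set(), {}
    for s, di, ti in pairs:
        if s < min_score or di in matched_d or ti in matched_t:
            continue
        matched_d.add(di)
        matched_t.add(ti)
        out[di] = ti
    return out
```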

IPC Classes

  • G06N 5/022 - Knowledge engineering; Knowledge acquisition
  • G06N 5/046 - Forward inferencing; Production systems
  • G06N 20/00 - Machine learning
  • G06T 7/292 - Multi-camera tracking
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/30 - Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

31.

Light Detection and Ranging (LIDAR) System Having a Polarizing Beam Splitter

      
Application Number 18301503
Status Pending
Filing Date 2023-04-17
First Publication Date 2023-08-10
Owner UATC, LLC (USA)
Inventor
  • Haslim, James Allen
  • Borden, Michael Bryan

Abstract

A LIDAR system includes a plurality of LIDAR units. Each of the LIDAR units includes a housing defining a cavity. Each of the LIDAR units further includes a plurality of emitters disposed within the cavity. Each of the plurality of emitters is configured to emit a laser beam. The LIDAR system includes a rotating mirror and a retarder. The retarder is configurable in at least a first mode and a second mode to control a polarization state of a plurality of laser beams emitted from each of the plurality of LIDAR units. The LIDAR system includes a polarizing beam splitter positioned relative to the retarder such that the polarizing beam splitter receives a plurality of laser beams exiting the retarder. The polarizing beam splitter is configured to transmit or reflect the plurality of laser beams exiting the retarder based on the polarization state of the laser beams exiting the retarder.

IPC Classes

  • G01S 7/499 - Details of systems according to group G01S 17/00 using polarisation effects
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G02B 6/27 - Optical coupling means with polarisation selective and adjusting means

32.

Systems and Methods for Seat Reconfiguration for Autonomous Vehicles

      
Application Number 18297794
Status Pending
Filing Date 2023-04-10
First Publication Date 2023-08-10
Owner UATC, LLC (USA)
Inventor Kanitz, Daniel Adam

Abstract

Systems and methods for reconfiguring seats of an autonomous vehicle are provided. The method includes obtaining service request data that includes a service selection and request characteristics. The method includes obtaining data describing an initial seat configuration for each of a plurality of seats of an autonomous vehicle assigned to the service request. The initial seat configuration can include a seat position and a seat orientation for each of the plurality of seats. The method includes generating, based on the initial seat configuration and the service request data, seat adjustment instructions configured to adjust the initial seat configuration of at least one of the seats. The method includes providing the seat adjustment instructions to the autonomous vehicle assigned to the service request.

IPC Classes

  • B60N 2/02 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
  • G06N 20/00 - Machine learning
  • G06N 5/04 - Inference or reasoning models
  • B60N 2/809 - Head-rests movable or adjustable vertically slidable
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60N 2/01 - Arrangement of seats relative to one another
  • G05D 1/02 - Control of position or course in two dimensions

33.

Semantic Segmentation of Three-Dimensional Data

      
Application Number 18299970
Status Pending
Filing Date 2023-04-13
First Publication Date 2023-08-10
Owner UATC, LLC (USA)
Inventor
  • Zhang, Chris Jia-Han
  • Luo, Wenjie
  • Urtasun, Raquel

Abstract

Systems and methods for performing semantic segmentation of three-dimensional data are provided. In one example embodiment, a computing system can be configured to obtain sensor data including three-dimensional data associated with an environment. The three-dimensional data can include a plurality of points and can be associated with one or more times. The computing system can be configured to determine data indicative of a two-dimensional voxel representation associated with the environment based at least in part on the three-dimensional data. The computing system can be configured to determine a classification for each point of the plurality of points within the three-dimensional data based at least in part on the two-dimensional voxel representation associated with the environment and a machine-learned semantic segmentation model. The computing system can be configured to initiate one or more actions based at least in part on the per-point classifications.
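
The two-dimensional voxel representation can be illustrated as projecting the 3D points into a bird's-eye-view occupancy grid. The grid extent and resolution below are arbitrary illustrative choices, not values from the patent.

```python
import numpy as np


def to_2d_voxel_grid(points: np.ndarray, extent: float = 10.0,
                     resolution: float = 1.0) -> np.ndarray:
    """Project (N, 3) points into a bird's-eye-view occupancy grid over
    [-extent, extent) in x and y; cells containing at least one point
    are marked occupied. Points outside the extent are dropped."""
    n = int(2 * extent / resolution)
    grid = np.zeros((n, n), dtype=np.float32)
    ij = np.floor((points[:, :2] + extent) / resolution).astype(int)
    in_bounds = (ij >= 0).all(axis=1) & (ij < n).all(axis=1)
    grid[ij[in_bounds, 0], ij[in_bounds, 1]] = 1.0
    return grid
```

A per-point classification can then be read back by indexing each point's cell in the network's output over this grid.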

IPC Classes

  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G06V 20/40 - Scenes; Scene-specific elements in video content
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/776 - Validation; Performance evaluation
  • G06V 20/10 - Terrestrial scenes
  • G06V 20/64 - Three-dimensional objects

34.

Systems and Methods for Detecting Actors with Respect to an Autonomous Vehicle

      
Application Number 18178641
Status Pending
Filing Date 2023-03-06
First Publication Date 2023-08-03
Owner UATC, LLC (USA)
Inventor
  • Sen, Abhishek
  • Fagg, Ashton James
  • Becker, Brian C.
  • Xu, Yang
  • Pilbrough, Nathan Nicolas
  • Vallespi-Gonzalez, Carlos

Abstract

An autonomous vehicle computing system can include a primary perception system configured to receive a plurality of sensor data points as input and generate primary perception data representing a plurality of classifiable objects and a plurality of paths representing tracked motion of the plurality of classifiable objects. The autonomous vehicle computing system can include a secondary perception system configured to receive the plurality of sensor data points as input, cluster a subset of the plurality of sensor data points of the sensor data to generate one or more sensor data point clusters representing one or more unclassifiable objects that are not classifiable by the primary perception system, and generate secondary path data representing tracked motion of the one or more unclassifiable objects. The autonomous vehicle computing system can generate fused perception data based on the primary perception data and the one or more unclassifiable objects.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G01S 17/66 - Tracking systems using electromagnetic waves other than radio waves
  • G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 20/00 - Machine learning
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

35.

Systems and Methods for Controlling an Autonomous Vehicle with Occluded Sensor Zones

      
Application Number 18295491
Status Pending
Filing Date 2023-04-04
First Publication Date 2023-07-27
Owner UATC, LLC (USA)
Inventor Becker, Brian C.

Abstract

Systems and methods for controlling an autonomous vehicle are provided. In one example embodiment, a computer-implemented method includes obtaining sensor data indicative of a surrounding environment of the autonomous vehicle, the surrounding environment including one or more occluded sensor zones. The method includes determining that a first occluded sensor zone of the occluded sensor zone(s) is occupied based at least in part on the sensor data. The method includes, in response to determining that the first occluded sensor zone is occupied, controlling the autonomous vehicle to travel clear of the first occluded sensor zone.

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/18 - Propelling the vehicle
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

36.

Systems and Methods for Generating Motion Forecast Data for Actors with Respect to an Autonomous Vehicle and Training a Machine Learned Model for the Same

      
Application Number 18186718
Status Pending
Filing Date 2023-03-20
First Publication Date 2023-07-20
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Liao, Renjie
  • Casas, Sergio
  • Gulino, Cole Christian

Abstract

Systems and methods for generating motion forecast data for actors with respect to an autonomous vehicle and training a machine learned model for the same are disclosed. The computing system can include an object detection model and a graph neural network including a plurality of nodes and a plurality of edges. The computing system can be configured to input sensor data into the object detection model; receive object detection data describing the location of the plurality of the actors relative to the autonomous vehicle as an output of the object detection model; input the object detection data into the graph neural network; iteratively update a plurality of node states respectively associated with the plurality of nodes; and receive, as an output of the graph neural network, the motion forecast data with respect to the plurality of actors.
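
The iterative node-state update over the graph can be pictured as a few rounds of message passing in which each actor node mixes its state with its neighbors' states. This minimal sketch uses plain averaging in place of the patent's learned update functions; the edge list and step count are illustrative.

```python
import numpy as np


def propagate(states: np.ndarray, edges: list, steps: int = 2) -> np.ndarray:
    """Minimal message passing over actor nodes. `states` is an (N, D)
    array of node states; `edges` is a list of undirected (i, j) pairs.
    Each step, every node's state is averaged with the mean of its
    neighbors' states (a stand-in for a learned update)."""
    n = len(states)
    adj = np.zeros((n, n))
    for i, j in edges:
        adj[i, j] = adj[j, i] = 1.0
    for _ in range(steps):
        deg = adj.sum(axis=1, keepdims=True)
        neighbor_mean = np.divide(adj @ states, deg,
                                  out=np.zeros_like(states),
                                  where=deg > 0)
        states = 0.5 * (states + neighbor_mean)
    return states
```

After a few steps, information from one actor's detection influences the states (and hence the forecasts) of the actors it shares edges with.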

IPC Classes

  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 3/08 - Learning methods
  • G08G 1/00 - Traffic control systems for road vehicles
  • G01C 21/36 - Input/output arrangements for on-board computers
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G06N 3/084 - Backpropagation, e.g. using gradient descent
  • G06N 7/04 - Physical realisation
  • G06N 3/06 - Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
  • G06N 3/045 - Combinations of networks
  • G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks

37.

LIDAR System Design to Mitigate LIDAR Cross-Talk

      
Application Number 18179019
Status Pending
Filing Date 2023-03-06
First Publication Date 2023-07-06
Owner UATC, LLC (USA)
Inventor Juelsgaard, Soren

Abstract

Aspects of the present disclosure involve systems, methods, and devices for mitigating Lidar cross-talk. Consistent with some embodiments, a Lidar system is configured to include one or more noise source detectors that detect noise signals that may produce noise in return signals received at the Lidar system. A noise source detector comprises a light sensor to receive a noise signal produced by a noise source and a timing circuit to provide a timing signal indicative of a direction of the noise source relative to an autonomous vehicle on which the Lidar system is mounted. A noise source may be an external Lidar system or a surface in the surrounding environment that is reflecting light signals such as those emitted by an external Lidar system.

IPC Classes

  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 7/4865 - Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
  • G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 7/495 - Counter-measures or counter-counter-measures

38.

Passenger Seats and Doors for an Autonomous Vehicle

      
Application Number 18181932
Status Pending
Filing Date 2023-03-10
First Publication Date 2023-07-06
Owner UATC, LLC (USA)
Inventor Kanitz, Daniel Adam

Abstract

An autonomous vehicle can include one or more configurable passenger seats to accommodate a plurality of different seating configurations. For instance, the one or more passenger seats can include a passenger seat defining a seating orientation. The passenger seat can be configurable in a first configuration in which the seating orientation is directed towards a forward end of the autonomous vehicle and a second configuration in which the seating orientation is directed towards a rear end of the autonomous vehicle. The passenger seat can include a seatback rotatable about a pivot point on a base of the passenger seat to switch between the first configuration and the second configuration. Alternatively, or additionally, the autonomous vehicle can include a door assembly pivotably fixed to a vehicle body of the autonomous vehicle such that a swept path of the door assembly when moving between an open position and a closed position is reduced.

IPC Classes  ?

  • B60N 2/02 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
  • B60N 2/832 - Head-rests movable or adjustable vertically slidable movable to an inoperative or stowed position
  • B60N 2/00 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
  • B60N 2/90 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles - Details or parts not otherwise provided for

39.

Planar-Beam, Light Detection and Ranging System

      
Application Number 18170841
Status Pending
Filing Date 2023-02-17
First Publication Date 2023-06-29
Owner UATC, LLC (USA)
Inventor Boehmke, Scott

Abstract

A planar-beam, light detection and ranging (PLADAR) system can include a laser scanner that emits a planar beam, and a detector array that detects reflected light from the planar beam.

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01C 21/34 - Route searching; Route guidance
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

40.

Systems and Methods for Remote Status Detection of Autonomous Vehicles

      
Application Number 18175748
Status Pending
Filing Date 2023-02-28
First Publication Date 2023-06-29
Owner UATC, LLC (USA)
Inventor
  • Poeppel, Scott C.
  • Zych, Noah
  • Wood, Matthew Shaw
  • Vandenberg, III, Dirk John

Abstract

Systems and methods are provided for remotely detecting a status associated with an autonomous vehicle and generating control actions in response to such detections. In one example, a computing system can access a third-party communication associated with an autonomous vehicle. The computing system can determine, based at least in part on the third-party communication, a predetermined identifier associated with the autonomous vehicle. The computing system can determine, based at least in part on the third-party communication, a status associated with the autonomous vehicle, and transmit one or more control messages to the autonomous vehicle based at least in part on the predetermined identifier and the status associated with the autonomous vehicle.

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G08G 1/017 - Detecting movement of traffic to be counted or controlled identifying vehicles
  • G07C 5/00 - Registering or indicating the working of vehicles
  • G05D 1/02 - Control of position or course in two dimensions

41.

Systems and Methods for Simulating Traffic Scenes

      
Application Number 18168093
Status Pending
Filing Date 2023-02-13
First Publication Date 2023-06-22
Owner UATC, LLC (USA)
Inventor
  • Tan, Shuhan
  • Wong, Kelvin Ka Wing
  • Wang, Shenlong
  • Manivasagam, Sivabalan
  • Ren, Mengye
  • Urtasun, Raquel

Abstract

Example aspects of the present disclosure describe a scene generator for simulating scenes in an environment. For example, snapshots of simulated traffic scenes can be generated by sampling a joint probability distribution trained on real-world traffic scenes. In some implementations, samples of the joint probability distribution can be obtained by sampling a plurality of factorized probability distributions for a plurality of objects for sequential insertion into the scene.
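The sequential, factorized sampling the abstract describes can be sketched as follows. This is an illustrative toy, not the patented implementation: the `toy_car_sampler` stand-in and its spacing constant are assumptions taking the place of the learned conditional distributions.

```python
import random

def sample_scene(num_objects, sample_object):
    """Build a scene snapshot by sequential insertion: each object is
    drawn conditioned on the objects already placed, standing in for
    sampling the factorized distribution p(o_i | o_1..o_{i-1})."""
    scene = []
    for _ in range(num_objects):
        scene.append(sample_object(scene))
    return scene

def toy_car_sampler(scene):
    """Hypothetical stand-in distribution: place each car at a random
    longitudinal position that does not overlap cars already inserted."""
    while True:
        x = random.uniform(0.0, 100.0)
        if all(abs(x - obj["x"]) > 5.0 for obj in scene):
            return {"x": x}

snapshot = sample_scene(5, toy_car_sampler)
```

Each call produces one simulated traffic snapshot; drawing many snapshots approximates sampling the joint distribution over scenes.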

IPC Classes  ?

  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G06V 20/54 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
  • G06F 18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate

42.

Systems and Methods for Onboard Vehicle Certificate Distribution

      
Application Number 18172735
Status Pending
Filing Date 2023-02-22
First Publication Date 2023-06-22
Owner UATC, LLC (USA)
Inventor
  • Sorensen, Michael David
  • Wood, Matthew Charles Ellis
  • Harris, Matthew James

Abstract

Systems and methods for onboard vehicle certificate distribution are provided. A system can include a plurality of devices including a master device for authenticating processes and one or more requesting devices. The master device can include a master host security service configured to authenticate the one or more processes of the system. The master host security service can run a certificate authority to generate a root certificate and a private root key corresponding to the root certificate. A respective host security service can receive a request for a process manifest for a requesting process of a respective device from a respective orchestration service. The respective host security service can generate the process manifest for the requesting process and provide the process manifest to the requesting process. The requesting process can use the process manifest to communicate with the certificate authority to obtain an operational certificate based on the root certificate.
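The trust chain in the abstract (a master security service holds the root secret, signs a process manifest, and the requesting process redeems the manifest for an operational credential) can be illustrated with a toy flow. Real systems use X.509 certificates and asymmetric keys; the raw HMAC below is only an assumed stand-in to show the handshake order.

```python
import hashlib
import hmac

# Held only by the master host security service's certificate authority.
ROOT_KEY = b"root-private-key"

def sign_manifest(process_name):
    """Master side: generate and sign a process manifest."""
    manifest = f"manifest:{process_name}".encode()
    sig = hmac.new(ROOT_KEY, manifest, hashlib.sha256).hexdigest()
    return manifest, sig

def issue_operational_cert(manifest, sig):
    """CA side: verify the manifest signature before issuing a credential."""
    expected = hmac.new(ROOT_KEY, manifest, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise PermissionError("manifest not signed by the certificate authority")
    return f"operational-cert-for:{manifest.decode()}"

manifest, sig = sign_manifest("perception-service")
cert = issue_operational_cert(manifest, sig)
```

An unsigned or tampered manifest fails `compare_digest` and no operational credential is issued.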

IPC Classes  ?

  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • G06F 3/06 - Digital input from, or digital output to, record carriers
  • H04L 9/08 - Key distribution
  • B60R 25/24 - Means to switch the anti-theft system on or off using electronic identifiers containing a code not memorised by the user

43.

LIDAR Sensor Assembly Including Joint Coupling Features

      
Application Number 18164995
Status Pending
Filing Date 2023-02-06
First Publication Date 2023-06-15
Owner UATC, LLC (USA)
Inventor Ratner, Daniel

Abstract

A light detection and ranging (LIDAR) sensor assembly can comprise an optics assembly that includes a LIDAR sensor and a set of dovetail joint inserts. The LIDAR sensor assembly can further include a frame comprising a set of dovetail joint septums coupled to the set of dovetail joint inserts of the optics assembly.

IPC Classes  ?

  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements

44.

System and Method for Presenting Autonomy-Switching Directions

      
Application Number 18161500
Status Pending
Filing Date 2023-01-30
First Publication Date 2023-06-08
Owner UATC, LLC (USA)
Inventor
  • Rockmore, Logan
  • Nix, Molly

Abstract

An on-board computing system for a vehicle is configured to generate and selectively present a set of autonomy-switching directions within a navigation user interface for the operator of the vehicle. The autonomy-switching directions can inform the operator regarding changes to the vehicle's mode of autonomous operation. The on-board computing system can generate the set of autonomy-switching directions based on the vehicle's route and other information associated with the route, such as autonomous operation permissions (AOPs) for route segments that comprise the route. The on-board computing device can selectively present the autonomy-switching directions based on locations associated with anticipated changes in autonomous operations determined for the route of the vehicle, the vehicle's location, and the vehicle's speed. In addition, the on-board computing device is further configured to present audio alerts associated with the autonomy-switching directions to the operator of the vehicle.

IPC Classes  ?

  • G01C 21/36 - Input/output arrangements for on-board computers
  • B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units

45.

Autonomous vehicle with independent auxiliary control units

      
Application Number 18162249
Grant Number 11782437
Status In Force
Filing Date 2023-01-31
First Publication Date 2023-06-08
Grant Date 2023-10-10
Owner UATC, LLC (USA)
Inventor
  • Letwin, Nicholas
  • Kelly, Sean

Abstract

An autonomous vehicle includes multiple independent control systems that provide redundancy for specific and critical safety situations that may be encountered when the autonomous vehicle is in operation.

IPC Classes  ?

  • B60W 30/08 - Predicting or avoiding probable or impending collision
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60T 8/1755 - Brake regulation specially adapted to control the stability of the vehicle, e.g. taking into account yaw rate or transverse acceleration in a curve
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures

46.

Systems and methods for a moveable cover panel of an autonomous vehicle

      
Application Number 18162293
Grant Number 11897406
Status In Force
Filing Date 2023-01-31
First Publication Date 2023-06-08
Grant Date 2024-02-13
Owner UATC, LLC (USA)
Inventor Haban, Philipp

Abstract

Systems and methods for a moveable cover panel of an autonomous vehicle are provided. A vehicle can include a front panel disposed proximate to the front end of the passenger compartment, a vehicle motion control device located at the front panel, and a cover panel located at the front panel. The cover panel is moveable relative to the front panel between an isolating position and an exposing position. The cover panel can isolate the vehicle motion control device from the passenger compartment when in the isolating position and expose the vehicle motion control device to the passenger compartment when in the exposing position. A method can include obtaining vehicle data identifying an operational mode, state, and/or status of the vehicle, determining a first position of the cover panel, and initiating a positional change for the cover panel based on the vehicle data and the first position.

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60R 21/205 - Arrangements for storing inflatable members in their non-use or deflated condition; Arrangement or mounting of air bag modules or components in dashboards
  • B62D 1/183 - Steering columns yieldable or adjustable, e.g. tiltable adjustable between in-use and out-of-use positions, e.g. to improve access

47.

Continuous convolution and fusion in neural networks

      
Application Number 18153486
Grant Number 11880771
Status In Force
Filing Date 2023-01-12
First Publication Date 2023-06-01
Grant Date 2024-01-23
Owner UATC, LLC (USA)
Inventor
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Suo, Shun Da
  • Urtasun, Raquel
  • Liang, Ming

Abstract

Systems and methods are provided for machine-learned models including convolutional neural networks that generate predictions using continuous convolution techniques. For example, the systems and methods of the present disclosure can be included in or otherwise leveraged by an autonomous vehicle. In one example, a computing system can perform, with a machine-learned convolutional neural network, one or more convolutions over input data using a continuous filter relative to a support domain associated with the input data, and receive a prediction from the machine-learned convolutional neural network. A machine-learned convolutional neural network in some examples includes at least one continuous convolution layer configured to perform convolutions over input data with a parametric continuous kernel.
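The core idea of a parametric continuous convolution can be sketched in a few lines: instead of indexing a fixed grid kernel, a small network maps each continuous relative offset between a point and its neighbors to a kernel weight. The two-layer MLP, its sizes, and the random data below are illustrative assumptions, not the disclosed architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 16)), np.zeros(16)  # toy MLP hidden layer
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)   # toy MLP output layer

def kernel_mlp(offsets):
    """Map (k, 3) continuous 3-D offsets to (k, 1) kernel weights."""
    h = np.maximum(offsets @ W1 + b1, 0.0)  # ReLU hidden activation
    return h @ W2 + b2

def continuous_conv(points, feats, center_idx, neighbor_idx):
    """One output of a continuous convolution: neighbor features are
    aggregated, each weighted by the kernel evaluated at its offset."""
    offsets = points[neighbor_idx] - points[center_idx]
    weights = kernel_mlp(offsets)                     # (k, 1)
    return (weights * feats[neighbor_idx]).sum(axis=0)  # (f,)

pts = rng.normal(size=(10, 3))   # point locations in the support domain
fts = rng.normal(size=(10, 4))   # per-point input features
out = continuous_conv(pts, fts, 0, np.array([1, 2, 3]))
```

Because the kernel is a function of the offset rather than a grid lookup, the same layer applies to irregular point sets such as LIDAR sweeps.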

IPC Classes  ?

  • G06F 18/00 - Pattern recognition
  • G06N 3/084 - Backpropagation, e.g. using gradient descent
  • G06N 3/08 - Learning methods
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 3/045 - Combinations of networks
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

48.

Automatic Robotically Steered Sensor for Targeted High Performance Perception and Vehicle Control

      
Application Number 18159961
Status Pending
Filing Date 2023-01-26
First Publication Date 2023-06-01
Owner UATC, LLC (USA)
Inventor Calleija, Mark

Abstract

Disclosed are methods, systems, and non-transitory computer readable media that control an autonomous vehicle via at least two sensors. One aspect includes capturing an image of a scene ahead of the vehicle with a first sensor, identifying an object in the scene at a confidence level based on the image, determining that the confidence level of the identification is below a threshold, in response to the confidence level being below the threshold, directing a second sensor having a field of view smaller than that of the first sensor to generate a second image including a location of the identified object, further identifying the object in the scene based on the second image, and controlling the vehicle based on the further identification of the object.
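The two-sensor fallback described above reduces to a simple control flow: accept the wide-field detection when confident, otherwise steer the narrow-field sensor at the detection and re-identify. The threshold value and the capture callables are hypothetical stand-ins for the real perception stack.

```python
CONF_THRESHOLD = 0.8  # assumed value; the disclosure does not fix one

def identify(wide_capture, narrow_capture, threshold=CONF_THRESHOLD):
    """Return (label, confidence), escalating to the steerable
    narrow-FOV sensor only when the wide-FOV result is uncertain."""
    label, conf, location = wide_capture()
    if conf < threshold:
        # Direct the second sensor at the object's location and re-identify.
        label, conf, _ = narrow_capture(location)
    return label, conf

# Toy captures: the wide sensor is unsure, the targeted sensor resolves it.
result = identify(
    lambda: ("pedestrian", 0.5, (12.0, 3.0)),
    lambda loc: ("cyclist", 0.95, loc),
)
```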

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G08G 1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
  • G08G 1/015 - Detecting movement of traffic to be counted or controlled with provision for distinguishing between motor cars and cycles
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06F 18/25 - Fusion techniques
  • H04N 23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
  • G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level

49.

Discrete Decision Architecture for Motion Planning System of an Autonomous Vehicle

      
Application Number 18150880
Status Pending
Filing Date 2023-01-06
First Publication Date 2023-05-25
Owner UATC, LLC (USA)
Inventor
  • Phillips, Michael Lee
  • Burnette, Don
  • Gochev, Kalin Vasilev
  • Liemhetcharat, Somchaya
  • Dayanidhi, Harishma
  • Perko, Eric Michael
  • Wilkinson, Eric Lloyd
  • Green, Colin Jeffrey
  • Liu, Wei
  • Stentz, Anthony Joseph
  • Bradley, David Mcallister
  • Marden, Samuel Philip

Abstract

The present disclosure provides autonomous vehicle systems and methods that include or otherwise leverage a motion planning system that generates constraints as part of determining a motion plan for an autonomous vehicle (AV). In particular, a scenario generator within a motion planning system can generate constraints based on where objects of interest are predicted to be relative to an autonomous vehicle. A constraint solver can identify navigation decisions for each of the constraints that provide a consistent solution across all constraints. The solution provided by the constraint solver can be in the form of a trajectory path determined relative to constraint areas for all objects of interest. The trajectory path represents a set of navigation decisions such that a navigation decision relative to one constraint does not sacrifice an ability to satisfy a different navigation decision relative to one or more other constraints.
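The consistency requirement on the constraint solver can be illustrated with a toy model: treat each constraint as a set of admissible discrete decisions and keep only decisions admissible under all of them. The real solver searches over trajectory paths; this set intersection is only an assumed simplification of the idea.

```python
def solve(constraints):
    """constraints: list of sets of admissible decisions.
    Returns the decisions consistent across every constraint."""
    decisions = set(constraints[0])
    for constraint in constraints[1:]:
        decisions &= set(constraint)
    return decisions

# Hypothetical scenario: a crossing pedestrian and a parked car each
# admit some maneuvers; only one maneuver satisfies both.
consistent = solve([
    {"yield", "pass_left"},       # constraint from the pedestrian
    {"pass_left", "pass_right"},  # constraint from the parked car
])
```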

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 30/12 - Lane keeping
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/18 - Propelling the vehicle
  • G01C 21/20 - Instruments for performing navigational calculations
  • G01C 21/34 - Route searching; Route guidance

50.

Systems and Methods for Generating Basis Paths for Autonomous Vehicle Motion Control

      
Application Number 18158251
Status Pending
Filing Date 2023-01-23
First Publication Date 2023-05-25
Owner UATC, LLC (USA)
Inventor
  • Liu, Chenggang
  • Bradley, David Mcallister
  • Jia, Daoyuan

Abstract

Systems and methods for basis path generation are provided. In particular, a computing system can obtain a target nominal path. The computing system can determine a current pose for an autonomous vehicle. The computing system can determine, based at least in part on the current pose of the autonomous vehicle and the target nominal path, a lane change region. The computing system can determine one or more merge points on the target nominal path. The computing system can, for each respective merge point in the one or more merge points, generate a candidate basis path from the current pose of the autonomous vehicle to the respective merge point. The computing system can generate a suitability classification for each candidate basis path. The computing system can select one or more candidate basis paths based on the suitability classification for each respective candidate basis path in the plurality of candidate basis paths.
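The generate-classify-select flow in the abstract can be sketched as follows. Every helper here (`build_path`, `is_suitable`, the lateral-offset check) is a hypothetical stand-in, not the patented path generator or classifier.

```python
def select_basis_paths(pose, merge_points, build_path, is_suitable):
    """Generate a candidate basis path from the current pose to each
    merge point, classify each for suitability, keep the suitable ones."""
    candidates = [build_path(pose, mp) for mp in merge_points]
    return [path for path in candidates if is_suitable(path)]

paths = select_basis_paths(
    pose=(0.0, 0.0),
    merge_points=[(10.0, 1.0), (20.0, 4.0)],
    build_path=lambda pose, mp: [pose, mp],     # toy straight segment
    is_suitable=lambda p: abs(p[-1][1]) < 2.0,  # toy lateral-offset bound
)
```

Only the candidate ending within the assumed lateral bound survives the suitability classification.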

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

51.

Systems and methods to obtain feedback in response to autonomous vehicle failure events

      
Application Number 18154130
Grant Number 11900738
Status In Force
Filing Date 2023-01-13
First Publication Date 2023-05-11
Grant Date 2024-02-13
Owner UATC, LLC (USA)
Inventor
  • Nix, Molly Castle
  • Chin, Sean
  • Zhao, Dennis

Abstract

The present disclosure provides systems and methods to obtain feedback descriptive of autonomous vehicle failures. In particular, the systems and methods of the present disclosure can detect that a vehicle failure event occurred at an autonomous vehicle and, in response, provide an interactive user interface that enables a human located within the autonomous vehicle to enter feedback that describes the vehicle failure event. Thus, the systems and methods of the present disclosure can actively prompt and/or enable entry of feedback in response to a particular instance of a vehicle failure event, thereby enabling improved and streamlined collection of information about autonomous vehicle failures.

IPC Classes  ?

  • G06F 17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
  • B60K 35/00 - Arrangement or adaptations of instruments
  • G01C 21/36 - Input/output arrangements for on-board computers
  • G05D 1/02 - Control of position or course in two dimensions

52.

Three-Dimensional Object Detection

      
Application Number 17971069
Status Pending
Filing Date 2022-10-21
First Publication Date 2023-04-27
Owner UATC, LLC (USA)
Inventor
  • Liang, Ming
  • Yang, Bin
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Urtasun, Raquel

Abstract

Generally, the disclosed systems and methods implement improved detection of objects in three-dimensional (3D) space. More particularly, an improved 3D object detection system can exploit continuous fusion of multiple sensors and/or integrated geographic prior map data to enhance effectiveness and robustness of object detection in applications such as autonomous driving. In some implementations, geographic prior data (e.g., geometric ground and/or semantic road features) can be exploited to enhance three-dimensional object detection for autonomous vehicle applications. In some implementations, object detection systems and methods can be improved based on dynamic utilization of multiple sensor modalities. More particularly, an improved 3D object detection system can exploit both LIDAR systems and cameras to perform very accurate localization of objects within three-dimensional space relative to an autonomous vehicle. For example, multi-sensor fusion can be implemented via continuous convolutions to fuse image data samples and LIDAR feature maps at different levels of resolution.

IPC Classes  ?

  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 3/02 - Neural networks
  • G06V 20/64 - Three-dimensional objects
  • G06F 18/25 - Fusion techniques
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

53.

Systems and Methods for Vehicle Spatial Path Sampling

      
Application Number 18074770
Status Pending
Filing Date 2022-12-05
First Publication Date 2023-04-20
Owner UATC, LLC (USA)
Inventor
  • Bradley, David Mcallister
  • Liu, Chenggang
  • Jia, Daoyuan

Abstract

Systems and methods for vehicle spatial path sampling are provided. The method includes obtaining an initial travel path for an autonomous vehicle from a first location to a second location and vehicle configuration data indicative of one or more physical constraints of the autonomous vehicle. The method includes determining one or more secondary travel paths for the autonomous vehicle from the first location to the second location based on the initial travel path and the vehicle configuration data. The method includes generating a spatial envelope based on the one or more secondary travel paths that indicates a plurality of lateral offsets from the initial travel path. The method also includes generating a plurality of trajectories for the autonomous vehicle to travel from the first location to the second location such that each of the plurality of trajectories includes one or more lateral offsets identified by the spatial envelope.
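The role of the spatial envelope can be shown with a toy sampler: the envelope bounds how far each trajectory may deviate laterally from the initial path at each point. The constant candidate offsets and clipping rule are illustrative assumptions, not the disclosed sampling method.

```python
def sample_trajectories(initial_path, envelope, candidate_offsets):
    """initial_path: [(x, y)] points; envelope: max |lateral offset|
    allowed at each point; candidate_offsets: lateral shifts to try."""
    trajectories = []
    for d in candidate_offsets:
        traj = [
            (x, y + max(-e, min(d, e)))  # clip each offset to the envelope
            for (x, y), e in zip(initial_path, envelope)
        ]
        trajectories.append(traj)
    return trajectories

trajs = sample_trajectories(
    initial_path=[(0, 0.0), (1, 0.0), (2, 0.0)],
    envelope=[0.5, 1.0, 1.5],   # envelope widens along the path
    candidate_offsets=[-1.0, 0.0, 1.0],
)
```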

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

54.

LIDAR Fault Detection System

      
Application Number 18076081
Status Pending
Filing Date 2022-12-06
First Publication Date 2023-03-30
Owner UATC, LLC (USA)
Inventor
  • Vandenberg, III, Dirk John
  • Haslim, James Allen
  • Smith, Thomas Lawrence
  • Kenvarg, Adam David

Abstract

Aspects of the present disclosure involve systems, methods, and devices for fault detection in a Lidar system. A fault detection system obtains incoming Lidar data output by a Lidar system during operation of an AV system. The incoming Lidar data includes one or more data points corresponding to a fault detection target on an exterior of a vehicle of the AV system. The fault detection system accesses historical Lidar data that is based on data previously output by the Lidar system. The historical Lidar data corresponds to the fault detection target. The fault detection system performs a comparison of the incoming Lidar data with the historical Lidar data to identify any differences between the two sets of data. The fault detection system detects a fault condition occurring at the Lidar system based on the comparison.
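The comparison step at the heart of the fault detector can be sketched as a baseline check: range returns from the fixed fault detection target are compared against their historical baseline, and a large deviation flags a fault. The mean statistic and tolerance value are illustrative assumptions only.

```python
def detect_fault(incoming_ranges, historical_ranges, tolerance=0.05):
    """Compare incoming returns from the fault detection target against
    the historical baseline; flag a fault when the deviation exceeds
    the (assumed) tolerance in meters."""
    incoming_mean = sum(incoming_ranges) / len(incoming_ranges)
    baseline_mean = sum(historical_ranges) / len(historical_ranges)
    return abs(incoming_mean - baseline_mean) > tolerance

ok = detect_fault([1.00, 1.01, 0.99], [1.00, 1.00, 1.00])     # within tolerance
fault = detect_fault([1.30, 1.28, 1.31], [1.00, 1.00, 1.00])  # drifted returns
```

Because the target is rigidly mounted on the vehicle exterior, any drift in its apparent range is attributable to the sensor rather than the scene.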

IPC Classes  ?

  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G05D 1/02 - Control of position or course in two dimensions
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00

55.

Systems and methods for remote status detection of autonomous vehicles

      
Application Number 16220406
Grant Number 11614735
Status In Force
Filing Date 2018-12-14
First Publication Date 2023-03-28
Grant Date 2023-03-28
Owner UATC, LLC (USA)
Inventor
  • Poeppel, Scott C.
  • Zych, Noah
  • Wood, Matthew Shaw
  • Vandenberg, III, Dirk John

Abstract

Systems and methods are provided for remotely detecting a status associated with an autonomous vehicle and generating control actions in response to such detections. In one example, a computing system can access a third-party communication associated with an autonomous vehicle. The computing system can determine, based at least in part on the third-party communication, a predetermined identifier associated with the autonomous vehicle. The computing system can determine, based at least in part on the third-party communication, a status associated with the autonomous vehicle, and transmit one or more control messages to the autonomous vehicle based at least in part on the predetermined identifier and the status associated with the autonomous vehicle.

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G08G 1/017 - Detecting movement of traffic to be counted or controlled identifying vehicles
  • G05D 1/02 - Control of position or course in two dimensions
  • G07C 5/00 - Registering or indicating the working of vehicles

56.

System and methods for controlling state transitions using a vehicle controller

      
Application Number 17994764
Grant Number 11768490
Status In Force
Filing Date 2022-11-28
First Publication Date 2023-03-23
Grant Date 2023-09-26
Owner UATC, LLC (USA)
Inventor
  • Tschanz, Frederic
  • Naik, Maitreya Jayesh
  • Yanakiev, Diana
  • Greenfield, Aaron L.
  • Poeppel, Scott C.

Abstract

The present disclosure is directed to controlling state transitions in an autonomous vehicle. In particular, a computing system can initiate the autonomous vehicle into a no-authorization state upon startup. The computing system can receive an authorization request. The computing system can determine whether the authorization request includes a request to enter the first or second mode of operations, wherein the first mode of operations is associated with the autonomous vehicle being operated without a human operator and the second mode of operations is associated with the autonomous vehicle being operable by a human operator. The computing system can transition the autonomous vehicle from the no-authorization state into a standby state in response to determining the authorization request includes a request to enter the first mode of operations or into a manual-controlled state in response to determining the authorization request is a request to enter the second mode of operations.

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions

57.

Driving surface friction estimations using vehicle steering

      
Application Number 17942756
Grant Number 11919499
Status In Force
Filing Date 2022-09-12
First Publication Date 2023-03-16
Grant Date 2024-03-05
Owner
  • UATC, LLC (USA)
  • VOLVO CAR CORPORATION (Sweden)
Inventor
  • Poeppel, Scott C.
  • Jonasson, Mats

Abstract

Systems and methods are provided for generating data indicative of a friction associated with a driving surface, and for using the friction data in association with one or more vehicles. In one example, a computing system can detect a stop associated with a vehicle and initiate a steering action of the vehicle during the stop. The steering action is associated with movement of at least one tire of the vehicle relative to a driving surface. The computing system can obtain operational data associated with the steering action during the stop of the vehicle. The computing system can determine a friction associated with the driving surface based at least in part on the operational data associated with the steering action. The computing system can generate data indicative of the friction associated with the driving surface.
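The underlying idea (steer the tire in place during a stop and infer friction from the torque required to twist it against the road) can be sketched with a toy linear model. The calibration constant, the linear mapping, and the clamp range are illustrative assumptions; the disclosure does not specify this form.

```python
def estimate_friction(steering_torque_nm, normal_force_n, calibration=0.01):
    """Map measured static-steer torque to a friction estimate mu.
    Higher torque at a given tire load implies higher surface friction."""
    mu = calibration * steering_torque_nm / (normal_force_n / 1000.0)
    return max(0.0, min(mu, 1.5))  # clamp to a plausible range of mu

# Hypothetical measurements at the same tire load:
mu_dry = estimate_friction(steering_torque_nm=450.0, normal_force_n=5000.0)
mu_ice = estimate_friction(steering_torque_nm=60.0, normal_force_n=5000.0)
```

The low-torque case yields a much smaller friction estimate, which downstream planners could use to lengthen braking margins.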

IPC Classes  ?

  • B60T 8/1763 - Brake regulation specially adapted to prevent excessive wheel slip during vehicle deceleration, e.g. ABS responsive to the coefficient of friction between the wheels and the ground surface
  • B60T 8/171 - Detecting parameters used in the regulation; Measuring values used in the regulation
  • B60T 8/1755 - Brake regulation specially adapted to control the stability of the vehicle, e.g. taking into account yaw rate or transverse acceleration in a curve
  • B60W 40/068 - Road friction coefficient
  • B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G06F 9/54 - Interprogram communication

58.

Systems and methods for identifying unknown instances

      
Application Number 17967710
Grant Number 11769058
Status In Force
Filing Date 2022-10-17
First Publication Date 2023-02-23
Grant Date 2023-09-26
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Wong, Kelvin Ka Wing
  • Wang, Shenlong
  • Ren, Mengye
  • Liang, Ming

Abstract

Systems and methods of the present disclosure provide an improved approach for open-set instance segmentation by identifying both known and unknown instances in an environment. For example, a method can include receiving sensor point cloud input data including a plurality of three-dimensional points. The method can include determining a feature embedding and at least one of an instance embedding, class embedding, and/or background embedding for each of the plurality of three-dimensional points. The method can include determining a first subset of points associated with one or more known instances within the environment based on the class embedding and the background embedding associated with each point in the plurality of points. The method can include determining a second subset of points associated with one or more unknown instances within the environment based on the first subset of points. The method can include segmenting the input data into known and unknown instances.
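The split between known and unknown instances can be sketched with simple per-point thresholding. The `split_known_unknown` helper and its threshold are hypothetical simplifications of the embedding-based method described above:

```python
def split_known_unknown(class_confidences, background_scores, threshold=0.5):
    """Illustrative open-set split over a point cloud.

    class_confidences[i]: confidence that point i belongs to some known class.
    background_scores[i]: confidence that point i is background.
    Points that are clearly foreground but fit no known class are treated
    as belonging to unknown instances (a simplification of the embedding
    approach in the abstract).
    """
    known, unknown = [], []
    for i, (cls_conf, bg) in enumerate(zip(class_confidences, background_scores)):
        if bg >= threshold:
            continue  # background point: not part of any instance
        if cls_conf >= threshold:
            known.append(i)    # confidently matches a known class
        else:
            unknown.append(i)  # foreground, but no known class fits
    return known, unknown
```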

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06N 20/00 - Machine learning
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06N 3/084 - Backpropagation, e.g. using gradient descent
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries

59.

Systems and methods for generating synthetic light detection and ranging data via machine learning

      
Application Number 17958797
Grant Number 11734885
Status In Force
Filing Date 2022-10-03
First Publication Date 2023-02-09
Grant Date 2023-08-22
Owner UATC, LLC (USA)
Inventor
  • Manivasagam, Sivabalan
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Urtasun, Raquel

Abstract

The present disclosure provides systems and methods that combine physics-based systems with machine learning to generate synthetic LiDAR data that accurately mimics a real-world LiDAR sensor system. In particular, aspects of the present disclosure combine physics-based rendering with machine-learned models such as deep neural networks to simulate both the geometry and intensity of the LiDAR sensor. As one example, a physics-based ray casting approach can be used on a three-dimensional map of an environment to generate an initial three-dimensional point cloud that mimics LiDAR data. According to an aspect of the present disclosure, a machine-learned geometry model can predict one or more adjusted depths for one or more of the points in the initial three-dimensional point cloud, thereby generating an adjusted three-dimensional point cloud which more realistically simulates real-world LiDAR data.
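The two-stage structure (physics-based ray casting followed by a learned depth adjustment) can be sketched as a pipeline. Here the callable `depth_adjustment_model` stands in for the machine-learned geometry model, and the per-ray-depth interface is an illustrative assumption:

```python
def simulate_lidar_depths(raycast_depths, depth_adjustment_model):
    """Two-stage synthetic LiDAR sketch.

    Stage 1: physics-based ray casting over a 3D map yields an initial
    per-ray depth (raycast_depths). Stage 2: a machine-learned geometry
    model predicts a per-ray correction so the resulting point cloud
    better matches real sensor behavior. The "model" here is any callable
    mapping a depth to a correction.
    """
    return [d + depth_adjustment_model(d) for d in raycast_depths]
```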

IPC Classes

  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 17/05 - Geographic models
  • G06T 15/06 - Ray-tracing
  • G06N 20/00 - Machine learning
  • G05D 1/02 - Control of position or course in two dimensions
  • G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

60.

Deep Structured Scene Flow for Autonomous Devices

      
Application Number 17962624
Status Pending
Filing Date 2022-10-10
First Publication Date 2023-02-09
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Ma, Wei-Chiu
  • Wang, Shenlong
  • Xiong, Yuwen
  • Hu, Rui

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with motion flow estimation are provided. For example, scene data including representations of an environment over a first set of time intervals can be accessed. Extracted visual cues can be generated based on the representations and machine-learned feature extraction models. At least one of the machine-learned feature extraction models can be configured to generate a portion of the extracted visual cues based on a first set of the representations of the environment from a first perspective and a second set of the representations of the environment from a second perspective. The extracted visual cues can be encoded using energy functions. Three-dimensional motion estimates of object instances at time intervals subsequent to the first set of time intervals can be determined based on the energy functions and machine-learned inference models.

IPC Classes

  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G06T 7/00 - Image analysis
  • G06V 10/40 - Extraction of image or video features

61.

Systems and Methods for Detecting Surprise Movements of an Actor with Respect to an Autonomous Vehicle

      
Application Number 17962806
Status Pending
Filing Date 2022-10-10
First Publication Date 2023-02-09
Owner UATC, LLC (USA)
Inventor
  • Haynes, Galen Clark
  • Hogg, III, Charles R.
  • Shirdhar, Skanda
  • Traft, Neil

Abstract

Systems and methods for detecting a surprise or unexpected movement of an actor with respect to an autonomous vehicle are provided. An example computer-implemented method can include, for a first compute cycle, obtaining motion forecast data based on first sensor data collected with respect to an actor relative to an autonomous vehicle; and determining, based on the motion forecast data, failsafe region data representing an unexpected path or area where a likelihood of the actor following the unexpected path or entering the unexpected area is below a threshold. For a second compute cycle after the first compute cycle, the method can include obtaining second sensor data; determining, based on the second sensor data and the failsafe region data, that the actor has followed the unexpected path or entered the unexpected area; and in response to such determination, determining a deviation for controlling a movement of the autonomous vehicle.
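The two-cycle logic can be sketched as building a low-probability region on one compute cycle and checking it on the next. The grid-cell representation and the probability threshold below are illustrative assumptions, not details from the patent:

```python
def build_failsafe_region(cell_probabilities, threshold=0.01):
    """First compute cycle: from motion forecast data, collect the cells
    (standing in for paths/areas) the actor is very unlikely to enter."""
    return {cell for cell, p in cell_probabilities.items() if p < threshold}

def surprise_detected(failsafe_region, observed_cell):
    """Second compute cycle: an actor observed inside the failsafe region
    constitutes a surprise movement, after which a deviation for
    controlling the vehicle's motion would be determined."""
    return observed_cell in failsafe_region
```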

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

62.

Multi-Task Multi-Sensor Fusion for Three-Dimensional Object Detection

      
Application Number 17972249
Status Pending
Filing Date 2022-10-24
First Publication Date 2023-02-09
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Yang, Bin
  • Liang, Ming

Abstract

Provided are systems and methods that perform multi-task and/or multi-sensor fusion for three-dimensional object detection in furtherance of, for example, autonomous vehicle perception and control. In particular, according to one aspect of the present disclosure, example systems and methods described herein exploit simultaneous training of a machine-learned model ensemble relative to multiple related tasks to learn to perform more accurate multi-sensor 3D object detection. For example, the present disclosure provides an end-to-end learnable architecture with multiple machine-learned models that interoperate to reason about 2D and/or 3D object detection as well as one or more auxiliary tasks. According to another aspect of the present disclosure, example systems and methods described herein can perform multi-sensor fusion (e.g., fusing features derived from image data, light detection and ranging (LIDAR) data, and/or other sensor modalities) at both the point-wise and region of interest (ROI)-wise level, resulting in fully fused feature representations.

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 20/00 - Machine learning

63.

Providing actionable uncertainties in autonomous vehicles

      
Application Number 17952685
Grant Number 11860636
Status In Force
Filing Date 2022-09-26
First Publication Date 2023-01-26
Grant Date 2024-01-02
Owner UATC, LLC (USA)
Inventor
  • Gulino, Cole Christian
  • Ansari, Alexander Rashid

Abstract

Systems and methods are provided for detecting objects of interest. A computing system can input sensor data to one or more first machine-learned models associated with detecting objects external to an autonomous vehicle. The computing system can obtain as an output of the first machine-learned models, data indicative of one or more detected objects. The computing system can determine data indicative of at least one uncertainty associated with the one or more detected objects and input the data indicative of the one or more detected objects and the data indicative of the at least one uncertainty to one or more second machine-learned models. The computing system can obtain as an output of the second machine-learned models, data indicative of at least one prediction associated with the one or more detected objects. The at least one prediction can be based at least in part on the detected objects and the uncertainty.
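The two-model pipeline can be sketched as follows. Deriving uncertainty as one minus a confidence score is an illustrative assumption, as are the stub `detector` and `predictor` callables:

```python
def detect_then_predict(sensor_data, detector, predictor):
    """Sketch of the two-stage pipeline in the abstract.

    The first model(s) detect objects; each detection gets an associated
    uncertainty (here simply 1 - confidence, an assumption for
    illustration); the second model(s) consume both the detections and
    their uncertainties to produce predictions.
    """
    detections = detector(sensor_data)
    detections_with_uncertainty = [
        (obj, 1.0 - confidence) for obj, confidence in detections
    ]
    return predictor(detections_with_uncertainty)
```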

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06F 18/00 - Pattern recognition

64.

Systems and methods for training predictive models for autonomous devices

      
Application Number 17941335
Grant Number 11762391
Status In Force
Filing Date 2022-09-09
First Publication Date 2023-01-19
Grant Date 2023-09-19
Owner UATC, LLC (USA)
Inventor
  • Cui, Henggang
  • Wang, Junheng
  • Yalamanchi, Sai Bhargav
  • Moorthy, Mohana Prasad Sathya
  • Chou, Fang-Chieh
  • Djuric, Nemanja

Abstract

Systems and methods for training machine-learned models are provided. A method can include receiving a rasterized image associated with a training object and generating a predicted trajectory of the training object by inputting the rasterized image into a first machine-learned model. The method can include converting the predicted trajectory into a rasterized trajectory that spatially corresponds to the rasterized image. The method can include utilizing a second machine-learned model to determine an accuracy of the predicted trajectory based on the rasterized trajectory. The method can include determining an overall loss for the first machine-learned model based on the accuracy of the predicted trajectory as determined by the second machine-learned model. The method can include training the first machine-learned model by minimizing the overall loss for the first machine-learned model.

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 20/00 - Machine learning
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06N 3/084 - Backpropagation, e.g. using gradient descent

65.

Automated Delivery Systems for Autonomous Vehicles

      
Application Number 17940447
Status Pending
Filing Date 2022-09-08
First Publication Date 2023-01-05
Owner UATC, LLC (USA)
Inventor Kanitz, Daniel Adam

Abstract

Systems and methods are directed to automated delivery systems. In one example, a vehicle is provided including a drive system, a passenger cabin, and a delivery service pod provided relative to the passenger cabin. The delivery service pod includes an access unit configured to allow for loading and unloading of a plurality of delivery crates into the delivery service pod. The delivery service pod further includes a conveyor unit comprising multiple delivery crate holding positions, the delivery crate holding positions being defined by neighboring sidewalls spaced apart within the delivery service pod such that a respective delivery crate of the plurality of delivery crates can be positioned between neighboring sidewalls, wherein the conveyor unit is configured to be rotated to align each of the delivery crate holding positions with the access unit.
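Aligning a given crate holding position with the access unit is, in essence, a modular-arithmetic rotation. A small sketch (the index-based interface is an illustrative assumption, not taken from the patent):

```python
def rotation_steps(current_index, target_index, num_positions):
    """Forward steps needed to rotate the conveyor so the target crate
    holding position lines up with the access unit."""
    return (target_index - current_index) % num_positions

def shortest_rotation(current_index, target_index, num_positions):
    """Signed rotation (negative = reverse) minimizing conveyor travel."""
    forward = (target_index - current_index) % num_positions
    return forward if forward <= num_positions - forward else forward - num_positions
```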

IPC Classes

  • B60P 3/00 - Vehicles adapted to transport, to carry or to comprise special loads or objects
  • A47L 7/00 - Suction cleaners adapted for additional purposes; Tables with suction openings for cleaning purposes; Containers for cleaning articles by suction; Suction cleaners adapted to cleaning of brushes; Suction cleaners adapted to taking-up liquids
  • B25J 19/02 - Sensing devices
  • B25J 11/00 - Manipulators not otherwise provided for
  • B60S 1/64 - Other vehicle fittings for cleaning for cleaning vehicle interiors, e.g. built-in vacuum cleaners
  • B60P 1/36 - Vehicles predominantly for transporting loads and modified to facilitate loading, consolidating the load, or unloading using endless chains or belts thereon

66.

Multiple stage image based object detection and recognition

      
Application Number 17942898
Grant Number 11922708
Status In Force
Filing Date 2022-09-12
First Publication Date 2023-01-05
Grant Date 2024-03-05
Owner UATC, LLC (USA)
Inventor
  • Vallespi-Gonzalez, Carlos
  • Amato, Joseph Lawrence
  • Totolos, Jr., George

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices for autonomous vehicle operation are provided. For example, a computing system can receive object data that includes portions of sensor data. The computing system can determine, in a first stage of a multiple stage classification using hardware components, one or more first stage characteristics of the portions of sensor data based on a first machine-learned model. In a second stage of the multiple stage classification, the computing system can determine second stage characteristics of the portions of sensor data based on a second machine-learned model. The computing system can generate an object output based on the first stage characteristics and the second stage characteristics. The object output can include indications associated with detection of objects in the portions of sensor data.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
  • G06F 18/243 - Classification techniques relating to the number of classes
  • G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks
  • G06N 20/00 - Machine learning
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 15/08 - Volume rendering
  • G06V 10/28 - Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
  • G06V 10/50 - Extraction of image or video features by summing image-intensity values; Projection analysis
  • G06V 10/56 - Extraction of image or video features relating to colour
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/64 - Three-dimensional objects
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

67.

ONLINE LIDAR INTENSITY NORMALIZATION

      
Application Number 17889865
Status Pending
Filing Date 2022-08-17
First Publication Date 2022-12-22
Owner UATC, LLC (USA)
Inventor
  • Hu, Xiaoyan
  • Liu, Baoan

Abstract

Aspects of the present disclosure involve a vehicle computer system comprising a computer-readable storage medium storing a set of instructions, and a method for online light detection and ranging (Lidar) intensity normalization. Consistent with some embodiments, the method may include accumulating point data output by a channel of a Lidar unit during operation of an autonomous or semi-autonomous vehicle. The accumulated point data includes raw intensity values that correspond to a particular surface type. The method further includes calculating a median intensity value based on the raw intensity values and generating an intensity normalization multiplier for the channel based on the median intensity value. The intensity normalization multiplier, when applied to the median intensity value, results in a reflectivity value that corresponds to the particular surface type. The method further includes applying the intensity normalization multiplier to the point data output by the channel to produce normalized intensity values.
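The multiplier computation described above is concrete enough to sketch directly: take the median of raw intensities accumulated over a known surface type, and scale so that the median maps to that surface's expected reflectivity. The helper names below are illustrative:

```python
def median(values):
    # Plain median: middle element, or mean of the two middle elements.
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

def intensity_normalization_multiplier(raw_intensities, target_reflectivity):
    """The multiplier maps the channel's median raw intensity (accumulated
    over a known surface type) to that surface's expected reflectivity."""
    return target_reflectivity / median(raw_intensities)

def normalize_intensities(raw_intensities, multiplier):
    # Apply the per-channel multiplier to the channel's point data.
    return [i * multiplier for i in raw_intensities]
```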

IPC Classes

  • G01S 17/87 - Combinations of systems using electromagnetic waves other than radio waves
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 17/06 - Systems determining position data of a target
  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

68.

Power and thermal management systems and methods for autonomous vehicles

      
Application Number 17893829
Grant Number 11842639
Status In Force
Filing Date 2022-08-23
First Publication Date 2022-12-22
Grant Date 2023-12-12
Owner UATC, LLC (USA)
Inventor
  • Rice, David Patrick
  • Boehmke, Scott Klaus

Abstract

Systems and methods for power and thermal management of autonomous vehicles are provided. In one example embodiment, a computing system includes processor(s) and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the processor(s) cause the computing system to perform operations. The operations include obtaining data associated with an autonomous vehicle. The operations include identifying one or more vehicle parameters associated with the autonomous vehicle based at least in part on the data associated with the autonomous vehicle. The operations include determining a modification to one or more operating characteristics of one or more systems onboard the autonomous vehicle based at least in part on the one or more vehicle parameters. The operations include controlling a heat generation of at least a portion of the autonomous vehicle via implementation of the modification of the operating characteristic(s) of the system(s) onboard the autonomous vehicle.

IPC Classes

  • G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
  • B60H 1/00 - Heating, cooling or ventilating devices
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

69.

Trajectory prediction for autonomous devices

      
Application Number 17873288
Grant Number 11851087
Status In Force
Filing Date 2022-07-26
First Publication Date 2022-12-08
Grant Date 2023-12-26
Owner UATC, LLC (USA)
Inventor
  • Djuric, Nemanja
  • Yalamanchi, Sai Bhargav
  • Haynes, Galen Clark
  • Huang, Tzu-Kuo

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with trajectory prediction are provided. For example, trajectory data and goal path data can be accessed. The trajectory data can be associated with an object's predicted trajectory. The predicted trajectory can include waypoints associated with waypoint position uncertainty distributions that can be based on an expectation maximization technique. The goal path data can be associated with a goal path and include locations the object is predicted to travel. Solution waypoints for the object can be determined based on application of optimization techniques to the waypoints and waypoint position uncertainty distributions. The optimization techniques can include operations to maximize the probability of each of the solution waypoints. Stitched trajectory data can be generated based on the solution waypoints. The stitched trajectory data can be associated with portions of the solution waypoints and the goal path.
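One way to picture the waypoint optimization is as precision weighting between a predicted waypoint and the goal path: the solution waypoint stays close to the prediction when its position uncertainty is small and is pulled toward the goal path when it is large. This is only a sketch of the intuition, not the patented optimization technique:

```python
def stitch_waypoint(predicted_xy, goal_xy, predicted_variance):
    """Illustrative precision weighting between a predicted waypoint and
    the corresponding goal-path location. The weighting scheme is an
    assumption for illustration; the abstract describes maximizing the
    probability of each solution waypoint under its uncertainty
    distribution."""
    w = 1.0 / (1.0 + predicted_variance)  # confident prediction => w near 1
    return tuple(w * p + (1.0 - w) * g for p, g in zip(predicted_xy, goal_xy))
```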

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks

70.

Probabilistic prediction of dynamic object behavior for autonomous vehicles

      
Application Number 16777108
Grant Number 11521396
Status In Force
Filing Date 2020-01-30
First Publication Date 2022-12-06
Grant Date 2022-12-06
Owner UATC, LLC (USA)
Inventor
  • Jain, Ajay
  • Casas, Sergio
  • Liao, Renjie
  • Xiong, Yuwen
  • Feng, Song
  • Segal, Sean
  • Urtasun, Raquel

Abstract

Systems and methods are described that probabilistically predict dynamic object behavior. In particular, in contrast to existing systems which attempt to predict object trajectories directly (e.g., directly predict a specific sequence of well-defined states), a probabilistic approach is instead leveraged that predicts discrete probability distributions over object state at each of a plurality of time steps. In one example, systems and methods predict future states of dynamic objects (e.g., pedestrians) such that an autonomous vehicle can plan safer actions/movement.
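The output structure (a discrete probability distribution over object states at each time step, rather than a single well-defined trajectory) can be sketched as follows; the dict-per-step representation is an illustrative choice:

```python
def most_likely_states(per_step_distributions):
    """Each time step carries a discrete distribution over candidate
    states (e.g., occupancy-grid cells), here as {state: probability}
    dicts. Taking the argmax per step gives a point estimate, while a
    planner can consume the full distributions to choose safer actions."""
    return [max(dist, key=dist.get) for dist in per_step_distributions]
```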

IPC Classes

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G06N 20/00 - Machine learning
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

71.

High quality instance segmentation

      
Application Number 17878408
Grant Number 11734828
Status In Force
Filing Date 2022-08-01
First Publication Date 2022-12-01
Grant Date 2023-08-22
Owner UATC, LLC (USA)
Inventor
  • Homayounfar, Namdar
  • Xiong, Yuwen
  • Liang, Justin
  • Ma, Wei-Chiu
  • Urtasun, Raquel

Abstract

Disclosed herein are methods and systems for performing instance segmentation that can provide improved estimation of object boundaries. Implementations can include a machine-learned segmentation model trained to estimate an initial object boundary based on a truncated signed distance function (TSDF) generated by the model. The model can also generate outputs for optimizing the TSDF over a series of iterations to produce a final TSDF that can be used to determine the segmentation mask.

IPC Classes

  • G06T 7/11 - Region-based segmentation
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
  • G06F 18/24 - Classification techniques
  • G06F 18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

72.

Sensor Validation and Calibration

      
Application Number 17870711
Status Pending
Filing Date 2022-07-21
First Publication Date 2022-11-24
Owner UATC, LLC (USA)
Inventor
  • Travnikar, Marek Vladimir
  • Lutz, Kyle

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with radar validation and calibration are provided. For example, target positions for targets can be determined based on imaging devices. The targets can be located at respective predetermined positions relative to the imaging devices. Radar detections of the targets can be generated based on radar devices. The radar devices can be located at a predetermined position relative to the imaging devices. Filtered radar detections can be generated based on performance of filtering operations on the radar detections. A detection error can be determined for the radar devices based on calibration operations performed using the filtered radar detections and the target positions determined based on the one or more imaging devices. Furthermore, the radar devices can be calibrated based on the detection error.
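The detection-error computation can be sketched as a mean Euclidean error between paired radar detections and imaging-derived target positions; pairing by order, as below, is a simplifying assumption (a real setup would associate detections with targets first):

```python
def mean_detection_error(filtered_radar_xy, target_xy):
    """Pair each filtered radar detection with the corresponding target
    position determined from the imaging devices and average the
    Euclidean error; a scalar like this can drive the calibration step."""
    errors = [
        ((rx - tx) ** 2 + (ry - ty) ** 2) ** 0.5
        for (rx, ry), (tx, ty) in zip(filtered_radar_xy, target_xy)
    ]
    return sum(errors) / len(errors)
```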

IPC Classes

  • G01S 7/40 - Means for monitoring or calibrating
  • G01S 7/292 - Extracting wanted echo-signals
  • G01S 13/42 - Simultaneous measurement of distance and other coordinates
  • G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
  • G01S 5/14 - Determining absolute distances from a plurality of spaced points of known location

73.

Image Based Localization System

      
Application Number 17833414
Status Pending
Filing Date 2022-06-06
First Publication Date 2022-10-06
Owner UATC, LLC (USA)
Inventor
  • Urtasun, Raquel
  • Martinez Covarrubias, Julieta
  • Wang, Shenlong
  • Fan, Hongbo

Abstract

Systems and methods for determining a location based on image data are provided. A method can include receiving, by a computing system, a query image depicting a surrounding environment of a vehicle. The query image can be input into a machine-learned image embedding model and a machine-learned feature extraction model to obtain a query embedding and a query feature representation, respectively. The method can include identifying a subset of candidate embeddings that have embeddings similar to the query embedding. The method can include obtaining a respective feature representation for each image associated with the subset of candidate embeddings. The method can include determining a set of relative displacements between each image associated with the subset of candidate embeddings and the query image and determining a localized state of a vehicle based at least in part on the set of relative displacements.
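The retrieval step (identifying candidate embeddings similar to the query embedding) can be sketched as a nearest-neighbor search; squared Euclidean distance and the dict of named embeddings are illustrative choices:

```python
def retrieve_nearest(query_embedding, candidate_embeddings, k=2):
    """Rank stored candidate embeddings by squared Euclidean distance to
    the query embedding and keep the k closest. Their associated feature
    representations would then be compared against the query's to
    estimate relative displacements and localize the vehicle."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    ranked = sorted(candidate_embeddings,
                    key=lambda name: sqdist(query_embedding,
                                            candidate_embeddings[name]))
    return ranked[:k]
```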

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

74.

Automated vehicle seats

      
Application Number 17843229
Grant Number 11642985
Status In Force
Filing Date 2022-06-17
First Publication Date 2022-10-06
Grant Date 2023-05-09
Owner UATC, LLC (USA)
Inventor
  • D'Eramo, Christopher Matthew
  • Chan, Nicolas Bryan
  • Urbanski, Janna

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with the operation of a vehicle are provided. For example, a vehicle computing system can receive occupancy data that includes information associated with occupancy of a vehicle that includes seats. One or more states of the vehicle can be determined. The states of the vehicle can include a disposition of any object that is within the vehicle. Further, a configuration of the seats in the vehicle can be determined based on the occupancy data and the states of the vehicle. The configuration can include a disposition of the seats inside the vehicle. Furthermore, at least one of the seats can be adjusted based on the configuration that was determined.

IPC Classes

  • B60N 2/01 - Arrangement of seats relative to one another
  • B60N 2/02 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
  • B60N 2/06 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable the whole seat being movable slidable
  • B60N 3/00 - Arrangements or adaptations of other passenger fittings, not otherwise provided for
  • A47C 3/04 - Stackable chairs or nesting chairs
  • B60N 2/00 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
  • B60N 2/14 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable the whole seat being movable rotatable, e.g. to permit easy access

75.

Systems and Methods For Deploying Warning Devices From an Autonomous Vehicle

      
Application Number 17843278
Status Pending
Filing Date 2022-06-17
First Publication Date 2022-10-06
Owner UATC, LLC (USA)
Inventor
  • Vawter, Zac
  • Woodrow, Alden James

Abstract

Systems and methods are directed to deploying warning devices by an autonomous vehicle. In one example, a system includes one or more processors and memory including instructions that, when executed by the one or more processors, cause the one or more processors to perform operations. The operations include obtaining data indicating that a vehicle stop maneuver is to be implemented for an autonomous vehicle. The operations further include determining whether one or more warning devices should be dispensed from the autonomous vehicle during the vehicle stop maneuver based in part on the obtained data. The operations further include, in response to determining one or more warning devices should be dispensed from the autonomous vehicle, determining a dispensing maneuver for the one or more warning devices, and providing one or more command signals to one or more vehicle systems to perform the dispensing maneuver for the one or more warning devices.

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • B60Q 7/00 - Arrangement or adaptation of portable emergency signal devices on vehicles

76.

Configuring motion planning for a self-driving tractor unit

      
Application Number 17836110
Grant Number 11669091
Status In Force
Filing Date 2022-06-09
First Publication Date 2022-09-22
Grant Date 2023-06-06
Owner UATC, LLC (USA)
Inventor
  • Wood, Matthew Shaw
  • Sun, Nancy Yung-Hui
  • Vawter, Zachias Rauch

Abstract

A control system of a self-driving tractor can access sensor data to determine a set of trailer configuration parameters of a cargo trailer coupled to the self-driving tractor. Based on the set of trailer configuration parameters, the control system can configure a motion planning model for autonomously controlling the acceleration, braking, and steering systems of the tractor.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
  • B62D 13/00 - Steering specially adapted for trailers
  • G05D 1/02 - Control of position or course in two dimensions
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

77.

System and method for determining object intention through visual attributes

      
Application Number 17731722
Grant Number 11926337
Status In Force
Filing Date 2022-04-28
First Publication Date 2022-09-15
Grant Date 2024-03-12
Owner UATC, LLC (USA)
Inventor
  • Frossard, Davi Eugenio Nascimento
  • Kee, Eric Randall
  • Urtasun, Raquel

Abstract

Systems and methods for determining object intentions through visual attributes are provided. A method can include determining, by a computing system, one or more regions of interest. The regions of interest can be associated with the surrounding environment of a first vehicle. The method can include determining, by a computing system, spatial features and temporal features associated with the regions of interest. The spatial features can be indicative of a vehicle orientation associated with a vehicle of interest. The temporal features can be indicative of a semantic state associated with signal lights of the vehicle of interest. The method can include determining, by the computing system, a vehicle intention. The vehicle intention can be based on the spatial and temporal features. The method can include initiating, by the computing system, an action. The action can be based on the vehicle intention.

IPC Classes

  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

78.

Multi-task machine-learned models for object intention determination in autonomous driving

      
Application Number 17749841
Grant Number 11794785
Status In Force
Filing Date 2022-05-20
First Publication Date 2022-09-15
Grant Date 2023-10-24
Owner UATC, LLC (USA)
Inventor
  • Casas, Sergio
  • Luo, Wenjie
  • Urtasun, Raquel

Abstract

Generally, the disclosed systems and methods utilize multi-task machine-learned models for object intention determination in autonomous driving applications. For example, a computing system can receive sensor data obtained relative to an autonomous vehicle and map data associated with a surrounding geographic environment of the autonomous vehicle. The sensor data and map data can be provided as input to a machine-learned intent model. The computing system can receive a jointly determined prediction from the machine-learned intent model for multiple outputs including at least one detection output indicative of one or more objects detected within the surrounding environment of the autonomous vehicle, a first corresponding forecasting output descriptive of a trajectory indicative of an expected path of the one or more objects towards a goal location, and/or a second corresponding forecasting output descriptive of a discrete behavior intention determined from a predefined group of possible behavior intentions.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 20/00 - Machine learning
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

79.

Autonomous vehicle sensor cleaning system

      
Application Number 17825014
Grant Number 11801808
Status In Force
Filing Date 2022-05-26
First Publication Date 2022-09-08
Grant Date 2023-10-31
Owner UATC, LLC (USA)
Inventor
  • Rice, Wesly Mason
  • Wittenstein, Nikolaus
  • Jin, Zhizhuo
  • Smith, Paul Kevin
  • Kennelly, Sean Joseph
  • Brueckner, Peter
  • Rice, David Patrick

Abstract

The present disclosure provides a sensor cleaning system that cleans one or more sensors of an autonomous vehicle. Each sensor can have one or more corresponding sensor cleaning units that are configured to clean such sensor using a fluid (e.g., a gas or a liquid). Thus, the sensor cleaning system can include both a gas cleaning system and a liquid cleaning system. According to one aspect, the sensor cleaning system can provide individualized cleaning of the autonomous vehicle sensors. According to another aspect, a liquid cleaning system can be pressurized or otherwise powered by the gas cleaning system or other gas system.

IPC Classes

  • B60S 1/48 - Liquid supply therefor
  • B60S 1/52 - Arrangement of nozzles
  • B60S 1/54 - Cleaning windscreens, windows, or optical devices using gas, e.g. hot air
  • B60S 1/56 - Cleaning windscreens, windows, or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens

80.

Systems and methods for providing a ridesharing vehicle service using an autonomous vehicle

      
Application Number 17668510
Grant Number 11835952
Status In Force
Filing Date 2022-02-10
First Publication Date 2022-08-25
Grant Date 2023-12-05
Owner UATC, LLC (USA)
Inventor Donnelly, Richard Brian

Abstract

Systems and methods for providing an autonomous vehicle service are provided. A method can include obtaining data indicative of a service associated with a user, and obtaining data indicative of a transportation of an autonomous robot. The method can include determining one or more service configurations for the service. The method can include obtaining data indicative of a selected service configuration from among the one or more service configurations, and determining a service assignment for an autonomous vehicle based at least in part on the selected service configuration. The service assignment can indicate that the autonomous vehicle is to transport the user from a service-start location to a service-end location. The method can include communicating data indicative of the service assignment to the autonomous vehicle to perform the service.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G01C 21/34 - Route searching; Route guidance
  • G06Q 10/02 - Reservations, e.g. for tickets, services or events
  • G06Q 50/30 - Transportation; Communications

81.

Multiple Stage Image Based Object Detection and Recognition

      
Application Number 17733688
Status Pending
Filing Date 2022-04-29
First Publication Date 2022-08-18
Owner UATC, LLC (USA)
Inventor
  • Amato, Joseph Lawrence
  • Djuric, Nemanja
  • Gautam, Shivam
  • Mohta, Abhishek

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices for autonomous vehicle operation are provided. For example, a computing system can receive object data that includes portions of sensor data. The computing system can determine, in a first stage of a multiple stage classification using hardware components, one or more first stage characteristics of the portions of sensor data based on a first machine-learned model. In a second stage of the multiple stage classification, the computing system can determine second stage characteristics of the portions of sensor data based on a second machine-learned model. The computing system can generate an object output based on the first stage characteristics and the second stage characteristics. The object output can include indications associated with detection of objects in the portions of sensor data.
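
A cascade of this shape can be sketched in a few lines. The two stage "models" below are caller-supplied stand-ins (a scoring function and a refinement function), not the machine-learned models or hardware pipeline the abstract describes:

```python
# Illustrative two-stage detection cascade: a cheap first stage filters
# candidate sensor-data segments, and a heavier second stage classifies
# only the survivors. Both stages are placeholder callables.
from typing import Callable, List, Tuple

def two_stage_classify(
    segments: List[dict],
    first_stage: Callable[[dict], float],   # fast objectness score
    second_stage: Callable[[dict], str],    # expensive class label
    keep_threshold: float = 0.5,
) -> List[Tuple[dict, str]]:
    detections = []
    for seg in segments:
        # Stage 1: discard unlikely object candidates early.
        if first_stage(seg) < keep_threshold:
            continue
        # Stage 2: full classification on surviving candidates only.
        detections.append((seg, second_stage(seg)))
    return detections
```

The point of the cascade is that the second stage runs on a small fraction of the input, which is what makes a hardware-accelerated first stage worthwhile.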

IPC Classes

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06N 7/00 - Computing arrangements based on specific mathematical models
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 15/08 - Volume rendering
  • G06N 20/00 - Machine learning
  • G06V 10/28 - Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
  • G06V 10/50 - Extraction of image or video features by summing image-intensity values; Projection analysis
  • G06V 10/56 - Extraction of image or video features relating to colour
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/64 - Three-dimensional objects

82.

Systems and methods for generating synthetic sensor data via machine learning

      
Application Number 17727085
Grant Number 11797407
Status In Force
Filing Date 2022-04-22
First Publication Date 2022-08-18
Grant Date 2023-10-24
Owner UATC, LLC (USA)
Inventor
  • Manivasagam, Sivabalan
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Wong, Kelvin Ka Wing
  • Zeng, Wenyuan
  • Urtasun, Raquel

Abstract

The present disclosure provides systems and methods that combine physics-based systems with machine learning to generate synthetic LiDAR data that accurately mimics a real-world LiDAR sensor system. In particular, aspects of the present disclosure combine physics-based rendering with machine-learned models such as deep neural networks to simulate both the geometry and intensity of the LiDAR sensor. As one example, a physics-based ray casting approach can be used on a three-dimensional map of an environment to generate an initial three-dimensional point cloud that mimics LiDAR data. According to an aspect of the present disclosure, a machine-learned model can predict one or more dropout probabilities for one or more of the points in the initial three-dimensional point cloud, thereby generating an adjusted three-dimensional point cloud which more realistically simulates real-world LiDAR data.
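
The ray-drop step can be illustrated with a toy stand-in for the learned model: each raycast point gets a dropout probability, and points are removed by sampling. The distance-based predictor below is a placeholder assumption, not the paper's network:

```python
# Minimal sketch of ray dropout for synthetic LiDAR: keep or drop each
# raycast point according to a per-point dropout probability. The
# predictor here is a distance heuristic standing in for a learned model.
import random

def dropout_probability(point):
    """Placeholder predictor: farther points are more likely to drop."""
    x, y, z = point
    dist = (x * x + y * y + z * z) ** 0.5
    return min(0.9, dist / 100.0)

def apply_ray_drop(points, rng=None):
    """Sample a keep/drop mask over the initial point cloud."""
    rng = rng or random.Random(0)
    return [p for p in points if rng.random() >= dropout_probability(p)]
```

Replacing `dropout_probability` with a model conditioned on local geometry and intensity is what moves this from heuristic to the learned approach the abstract describes.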

IPC Classes

  • G06F 11/263 - Generation of test inputs, e.g. test vectors, patterns or sequences
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06F 17/18 - Complex mathematical operations for evaluating statistical data
  • G06N 20/00 - Machine learning
  • G06T 15/06 - Ray-tracing
  • G06N 3/047 - Probabilistic or stochastic networks

83.

Movable Front Shield for Vehicles

      
Application Number 17724902
Status Pending
Filing Date 2022-04-20
First Publication Date 2022-08-04
Owner UATC, LLC (USA)
Inventor
  • Haban, Philipp
  • Kanitz, Daniel Adam

Abstract

In one example embodiment, a computer-implemented method for autonomous vehicle control includes determining whether a cargo container is attached to the vehicle base. The method includes controlling a front shield associated with an autonomous vehicle to move from a closed position to an opened position when a cargo container is determined to be attached to a vehicle base associated with the autonomous vehicle. The method includes controlling the front shield to move from the opened position to the closed position when the cargo container is not attached to the vehicle base.

IPC Classes

  • B62D 33/08 - Superstructures for load-carrying vehicles characterised by the connection of the superstructure to the vehicle frame comprising adjustable means
  • G05D 1/02 - Control of position or course in two dimensions
  • B62D 35/00 - Vehicle bodies characterised by streamlining
  • B60Q 1/28 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating front of vehicle

84.

Association and Tracking for Autonomous Devices

      
Application Number 17725879
Status Pending
Filing Date 2022-04-21
First Publication Date 2022-08-04
Owner UATC, LLC (USA)
Inventor
  • Gautam, Shivam
  • Becker, Brian C.
  • Vallespi-Gonzalez, Carlos
  • Gulino, Cole Christian

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with object association and tracking are provided. Input data can be obtained. The input data can be indicative of a detected object within a surrounding environment of an autonomous vehicle and an initial object classification of the detected object at an initial time interval and object tracks at time intervals preceding the initial time interval. Association data can be generated based on the input data and a machine-learned model. The association data can indicate whether the detected object is associated with at least one of the object tracks. An object classification probability distribution can be determined based on the association data. The object classification probability distribution can indicate a probability that the detected object is associated with each respective object classification. The association data and the object classification probability distribution for the detected object can be outputted.
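
The two outputs in this abstract — an association decision and a classification distribution — can be sketched with a greedy matcher and a softmax. The scoring function and class set are illustrative assumptions, not the patent's learned model:

```python
# Toy association + classification step: match a detection to the best-
# scoring existing track (or none), then normalize per-class scores into
# a probability distribution.
import math

def associate(detection, tracks, score_fn, min_score=0.3):
    """Return the index of the best-matching track, or None if no track
    scores above min_score."""
    best_idx, best_score = None, min_score
    for i, track in enumerate(tracks):
        s = score_fn(detection, track)
        if s > best_score:
            best_idx, best_score = i, s
    return best_idx

def class_distribution(scores):
    """Softmax over raw class scores -> classification distribution."""
    m = max(scores.values())
    exps = {c: math.exp(s - m) for c, s in scores.items()}
    z = sum(exps.values())
    return {c: e / z for c, e in exps.items()}
```

A `None` association corresponds to spawning a new track; the distribution lets downstream consumers reason about classification uncertainty rather than a hard label.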

IPC Classes

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06T 7/20 - Analysis of motion
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

85.

Determining specular reflectivity characteristics using LiDAR

      
Application Number 17725867
Grant Number 11960028
Status In Force
Filing Date 2022-04-21
First Publication Date 2022-08-04
Grant Date 2024-04-16
Owner UATC, LLC (USA)
Inventor Dylewski, Scott

Abstract

Aspects of the present disclosure involve systems, methods, and devices for determining specular reflectivity characteristics of objects using a LiDAR system of an autonomous vehicle (AV) system. A method includes transmitting at least two light signals directed at a target object utilizing the LiDAR system of the AV system. The method further includes determining at least two reflectivity values for the target object based on return signals corresponding to the at least two light signals. The method further includes classifying specular reflectivity characteristics of the target object based on a comparison of the first and second reflectivity values. The method further includes updating a motion plan for the AV system based on the specular reflectivity characteristics of the target object.
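
The comparison at the heart of this abstract admits a compact sketch: a diffuse (Lambertian) surface returns similar reflectivity to nearby beams, while a specular one varies sharply. The ratio threshold and labels below are assumptions, not values from the patent:

```python
# Hedged sketch of specularity classification from two LiDAR reflectivity
# measurements of the same target. Threshold is an illustrative assumption.
def classify_specularity(reflectivity_a: float, reflectivity_b: float,
                         ratio_threshold: float = 2.0) -> str:
    lo, hi = sorted((reflectivity_a, reflectivity_b))
    if lo <= 0:
        # One beam saw essentially no return: strongly angle-dependent.
        return "specular"
    return "specular" if hi / lo >= ratio_threshold else "diffuse"
```

A planner can then treat a "specular" label as a cue that the object (e.g., a mirror-like truck panel) may be poorly observed from some angles.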

IPC Classes

  • G01C 3/08 - Use of electric radiation detectors
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 17/93 - Lidar systems, specially adapted for specific applications for anti-collision purposes

86.

Autonomous vehicle compatible robot

      
Application Number 17718810
Grant Number 11841709
Status In Force
Filing Date 2022-04-12
First Publication Date 2022-07-28
Grant Date 2023-12-12
Owner UATC, LLC (USA)
Inventor Donnelly, Richard Brian

Abstract

An autonomous robot is provided. In one example embodiment, an autonomous robot can include a main body including one or more compartments. The one or more compartments can be configured to provide support for transporting an item. The autonomous robot can include a mobility assembly affixed to the main body and a sensor configured to obtain sensor data associated with a surrounding environment of the autonomous robot. The autonomous robot can include a computing system configured to plan a motion of the autonomous robot based at least in part on the sensor data. The computing system can be operably connected to the mobility assembly for controlling a motion of the autonomous robot. The autonomous robot can include a coupling assembly configured to temporarily secure the autonomous robot to an autonomous vehicle. The autonomous robot can include a power system and a ventilation system that can interface with the autonomous vehicle.

IPC Classes

  • G06Q 20/18 - Payment architectures involving self-service terminals [SST], vending machines, kiosks or multimedia terminals
  • G06Q 10/083 - Shipping
  • B25J 9/08 - Programme-controlled manipulators characterised by modular constructions
  • G05D 1/02 - Control of position or course in two dimensions
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

87.

Pallet system for cargo transport

      
Application Number 17712602
Grant Number 11912517
Status In Force
Filing Date 2022-04-04
First Publication Date 2022-07-21
Grant Date 2024-02-27
Owner UATC, LLC (USA)
Inventor
  • Haban, Philipp
  • Kanitz, Daniel Adam

Abstract

In one example embodiment, a computer-implemented method for transporting cargo using smart pallets includes determining receipt of a first cargo onto a platform of a first smart pallet at a first distribution hub. The method includes generating one or more signals that control a loading of the first smart pallet and the first cargo onto a trailer located at the first distribution hub. The method includes determining a coordination with one or more second smart pallets associated with the trailer to determine a first position inside the trailer for the first smart pallet and the first cargo. The method includes generating one or more signals that position the first smart pallet and the first cargo at the first position inside the trailer.

IPC Classes

  • G06F 7/00 - Methods or arrangements for processing data by operating upon the order or content of the data handled
  • B65G 67/04 - Loading land vehicles
  • G06Q 50/28 - Logistics, e.g. warehousing, loading, distribution or shipping
  • B65G 43/10 - Sequence control of conveyors operating in combination
  • G05B 15/02 - Systems controlled by a computer electric

88.

Three-dimensional object detection

      
Application Number 17571845
Grant Number 11768292
Status In Force
Filing Date 2022-01-10
First Publication Date 2022-07-07
Grant Date 2023-09-26
Owner UATC, LLC (USA)
Inventor
  • Liang, Ming
  • Yang, Bin
  • Wang, Shenlong
  • Ma, Wei-Chiu
  • Urtasun, Raquel

Abstract

Generally, the disclosed systems and methods implement improved detection of objects in three-dimensional (3D) space. More particularly, an improved 3D object detection system can exploit continuous fusion of multiple sensors and/or integrated geographic prior map data to enhance effectiveness and robustness of object detection in applications such as autonomous driving. In some implementations, geographic prior data (e.g., geometric ground and/or semantic road features) can be exploited to enhance three-dimensional object detection for autonomous vehicle applications. In some implementations, object detection systems and methods can be improved based on dynamic utilization of multiple sensor modalities. More particularly, an improved 3D object detection system can exploit both LIDAR systems and cameras to perform very accurate localization of objects within three-dimensional space relative to an autonomous vehicle. For example, multi-sensor fusion can be implemented via continuous convolutions to fuse image data samples and LIDAR feature maps at different levels of resolution.
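
The multi-sensor fusion idea can be illustrated at its simplest: project each LiDAR point through a pinhole camera model and concatenate the nearest image feature with the point's own feature. This is a stand-in for the continuous-fusion layers the abstract describes; the camera intrinsics and feature shapes are assumptions:

```python
# Simplified LiDAR-camera fusion: project 3D points into the image plane
# and concatenate the nearest-pixel image feature with each point feature.
def project(point, fx, fy, cx, cy):
    """Pinhole projection; returns None for points behind the camera."""
    x, y, z = point
    if z <= 0:
        return None
    return (fx * x / z + cx, fy * y / z + cy)

def fuse(points, point_feats, image_feats,
         fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """image_feats: dict mapping integer pixel (u, v) -> feature list.
    Points with no valid projection get a zero image feature."""
    fused = []
    for p, pf in zip(points, point_feats):
        uv = project(p, fx, fy, cx, cy)
        img_f = image_feats.get((round(uv[0]), round(uv[1])), [0.0]) if uv else [0.0]
        fused.append(pf + img_f)
    return fused
```

The continuous-convolution approach in the abstract generalizes this nearest-pixel gather to a learned interpolation over multiple neighbors and resolutions.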

IPC Classes

  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 3/02 - Neural networks
  • G06V 20/64 - Three-dimensional objects
  • G06F 18/25 - Fusion techniques
  • G06K 9/66 - Methods or arrangements for recognition using electronic means using simultaneous comparisons or correlations of the image signals with a plurality of references, e.g. resistor matrix references adjustable by an adaptive method, e.g. learning

89.

Systems and Methods for Autonomous Vehicle Controls

      
Application Number 17705789
Status Pending
Filing Date 2022-03-28
First Publication Date 2022-07-07
Owner UATC, LLC (USA)
Inventor
  • Becker, Brian C.
  • Phillips, Michael Lee
  • Xu, Yang
  • Perko, Eric Michael
  • Melik-Barkhudarov, Narek

Abstract

Systems and methods for controlling an autonomous vehicle are provided. A method can include obtaining, by a computing system, data indicative of a plurality of objects in a surrounding environment of the autonomous vehicle. The method can further include determining, by the computing system, one or more clusters of the objects based at least in part on the data indicative of the plurality of objects. The method can further include determining, by the computing system, whether to enter an operation mode having one or more limited operational capabilities based at least in part on one or more properties of the one or more clusters. In response to determining that the operation mode is to be entered by the autonomous vehicle, the method can include controlling, by the computing system, the operation of the autonomous vehicle based at least in part on the one or more limited operational capabilities.
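
The cluster-then-decide flow can be sketched with a simple greedy grouping pass and a size/range check. The linking distance, cluster-size, and range thresholds below are illustrative assumptions, not values from the patent:

```python
# Illustrative check: group nearby objects with a greedy single pass,
# then enter a limited operation mode if any cluster is both large and
# close to the vehicle (assumed to sit at the origin).
def cluster_objects(positions, link_dist=5.0):
    """Greedy pass: attach each point to the first cluster within
    link_dist of any member, else start a new cluster."""
    clusters = []
    for pos in positions:
        for cl in clusters:
            if any(((pos[0] - q[0]) ** 2 + (pos[1] - q[1]) ** 2) ** 0.5 <= link_dist
                   for q in cl):
                cl.append(pos)
                break
        else:
            clusters.append([pos])
    return clusters

def should_enter_limited_mode(clusters, min_size=3, max_range=20.0):
    for cl in clusters:
        if len(cl) >= min_size and \
           min((x * x + y * y) ** 0.5 for x, y in cl) <= max_range:
            return True
    return False
```

Note the greedy pass never merges two clusters formed earlier; a full single-linkage or DBSCAN-style pass would, but the decision structure is the same.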

IPC Classes

  • G08G 1/16 - Anti-collision systems
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/085 - Taking automatic action to adjust vehicle attitude in preparation for collision, e.g. braking for nose dropping
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

90.

Autonomous vehicle control based on risk-based interactions

      
Application Number 16863341
Grant Number 11377120
Status In Force
Filing Date 2020-04-30
First Publication Date 2022-07-05
Grant Date 2022-07-05
Owner UATC, LLC (USA)
Inventor
  • Deng, Eric Chen
  • Marchetti-Bowick, Micol
  • Concilio, Yasmine Straka
  • Haynes, Galen Clark
  • Phillips, Michael Lee

Abstract

Systems, methods, tangible non-transitory computer-readable media, and devices associated with vehicle control based on risk-based interactions are provided. For example, vehicle data and perception data can be accessed. The vehicle data can include the speed of an autonomous vehicle in an environment. The perception data can include location information and classification information associated with an object in the environment. A scenario exposure can be determined based on the vehicle data and perception data. Prediction data including predicted trajectories of the object can be accessed. Expected speed data can be determined based on hypothetical speeds and hypothetical distances between the vehicle and the object. A speed profile that satisfies a threshold criterion can be determined based on the scenario exposure, the prediction data, and the expected speed data, over a distance. A motion plan to control the autonomous vehicle can be generated based on the speed profile.
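
The selection step can be illustrated as a toy risk-bounded speed search: weight a severity proxy by the scenario exposure for each hypothetical speed, then take the fastest speed under a risk threshold. The risk model here is an illustrative stand-in, not the patent's:

```python
# Toy speed-profile selection: expected risk = exposure-weighted severity,
# with severity growing with speed and shrinking with separation distance.
def expected_risk(speed_mps, distance_m, exposure):
    severity = speed_mps ** 2 / max(distance_m, 1.0)
    return exposure * severity

def select_speed(candidate_speeds, distance_m, exposure, risk_threshold):
    """Fastest candidate whose expected risk clears the threshold; fall
    back to the slowest candidate if none qualifies."""
    ok = [v for v in candidate_speeds
          if expected_risk(v, distance_m, exposure) <= risk_threshold]
    return max(ok) if ok else min(candidate_speeds)
```

Running this per distance step along the path yields a speed profile rather than a single speed, which is the object the abstract's motion plan is built from.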

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • G08G 1/00 - Traffic control systems for road vehicles
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems

91.

Fault-Tolerant Control of an Autonomous Vehicle with Multiple Control Lanes

      
Application Number 17695055
Status Pending
Filing Date 2022-03-15
First Publication Date 2022-06-30
Owner UATC, LLC (USA)
Inventor
  • Greenfield, Aaron L.
  • Yanakiev, Diana
  • Tschanz, Frederic
  • Tytler, Charles J.

Abstract

In one example embodiment, a computer-implemented method includes receiving data representing a motion plan of the autonomous vehicle via a plurality of control lanes configured to implement the motion plan to control a motion of the autonomous vehicle, the plurality of control lanes including at least a first control lane and a second control lane, and controlling the first control lane to implement the motion plan. The method includes detecting one or more faults associated with implementation of the motion plan by the first control lane or the second control lane, or in generation of the motion plan, and in response to one or more faults, controlling the first control lane or the second control lane to adjust the motion of the autonomous vehicle based at least in part on one or more fault reaction parameters associated with the one or more faults.
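
The fail-over structure reads naturally as a small selection function: the primary lane executes the motion plan until it faults, then the secondary lane takes over with a fault-reaction maneuver. Lane names and reaction strings below are illustrative assumptions:

```python
# Hypothetical two-lane fail-over matching the abstract's structure.
def select_control_lane(faulted_lanes):
    """faulted_lanes: set of lane names with detected faults. Returns the
    lane to drive and the reaction it should execute."""
    if "primary" not in faulted_lanes:
        return ("primary", "execute_motion_plan")
    if "secondary" not in faulted_lanes:
        return ("secondary", "controlled_stop")
    # Both control lanes faulted: no lane can be trusted to plan.
    return (None, "emergency_stop")
```

The fault-reaction parameters in the abstract would generalize the hard-coded strings here, e.g. selecting a deceleration rate per fault type.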

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 50/029 - Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
  • B60W 50/023 - Avoiding failures by using redundant parts
  • G06F 11/20 - Error detection or correction of the data by redundancy in hardware using active fault-masking, e.g. by switching out faulty elements or by switching in spare elements

92.

Systems and Methods for Error Sourcing in Autonomous Vehicle Simulation

      
Application Number 17140606
Status Pending
Filing Date 2021-01-04
First Publication Date 2022-06-23
Owner UATC, LLC (USA)
Inventor
  • Venkatadri, Arun David Kain
  • Lee, Jason C.
  • Tang, Yilun

Abstract

Systems and methods of the present disclosure are directed to a computer-implemented method. The method can include obtaining a first plurality of testing parameters for an autonomous vehicle testing scenario associated with a plurality of performance metrics based at least in part on a first sampling rule. The method can include simulating the autonomous vehicle testing scenario using the first plurality of testing parameters to obtain a first scenario output. The method can include evaluating an optimization function that evaluates the first scenario output to obtain simulation error data that corresponds to a performance metric. The method can include determining a second sampling rule associated with the performance metric. The method can include obtaining a second plurality of testing parameters for the autonomous vehicle testing scenario based at least in part on the second sampling rule.
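
The two-rule sampling loop can be sketched end to end: sample parameters under the first rule, simulate, attribute error to a performance metric, then bias the second rule toward the parameter tied to the worst metric. The simulator, metric names, and metric-to-parameter map are stand-ins, not the patent's:

```python
# Sketch of error-sourcing by adaptive sampling: later rounds concentrate
# sampling on the parameter associated with the worst-performing metric.
import random

def adaptive_parameter_search(simulate, param_ranges, metric_to_param,
                              rounds=3, per_round=8, seed=0):
    rng = random.Random(seed)
    focus_param, worst = None, None
    for _ in range(rounds):
        for _ in range(per_round):
            params = {}
            for name, (lo, hi) in param_ranges.items():
                if name == focus_param and worst is not None:
                    # Second sampling rule: sample near the worst value seen.
                    span = (hi - lo) * 0.1
                    params[name] = min(hi, max(lo, worst[1][name] + rng.uniform(-span, span)))
                else:
                    # First sampling rule: uniform over the range.
                    params[name] = rng.uniform(lo, hi)
            errors = simulate(params)  # dict: performance metric -> error
            total = sum(errors.values())
            if worst is None or total > worst[0]:
                worst = (total, params, errors)
        # Pick the metric with the largest error and focus on its parameter.
        worst_metric = max(worst[2], key=worst[2].get)
        focus_param = metric_to_param.get(worst_metric)
    return worst
```

Returning the worst (error, parameters, per-metric errors) triple is what makes the loop an error-*sourcing* tool rather than just a stress test.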

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G06F 30/20 - Design optimisation, verification or simulation
  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
  • B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit

93.

Systems and Methods for Generation and Utilization of Vehicle Testing Knowledge Structures for Autonomous Vehicle Simulation

      
Application Number 17140597
Status Pending
Filing Date 2021-01-04
First Publication Date 2022-06-23
Owner UATC, LLC (USA)
Inventor
  • Venkatadri, Arun David Kain
  • Lee, Jason C.
  • Tang, Yilun

Abstract

Systems and methods of the present disclosure are directed to a computer-implemented method. The method can include obtaining a first vehicle testing tuple comprising a plurality of first testing parameters and a second vehicle testing tuple comprising a plurality of second testing parameters. The method can include determining that the plurality of first testing parameters are associated with an evaluated operating condition. The method can include appending the first tuple to a first portion of a plurality of portions of a vehicle testing knowledge structure. The method can include determining that a second testing parameter is associated with an unevaluated operating condition. The method can include evaluating the unevaluated operating condition. The method can include generating a second portion comprising the second vehicle testing tuple for the vehicle testing knowledge structure.

IPC Classes

  • B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time

94.

Autonomous Vehicle Interface System With Multiple Interface Devices Featuring Redundant Vehicle Commands

      
Application Number 17686852
Status Pending
Filing Date 2022-03-04
First Publication Date 2022-06-16
Owner UATC, LLC (USA)
Inventor
  • Nix, Molly Castle
  • Chin, Sean
  • Zhao, Dennis
  • Hanson, Eric James

Abstract

The present disclosure provides an autonomous vehicle and associated interface system that includes multiple vehicle interface computing devices that provide redundant vehicle commands. As one example, an autonomous vehicle interface system can include a first vehicle interface computing device located within the autonomous vehicle and physically coupled to the autonomous vehicle. The first vehicle interface computing device can provide a first plurality of selectable vehicle commands to a human passenger of the autonomous vehicle. The autonomous vehicle interface system can further include a second vehicle interface computing device that provides a second plurality of selectable vehicle commands to the human passenger. For example, the second vehicle interface computing device can be the passenger's own device (e.g., smartphone). The second plurality of selectable vehicle commands can include at least some of the same vehicle commands as the first plurality of selectable vehicle commands.

IPC Classes

  • B60K 35/00 - Arrangement or adaptations of instruments

95.

Systems and methods for costing autonomous vehicle maneuvers

      
Application Number 17130124
Grant Number 11827240
Status In Force
Filing Date 2020-12-22
First Publication Date 2022-06-09
Grant Date 2023-11-28
Owner UATC, LLC (USA)
Inventor
  • Subramanian, Vijay
  • Kazemi, Moslem
  • Bradley, David Mcallister
  • Phillips, Michael Lee

Abstract

Systems and methods for autonomous vehicle motion planning are provided. Sensor data describing an environment of an autonomous vehicle and an initial travel path for the autonomous vehicle through the environment can be obtained. A number of trajectories for the autonomous vehicle are generated based on the sensor data and the initial travel path. The trajectories can be evaluated by generating a number of costs for each trajectory. The costs can include a safety cost and a total cost. Each cost is generated by a cost function created in accordance with a number of relational propositions defining desired relationships between the number of costs. A subset of trajectories can be determined from the trajectories based on the safety cost and an optimal trajectory can be determined from the subset of trajectories based on the total cost. The autonomous vehicle can control its motion in accordance with the optimal trajectory.
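The two-stage selection described above (determine a safe subset via a safety cost, then pick the optimal trajectory by total cost) can be sketched as below. The threshold-based admissibility test and the cost callables are illustrative assumptions; the patent describes cost functions created from relational propositions, which this sketch abstracts away.

```python
# Illustrative sketch of two-stage trajectory selection: filter candidates by
# a safety-cost threshold, then choose the minimum total cost among the rest.
# The cost functions and threshold are hypothetical stand-ins.

def select_trajectory(trajectories, safety_cost, total_cost, safety_threshold):
    # Stage 1: keep only trajectories whose safety cost is acceptable.
    safe = [t for t in trajectories if safety_cost(t) <= safety_threshold]
    if not safe:
        return None  # no admissible trajectory; caller must handle the fallback
    # Stage 2: among the safe subset, choose the trajectory with minimal total cost.
    return min(safe, key=total_cost)
```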

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
  • B60W 30/12 - Lane keeping

96.

Autonomous vehicle with independent auxiliary control units

      
Application Number 17682133
Grant Number 11599112
Status In Force
Filing Date 2022-02-28
First Publication Date 2022-06-09
Grant Date 2023-03-07
Owner UATC, LLC (USA)
Inventor
  • Letwin, Nicholas
  • Kelly, Sean

Abstract

An autonomous vehicle that includes multiple independent control systems providing redundancy for specific, critical safety situations that may be encountered when the autonomous vehicle is in operation.

IPC Classes

  • B60W 30/08 - Predicting or avoiding probable or impending collision
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60T 8/1755 - Brake regulation specially adapted to control the stability of the vehicle, e.g. taking into account yaw rate or transverse acceleration in a curve
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures

97.

Light Detection and Ranging (LIDAR) Assembly Having a Switchable Mirror

      
Application Number 17129961
Status Pending
Filing Date 2020-12-22
First Publication Date 2022-06-09
Owner UATC, LLC (USA)
Inventor
  • Borden, Michael Bryan
  • Haslim, James Allen

Abstract

A LIDAR assembly is provided. The LIDAR assembly includes a LIDAR unit. The LIDAR unit includes a housing defining a cavity. The LIDAR unit further includes a plurality of emitters disposed within the cavity. Each of the plurality of emitters is configured to emit a laser beam. The LIDAR assembly further includes a switchable mirror. The switchable mirror is positioned relative to the LIDAR unit such that the switchable mirror receives a plurality of laser beams exiting the housing of the LIDAR unit. The switchable mirror is configurable in at least a reflective state to direct the plurality of laser beams along a first path and a transmissive state to direct the plurality of laser beams along a second path that is different than the first path to widen a field of view of the LIDAR unit along a first axis.

IPC Classes

  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light

98.

Discrete Decision Architecture for Motion Planning System of an Autonomous Vehicle

      
Application Number 17585650
Status Pending
Filing Date 2022-01-27
First Publication Date 2022-06-02
Owner UATC, LLC (USA)
Inventor
  • Phillips, Michael Lee
  • Burnette, Don
  • Gochev, Kalin Vasilev
  • Liemhetcharat, Somchaya
  • Dayanidhi, Harishma
  • Perko, Eric Michael
  • Wilkinson, Eric Lloyd
  • Green, Colin Jeffrey
  • Liu, Wei
  • Stentz, Anthony Joseph
  • Bradley, David Mcallister
  • Marden, Samuel Philip

Abstract

The present disclosure provides autonomous vehicle systems and methods that include or otherwise leverage a motion planning system that generates constraints as part of determining a motion plan for an autonomous vehicle (AV). In particular, a scenario generator within a motion planning system can generate constraints based on where objects of interest are predicted to be relative to an autonomous vehicle. A constraint solver can identify navigation decisions for each of the constraints that provide a consistent solution across all constraints. The solution provided by the constraint solver can be in the form of a trajectory path determined relative to constraint areas for all objects of interest. The trajectory path represents a set of navigation decisions such that a navigation decision relative to one constraint does not sacrifice the ability to satisfy a different navigation decision relative to one or more other constraints.
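The "consistent solution across all constraints" idea in this abstract can be sketched minimally: each constraint offers candidate navigation decisions, and the solver searches for one decision per constraint such that the joint assignment is consistent. Exhaustive search and the consistency predicate are illustrative stand-ins for the actual solver, which the abstract does not specify.

```python
# Minimal sketch: pick one decision per constraint so that the full assignment
# is jointly consistent. Brute-force product search stands in for the real
# constraint solver; the consistency predicate is a hypothetical placeholder.

from itertools import product

def solve_constraints(candidates, consistent):
    """candidates: list of decision lists, one inner list per constraint.
    consistent: predicate over a full assignment (one decision per constraint)."""
    for assignment in product(*candidates):
        if consistent(assignment):
            return assignment  # one decision per constraint, jointly consistent
    return None  # no assignment satisfies every constraint
```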

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 30/12 - Lane keeping
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/18 - Propelling the vehicle
  • G01C 21/20 - Instruments for performing navigational calculations
  • G01C 21/34 - Route searching; Route guidance

99.

Fleet utilization efficiency for on-demand transportation services

      
Application Number 17678195
Grant Number 11887032
Status In Force
Filing Date 2022-02-23
First Publication Date 2022-06-02
Grant Date 2024-01-30
Owner UATC, LLC (USA)
Inventor
  • Kislovskiy, Dima
  • Bradley, David Mcallister

Abstract

An on-demand transportation management system can collect vehicle fleet utilization data corresponding to human-driven vehicles (HDVs) and autonomous vehicles (AVs) operating within a given region in connection with an on-demand transportation service. The on-demand transportation management system can then establish a set of selection priorities for respective areas of the given region based on the vehicle fleet utilization data, each selection priority indicating whether a respective area of the given region is to favor HDVs or AVs for servicing transport requests.
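The per-area selection priorities described above can be sketched as a simple mapping from utilization data to an HDV-or-AV preference. The utilization metric (fraction of each fleet busy) and the tie-breaking rule are illustrative assumptions; the patent does not fix a particular formula.

```python
# Hedged sketch of deriving per-area selection priorities from fleet
# utilization data. The utilization metric and tie-breaking rule are
# illustrative assumptions.

def selection_priorities(utilization):
    """utilization: dict area -> {"HDV": fraction_busy, "AV": fraction_busy}.
    Favor the vehicle class with more spare capacity (lower utilization)."""
    priorities = {}
    for area, u in utilization.items():
        priorities[area] = "AV" if u["AV"] <= u["HDV"] else "HDV"
    return priorities
```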

IPC Classes

  • G06Q 10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
  • G06Q 50/30 - Transportation; Communications
  • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"

100.

Systems and Methods for Automated Testing of Autonomous Vehicles

      
Application Number 17672796
Status Pending
Filing Date 2022-02-16
First Publication Date 2022-06-02
Owner UATC, LLC (USA)
Inventor
  • Garg, Sunil Kumar
  • Sifleet, Todd William
  • Gorthy, Venkata Sathya Praveen
  • Kan, Lili
  • Weslosky, Emily Anna

Abstract

Systems and methods for controlling autonomous vehicle test trips are provided. In one example embodiment, a computer-implemented method includes obtaining, by a computing system, data indicative of a test trip index associated with an autonomous vehicle. The test trip index includes a plurality of test trips for the autonomous vehicle and each test trip is associated with one or more test trip parameters. The method includes obtaining, by the computing system, data indicating that the autonomous vehicle is available to travel in accordance with at least one of the test trips of the test trip index. The method includes selecting, by the computing system and from the test trip index, at least one selected test trip for the autonomous vehicle. The method includes causing, by the computing system, the autonomous vehicle to travel in accordance with the test trip parameters associated with the at least one selected test trip.

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • B60K 35/00 - Arrangement or adaptations of instruments