Mobileye Vision Technologies Ltd.

Israel


1-100 of 489 results for Mobileye Vision Technologies Ltd.
Query: Patent
Jurisdiction: United States - USPTO
Date
2024 April (MTD) 2
2024 March 1
2024 February 4
2024 January 1
2023 December 6
IPC Class
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints 215
G05D 1/02 - Control of position or course in two dimensions 167
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot 127
G01C 21/36 - Input/output arrangements for on-board computers 95
B60W 30/18 - Propelling the vehicle 93
Status
Pending 146
Registered / In Force 343

1.

GRAPH NEURAL NETWORKS FOR PARSING ROADS

      
Application Number 18491409
Status Pending
Filing Date 2023-10-19
First Publication Date 2024-04-25
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Ferencz, Andras

Abstract

Systems and methods for predicting drivable paths relative to road segments are disclosed. In one implementation, a system includes a processor programmed to access topographical information associated with a road segment; generate a topographical representation of the road segment based on the topographical information; input the topographical representation of the road segment to a trained model, wherein the trained model includes a graph neural network and is configured to predict at least one drivable path relative to the road segment based on the topographical representation of the road segment; receive, from the trained model, information identifying the drivable path; and store the information identifying the drivable path in a map.
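
For orientation only, the sketch below illustrates the general idea of scoring road-graph connections with a small message-passing network; it is not the patented model, and the node features, graph, weights, and layer sizes are all invented (plain NumPy, untrained).

```python
# Hypothetical sketch only: a tiny message-passing network over a road graph,
# scoring each candidate edge as "drivable". Weights are random (untrained).
import numpy as np

rng = np.random.default_rng(0)

# Toy topographical representation: 4 lane nodes with 2 features each,
# plus candidate connections between them.
node_feats = rng.normal(size=(4, 2))
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

W_self, W_nbr = rng.normal(size=(2, 8)), rng.normal(size=(2, 8))   # node update
W_proj = rng.normal(size=(8, 2))                                   # back to 2 dims
W_edge = rng.normal(size=(16, 1))                                  # edge scorer

def message_pass(h):
    """One round of mean-neighbour aggregation followed by a ReLU."""
    agg, deg = np.zeros_like(h), np.zeros(len(h))
    for a, b in edges:
        agg[a] += h[b]; agg[b] += h[a]
        deg[a] += 1;    deg[b] += 1
    agg /= np.maximum(deg[:, None], 1.0)
    return np.maximum(h @ W_self + agg @ W_nbr, 0.0)

h = message_pass(node_feats)          # (4, 8)
h = message_pass(h @ W_proj)          # second round of message passing

# Score each candidate edge from its endpoint embeddings.
for a, b in edges:
    logit = np.concatenate([h[a], h[b]]) @ W_edge
    print(f"edge {a}->{b}: drivable score {1 / (1 + np.exp(-logit[0])):.2f}")
```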

IPC Classes

  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

2.

ROAD PROFILE ALONG A PREDICTED PATH

      
Application Number 18535458
Status Pending
Filing Date 2023-12-11
First Publication Date 2024-04-25
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Stein, Gideon
  • Blumenthal, Itay
  • Shaag, Nadav

Abstract

Systems and methods are provided for navigating a host vehicle. A navigation system for the host vehicle may include at least one processor programmed to receive an image representative of an environment of a host vehicle; analyze the image to determine a predicted path of the host vehicle; determine, based on the image, an indicator of comfort associated with the predicted path; identify, based on the indicator of comfort, an alternative path of the host vehicle; and output a control signal configured to modify an operation of a component of the host vehicle to follow the alternative path of the host vehicle.

IPC Classes

  • B62D 15/02 - Steering position indicators
  • B60G 17/0165 - Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, the regulating means comprising electric or electronic elements characterised by their responsiveness, when the vehicle is travelling, to specific motion, a specific condition, or driver input to an external condition, e.g. rough road surface, side wind
  • B60G 17/018 - Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, the regulating means comprising electric or electronic elements characterised by the use of a specific signal treatment or control method
  • B60G 17/019 - Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, the regulating means comprising electric or electronic elements characterised by the type of sensor or the arrangement thereof
  • B60T 8/172 - Determining control parameters used in the regulation, e.g. by calculations involving measured or detected parameters
  • B60W 40/06 - Road conditions
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G06V 10/40 - Extraction of image or video features
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

3.

TRAJECTORY SELECTION FOR AN AUTONOMOUS VEHICLE

      
Application Number 18469162
Status Pending
Filing Date 2023-09-18
First Publication Date 2024-03-14
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shalev-Shwartz, Shai
  • Shashua, Amnon
  • Shammah, Shaked

Abstract

Systems and methods are provided for navigating a host vehicle. A navigation system for the host vehicle may include at least one processor programmed to receive images representative of an environment of the host vehicle; analyze at least one of the images to identify navigational state information associated with the host vehicle; determine a plurality of first potential navigational actions for the host vehicle based on the navigational state information; determine respective future states for the plurality of first potential navigational actions; determine a plurality of second potential navigational actions for the host vehicle based on the determined respective future states; select, based on the plurality of second potential navigational actions, one of the plurality of first potential navigational actions; and cause an adjustment of a navigational actuator of the host vehicle to implement the selected one of the plurality of first potential navigational actions.
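
As a rough illustration of the two-stage selection described above (first-level actions, predicted future states, second-level actions, then choosing a first-level action), here is a hypothetical lookahead sketch; the state representation, candidate actions, and cost function are invented for the example and are not taken from the patent.

```python
# Hypothetical sketch: pick a first-level action by looking one step ahead
# at the best second-level action available from its predicted future state.

ACTIONS = ["keep_lane", "shift_left", "shift_right", "slow_down"]

def predict_future_state(state, action):
    """Toy future-state model: (gap_to_lead_m, lateral_offset_m)."""
    gap, offset = state
    if action == "slow_down":
        gap += 5.0
    elif action == "keep_lane":
        gap -= 2.0
    else:
        offset += 3.5 if action == "shift_left" else -3.5
    return (gap, offset)

def cost(state):
    """Lower is better: penalize small gaps and large lateral offsets."""
    gap, offset = state
    return max(0.0, 30.0 - gap) + abs(offset)

def select_action(state):
    best_action, best_cost = None, float("inf")
    for first in ACTIONS:                      # first potential actions
        future = predict_future_state(state, first)
        # Best second potential action reachable from that future state.
        followup = min(cost(predict_future_state(future, second))
                       for second in ACTIONS)
        if followup < best_cost:
            best_action, best_cost = first, followup
    return best_action

print(select_action((20.0, 0.0)))   # prints "slow_down" for this toy state
```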

IPC Classes

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G01C 21/34 - Route searching; Route guidance
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

4.

SAFETY SYSTEM FOR A VEHICLE TO DETECT AND WARN OF A POTENTIAL COLLISION

      
Application Number 18084138
Status Pending
Filing Date 2022-12-19
First Publication Date 2024-02-22
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Elimalech, Yaniv
  • Stein, Gideon

Abstract

Systems and methods are provided for processing reports received from vehicles. A processing device performs operations comprising receiving a first report generated by a first vehicle, the first report generated by the first vehicle for a first hazard detected by the first vehicle; receiving a second report generated by a second vehicle, the second report generated by the second vehicle for a second hazard detected by the second vehicle; analyzing the first report and the second report to make a determination that the first report and the second report identify a related hazard; aggregating the first report and the second report into a consolidated report based on the determination that the first report and the second report identify a related hazard; and generating a map, the map indicating a location of the related hazard based on the consolidated report.
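
A minimal, hypothetical sketch of the kind of aggregation the abstract describes: grouping reports whose hazard type matches and whose locations fall close together, then emitting one map entry per group. The report format and the 30 m threshold are assumptions.

```python
# Hypothetical sketch: merge hazard reports from different vehicles when their
# reported locations fall within a small distance of each other.
import math

reports = [  # (vehicle_id, hazard_type, lat, lon) -- invented sample data
    ("v1", "pothole", 32.0853, 34.7818),
    ("v2", "pothole", 32.0854, 34.7819),
    ("v3", "debris",  32.1000, 34.8000),
]

def close(a, b, threshold_m=30.0):
    """Rough flat-earth distance check, fine for city-block scales."""
    dlat = (a[2] - b[2]) * 111_000
    dlon = (a[3] - b[3]) * 111_000 * math.cos(math.radians(a[2]))
    return math.hypot(dlat, dlon) < threshold_m

consolidated = []          # each entry: a list of related reports
for r in reports:
    for group in consolidated:
        if r[1] == group[0][1] and close(r, group[0]):
            group.append(r)
            break
    else:
        consolidated.append([r])

# One map entry per consolidated group: mean location of the related reports.
hazard_map = [
    (g[0][1], sum(r[2] for r in g) / len(g), sum(r[3] for r in g) / len(g), len(g))
    for g in consolidated
]
print(hazard_map)   # e.g. one merged 'pothole' entry (2 reports) and one 'debris' entry
```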

IPC Classes

  • G08G 1/16 - Anti-collision systems
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • H04N 23/69 - Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
  • B60Q 9/00 - Arrangement or adaptation of signal devices not provided for in one of main groups
  • B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

5.

MAPPING LANE MARKS AND NAVIGATION BASED ON MAPPED LANE MARKS

      
Application Number 18457418
Status Pending
Filing Date 2023-08-29
First Publication Date 2024-02-22
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shapira, Dori
  • Viente, Kfir
  • Braunstein, Daniel
  • Caspi, Bnaya
  • Hanniel, Iddo

Abstract

Systems and methods are provided for autonomous vehicle navigation. The systems and methods may map a lane mark, map a directional arrow, selectively harvest road information based on data quality, map road segment free spaces, map traffic lights and determine traffic light relevancy, and map traffic lights and associated traffic light cycle times.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/18 - Propelling the vehicle
  • G01C 21/34 - Route searching; Route guidance
  • G01C 21/36 - Input/output arrangements for on-board computers
  • G08G 1/14 - Traffic control systems for road vehicles indicating individual free spaces in parking areas
  • G08G 1/00 - Traffic control systems for road vehicles
  • B60T 7/12 - Brake-action initiating means for initiation not subject to will of driver or passenger
  • B62D 6/00 - Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
  • G05D 1/02 - Control of position or course in two dimensions
  • G01C 21/30 - Map- or contour-matching
  • B60W 30/12 - Lane keeping
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G08G 1/07 - Controlling traffic signals
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

6.

SAFE STATE TO SAFE STATE NAVIGATION

      
Application Number 18214692
Status Pending
Filing Date 2023-06-27
First Publication Date 2024-02-08
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shalev-Shwartz, Shai
  • Shashua, Amnon
  • Shammah, Shaked

Abstract

Systems and methods are provided for navigating a host vehicle. In one implementation, a system may include a processing device configured to receive at least one image acquired by an image capture device; determine a planned navigational action for accomplishing a navigational goal of the host vehicle; analyze the at least one image to identify a first target vehicle ahead of the host vehicle and a second target vehicle ahead of the first target vehicle; determine a next-state distance between the host vehicle and the second target vehicle that would result if the planned navigational action was taken; determine a stopping distance for the host vehicle based on a maximum braking capability of the host vehicle and a current speed of the host vehicle; and cause the host vehicle to implement the planned navigational action if the stopping distance is less than the determined next-state distance.
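
The stopping-distance comparison in this abstract can be illustrated with textbook kinematics (d = v^2 / 2a); the braking capability and distances below are invented numbers, not values from the patent.

```python
# Hypothetical numeric sketch of the check described above: implement the
# planned action only if the host can stop within the next-state distance.

def stopping_distance_m(speed_mps: float, max_brake_mps2: float) -> float:
    """Distance to stop from `speed_mps` under constant maximum braking: v^2 / (2a)."""
    return speed_mps ** 2 / (2.0 * max_brake_mps2)

def safe_to_proceed(next_state_distance_m: float, speed_mps: float,
                    max_brake_mps2: float) -> bool:
    return stopping_distance_m(speed_mps, max_brake_mps2) < next_state_distance_m

# Example: 25 m/s (90 km/h), 8 m/s^2 maximum braking, 45 m predicted gap
# to the second target vehicle after the planned action.
print(stopping_distance_m(25.0, 8.0))        # ~39.1 m
print(safe_to_proceed(45.0, 25.0, 8.0))      # True -> implement the action
```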

IPC Classes

  • G06Q 40/08 - Insurance
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • G05D 1/02 - Control of position or course in two dimensions
  • G08G 1/16 - Anti-collision systems
  • B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/18 - Propelling the vehicle
  • G01C 21/36 - Input/output arrangements for on-board computers
  • G01C 21/34 - Route searching; Route guidance
  • G06Q 10/00 - Administration; Management
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

7.

NAVIGATION IN VEHICLE CROSSING SCENARIOS

      
Application Number 18143884
Status Pending
Filing Date 2023-05-05
First Publication Date 2024-02-01
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Finelt, Ofer
  • Baba, Tomer

Abstract

Systems and methods are disclosed for navigating a host vehicle. In one implementation, at least one processor may be programmed to receive an image captured by a camera; identify an oncoming target vehicle; determine that a planned trajectory for the host vehicle includes a turn that crosses a projected trajectory of the oncoming target vehicle in a potential turn-across-path event; determine a driving jurisdiction associated with the road; determine, based on the driving jurisdiction, whether the target vehicle is approaching a road feature that negates the potential turn-across-path event; and either implement or forego implementing a remedial action based on whether the target vehicle is approaching a road feature that negates the potential turn-across-path event.

IPC Classes

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 40/072 - Curvature of the road
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

8.

SIGNATURE NETWORK FOR TRAFFIC SIGN CLASSIFICATION

      
Application Number 18449290
Status Pending
Filing Date 2023-08-14
First Publication Date 2024-01-25
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Kassel, Levi
  • Ochana, Neriya
  • Hochman, Yuval
  • Hendler, Abraham
  • Alkon, Gal

Abstract

In an embodiment, a navigation system for a host vehicle may include at least one processor comprising circuitry and a memory. The memory may include instructions that when executed by the circuitry cause the at least one processor to receive at least one image from a camera on a host vehicle, to analyze the at least one image to identify at least one object represented in the image, to generate a feature vector representative of the at least one object, to compare the generated feature vector to a plurality of feature vectors stored in a database and in response to a determination that the generated feature vector does not match an entry in the database, send the generated feature vector to a server, wherein the server is configured to generate an updated feature vector database in response to the generated feature vector sent by the host vehicle navigation system in combination with feature vectors received from a plurality of additional vehicles.
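
A hypothetical sketch of the local matching step described above: compare the object's feature vector against stored vectors by cosine similarity and, if nothing is similar enough, queue the vector for upload. The similarity threshold and the "send to server" stand-in are assumptions.

```python
# Hypothetical sketch: compare an object's feature vector against a local
# database; if nothing is similar enough, queue it for upload to a server.
import numpy as np

database = np.array([          # invented stored feature vectors (unit length)
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
])

def match_or_queue(feature, db, threshold=0.9):
    feature = feature / np.linalg.norm(feature)
    sims = db @ feature                 # cosine similarity (db rows are unit length)
    best = int(np.argmax(sims))
    if sims[best] >= threshold:
        return ("matched", best)
    return ("send_to_server", feature.tolist())

print(match_or_queue(np.array([0.95, 0.05, 0.0]), database))  # matched, index 0
print(match_or_queue(np.array([0.5, 0.5, 0.7]),  database))   # no match -> upload
```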

IPC Classes

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

9.

SYSTEMS AND METHODS FOR NAVIGATING A VEHICLE AMONG ENCROACHING VEHICLES

      
Application Number 18339258
Status Pending
Filing Date 2023-06-22
First Publication Date 2023-12-28
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Clarke, Anna
  • Bagon, Eyal

Abstract

Systems and methods use cameras to provide autonomous navigation features. In one implementation, a method for navigating a user vehicle may include acquiring, using at least one image capture device, a plurality of images of an area in a vicinity of the user vehicle; determining from the plurality of images a first lane constraint on a first side of the user vehicle and a second lane constraint on a second side of the user vehicle opposite to the first side of the user vehicle; enabling the user vehicle to pass a target vehicle if the target vehicle is determined to be in a lane different from the lane in which the user vehicle is traveling; and causing the user vehicle to abort the pass before completion of the pass, if the target vehicle is determined to be entering the lane in which the user vehicle is traveling.

IPC Classes

  • B60W 30/18 - Propelling the vehicle
  • B60T 7/12 - Brake-action initiating means for initiation not subject to will of driver or passenger
  • G01C 21/26 - Navigation; Navigational instruments not provided for in groups specially adapted for navigation in a road network
  • G08G 1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
  • G08G 1/16 - Anti-collision systems
  • B60K 31/00 - Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
  • B60W 30/14 - Cruise control
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/12 - Lane keeping
  • B60W 30/165 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 40/06 - Road conditions
  • B60W 40/072 - Curvature of the road
  • B60W 40/076 - Slope angle of the road
  • G01C 11/04 - Interpretation of pictures
  • G01C 21/10 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration
  • G05D 1/02 - Control of position or course in two dimensions
  • B60T 8/32 - Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force responsive to a speed condition, e.g. acceleration or deceleration
  • B60W 30/08 - Predicting or avoiding probable or impending collision
  • B60T 7/22 - Brake-action initiating means for initiation not subject to will of driver or passenger initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle
  • B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B62D 15/02 - Steering position indicators
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
  • G01C 21/30 - Map- or contour-matching
  • G06T 7/00 - Image analysis
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/285 - Analysis of motion using a sequence of stereo image pairs
  • G06T 7/292 - Multi-camera tracking

10.

VEHICLE OPERATION SAFETY MODEL GRADE MEASUREMENT

      
Application Number 18037190
Status Pending
Filing Date 2021-11-19
First Publication Date 2023-12-21
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Zhang, Lidan
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Li, Fei
  • Guo, Ping

Abstract

System and techniques for vehicle operation safety model (VOSM) grade measurement are described herein. A data set of parameter measurements, defined by the VOSM, from multiple vehicles is obtained. A statistical value is then derived from a portion of the parameter measurements. A measurement from a subject vehicle is obtained that corresponds to the portion of the parameter measurements from which the statistical value was derived. The measurement is then compared to the statistical value to produce a safety grade for the subject vehicle.
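
For illustration, a hypothetical version of the described flow: derive a fleet statistic for one VOSM parameter and grade a subject vehicle against it. The parameter (following time gap), the median as the statistic, and the grade bands are all invented.

```python
# Hypothetical sketch: derive a fleet statistic for one VOSM parameter
# (here, minimum following time gap in seconds) and grade a subject vehicle
# by where its own measurement falls relative to that statistic.
import statistics

fleet_time_gaps_s = [2.1, 1.9, 2.4, 1.7, 2.0, 2.3, 1.8, 2.2]   # invented data
fleet_median = statistics.median(fleet_time_gaps_s)

def safety_grade(subject_gap_s: float, reference: float) -> str:
    ratio = subject_gap_s / reference
    if ratio >= 1.0:
        return "A"          # at or above the fleet's typical behaviour
    if ratio >= 0.8:
        return "B"
    return "C"

print(fleet_median)                      # 2.05
print(safety_grade(2.3, fleet_median))   # 'A'
print(safety_grade(1.5, fleet_median))   # 'C'
```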

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions

11.

VEHICLE OPERATION SAFETY MODEL TEST SYSTEM

      
Application Number 18242887
Status Pending
Filing Date 2023-09-06
First Publication Date 2023-12-21
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Zhang, Lidan
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Li, Fei

Abstract

System and techniques for test verification of a control system (e.g., a vehicle safety system) with a vehicle operation safety model (VOSM) such as Responsibility Sensitive Safety (RSS) are described. In an example, using test scenarios to measure performance of VOSM includes: defining safety condition parameters of a VOSM for use in a test scenario configured to test performance of a safety system; generating a test scenario, using the safety condition parameters, the test scenario generated as a steady state test, a dynamic test, or a stress test; executing the test scenario with a test simulator, to produce test results for the safety system; measuring real-time kinematics of the safety system, during execution of the test scenario, based on compliance with the safety condition parameters; and producing a parameter rating for performance of the safety system with the VOSM, based on the test results and the measured real-time kinematics.

IPC Classes

  • B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • B60W 30/165 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"

12.

LIGHTWEIGHT IN-VEHICLE CRITICAL SCENARIO EXTRACTION SYSTEM

      
Application Number 18034232
Status Pending
Filing Date 2021-11-19
First Publication Date 2023-12-14
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Zhang, Lidan
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Li, Fei

Abstract

Various aspects of methods, systems, and use cases for critical scenario identification and extraction from vehicle operations are described. In an example, an approach for lightweight analysis and detection includes capturing data from sensors associated with (e.g., located within, or integrated into) a vehicle, detecting the occurrence of a critical scenario, extracting data from the sensors in response to detecting the occurrence of the critical scenario, and outputting the extracted data. The critical scenario may be specifically detected based on a comparison of the operation of the vehicle to at least one requirement specified by a vehicle operation safety model. Reconstruction and further data processing may be performed on the extracted data, such as with the creation of a simulation from extracted data that is communicated to a remote service.

IPC Classes

  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G07C 5/00 - Registering or indicating the working of vehicles

13.

SAFETY AND CRITICAL INFORMATION LOGGING MECHANISM FOR VEHICLES

      
Application Number 18033934
Status Pending
Filing Date 2021-11-19
First Publication Date 2023-12-07
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Zhang, Lidan
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Li, Fei

Abstract

Various aspects of methods, systems, and use cases for safety logging in a vehicle are described. In an example, an approach for data logging in a vehicle includes use of logging triggers, public and private data buckets, and defined data formats, for data provided during autonomous vehicle operation. Data logging operations may be triggered in response to safety conditions, such as detecting a dangerous situation from a failure of the vehicle to comply with safety criteria of a vehicle operational safety model. Data logging operations may include logging data in response to detection of the dangerous situation, including storage of a first portion of data in a public data store, and storage of a second portion of privacy-sensitive data in a private data store, where the data stored in the private data store is encrypted, and where access to the private data store is controlled.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

14.

MAP TILE OPTIMIZATION BASED ON TILE CONNECTIVITY

      
Application Number 18236555
Status Pending
Filing Date 2023-08-22
First Publication Date 2023-12-07
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Schwartz, Maxim
  • Goldman, Yehonatan
  • Cohen, Asaf
  • Fisher, Amiel

Abstract

A system for vehicle navigation may include a processor including circuitry and a memory. The memory may include instructions that when executed by the circuitry cause the processor to receive navigational information associated with the vehicle including an indicator of a location of the vehicle, and determine target navigational map segments to retrieve from a map database. The map database may include stored navigational map segments each corresponding to a real-world area. The determination of the target navigational map segments may be based on the indicator of vehicle location and on map segment connectivity information associated with the stored navigational map segments. The instructions may also cause the processor to initiate downloading of the target navigational map segments from the map database, and cause the vehicle to navigate along a target trajectory included in one or more of the target navigational map segments downloaded from the map database.
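
A minimal sketch of how tile selection based on connectivity could look, assuming tiles are nodes in a connectivity graph and the target set is everything within a few hops of the vehicle's current tile; the graph, hop limit, and tile ids are invented.

```python
# Hypothetical sketch: choose which map tiles to download by walking the
# tile-connectivity graph outward from the tile containing the vehicle.
from collections import deque

# Invented connectivity: tile id -> tiles reachable by drivable roads.
connectivity = {
    "t1": ["t2"],
    "t2": ["t1", "t3", "t4"],
    "t3": ["t2"],
    "t4": ["t2", "t5"],
    "t5": ["t4"],
}

def target_tiles(current_tile, max_hops=2):
    """Return the current tile plus every tile within `max_hops` connections."""
    seen, queue = {current_tile}, deque([(current_tile, 0)])
    while queue:
        tile, hops = queue.popleft()
        if hops == max_hops:
            continue
        for nxt in connectivity.get(tile, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return seen

print(sorted(target_tiles("t1")))   # ['t1', 't2', 't3', 't4']
```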

IPC Classes

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G01C 21/36 - Input/output arrangements for on-board computers

15.

SYSTEMS AND METHODS FOR VEHICLE NAVIGATION

      
Application Number 18204146
Status Pending
Filing Date 2023-05-31
First Publication Date 2023-11-30
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Ferencz, Andras
  • Zackay, Ora

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, at least one processor may be programmed to receive, from a camera, a captured image representative of features in an environment of the vehicle. The processor may generate a warped image based on the received captured image, which may simulate a view of the features in the environment of the vehicle from a simulated viewpoint elevated relative to an actual position of the camera. The processor may further identify a road feature represented in the warped image, which may be transformed in one or more respects relative to a representation of the road feature in the captured image. The processor may then determine a navigational action for the vehicle based on the identified feature represented in the warped image and cause at least one actuator system of the vehicle to implement the determined navigational action.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G05D 1/02 - Control of position or course in two dimensions

16.

Navigation Based on Detected Size of Occlusion Zones

      
Application Number 18219240
Status Pending
Filing Date 2023-07-07
First Publication Date 2023-11-02
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shalev-Shwartz, Shai
  • Shashua, Amnon
  • Stein, Gideon
  • Shammah, Shaked

Abstract

A navigation system for a host vehicle is provided. The system may comprise at least one processing device programmed to receive, from a camera, a plurality of images representative of an environment of the host vehicle; analyze the plurality of images to identify at least one vehicle-induced occlusion zone in an environment of the host vehicle; and cause a navigational change for the host vehicle based, at least in part, on a size of a target vehicle that induces the identified occlusion zone.

IPC Classes

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G05D 1/02 - Control of position or course in two dimensions
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G08G 1/0968 - Systems involving transmission of navigation instructions to the vehicle
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • B60W 30/18 - Propelling the vehicle
  • B60W 50/10 - Interpretation of driver requests or demands
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

17.

SYSTEMS AND METHODS FOR ROAD SEGMENT MAPPING

      
Application Number 18344333
Status Pending
Filing Date 2023-06-29
First Publication Date 2023-10-26
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Ferencz, Andras
  • Zackay, Ora
  • Frenkel-Levy, Roi
  • Urbach, Ron
  • Reiss, Yigal

Abstract

A system for automatically mapping a road segment may include: at least one processor programmed to: receive, from at least one camera mounted on a vehicle, a plurality of images acquired as the vehicle traversed the road segment; convert each of the plurality of images to a corresponding top view image to provide a plurality of top view images; aggregate the plurality of top view images to provide an aggregated top view image of the road segment; analyze the aggregated top view image to identify at least one road feature associated with the road segment; automatically annotate the at least one road feature relative to the aggregated top view image; and output to at least one memory the aggregated top view image including the annotated at least one road feature.

IPC Classes

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects
  • G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06T 7/11 - Region-based segmentation
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

18.

SECURE DISTRIBUTED EXECUTION OF JOBS

      
Application Number 18210794
Status Pending
Filing Date 2023-06-16
First Publication Date 2023-10-19
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Agam, Oren
  • Kuch, Liron
  • Galil, Eran
  • Atedgi, Liron

Abstract

A processing unit, where the processing unit is one of a group of processing units of a system, includes a processor; and memory including instructions, which when executed by the processor while avoiding interrupting a controller that does not belong to the group of processing units, cause the processor to: perform at least one iteration of the steps of: (a) entering a trusted mode, (b) selecting a selected job to be executed by the processing unit, (c) retrieving access control metadata related to the selected job, (d) entering, by the processing unit, an untrusted mode, (e) executing the selected job by the processing unit while adhering to the access control metadata related to the job, and (f) resetting the processing unit.
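
The iteration (a)-(f) above can be mirrored in a toy loop like the one below. This is purely illustrative pseudo-structure in Python: the mode switches, access check, and "reset" are stand-ins, not a real trusted-execution mechanism.

```python
# Hypothetical sketch of the iteration described above. Mode switches, access
# checks, and "reset" are simple stand-ins, not a real secure implementation.

JOBS = [
    {"name": "lane_detect", "allowed_regions": {"cam_buf"}},
    {"name": "map_update",  "allowed_regions": {"map_buf"}},
]

def run_one_iteration(job_queue, state):
    state["mode"] = "trusted"                      # (a) enter trusted mode
    job = job_queue.pop(0)                         # (b) select a job
    acl = job["allowed_regions"]                   # (c) retrieve access metadata
    state["mode"] = "untrusted"                    # (d) enter untrusted mode
    touched = {"cam_buf"} if "lane" in job["name"] else {"map_buf"}
    assert touched <= acl, "job touched a region outside its access metadata"
    print(f"executed {job['name']} in {state['mode']} mode")   # (e) execute
    state.clear(); state["mode"] = "reset"         # (f) reset the processing unit

state, queue = {}, list(JOBS)
while queue:
    run_one_iteration(queue, state)
```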

IPC Classes

  • G06F 21/54 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity, buffer overflow or preventing unwanted data erasure by adding security routines or objects to programs
  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
  • G06F 9/4401 - Bootstrapping

19.

STEERING LIMITERS FOR VEHICLE NAVIGATION

      
Application Number 18131772
Status Pending
Filing Date 2023-04-06
First Publication Date 2023-10-12
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Rojas, Ronen
  • Selig, Ilan

Abstract

Systems and methods are provided for navigating a host vehicle. In one implementation, a system may include at least one processor configured to receive at least one image acquired by an image capture device, the at least one image being representative of an environment of the host vehicle; analyze the at least one image to identify at least one characteristic associated with the environment of the host vehicle; determine a navigational action for the host vehicle based on: the at least one characteristic associated with the environment of the host vehicle, and a steering limit corresponding to a maximum allowable lateral acceleration for the host vehicle; and cause one or more actuators associated with the host vehicle to implement the determined navigational action.
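
The steering limit described here relates speed, path curvature, and lateral acceleration through the standard a = v^2 * kappa. A small numeric sketch, with an invented 3 m/s^2 ceiling:

```python
# Hypothetical numeric sketch: cap the commanded path curvature (and hence the
# steering angle) so that lateral acceleration a = v^2 * kappa stays under a limit.

def limited_curvature(speed_mps: float, desired_curvature_1pm: float,
                      max_lat_accel_mps2: float = 3.0) -> float:
    """Clamp curvature so that v^2 * kappa <= the maximum lateral acceleration."""
    if speed_mps <= 0.0:
        return desired_curvature_1pm
    kappa_max = max_lat_accel_mps2 / (speed_mps ** 2)
    return max(-kappa_max, min(desired_curvature_1pm, kappa_max))

# At 20 m/s, a 3 m/s^2 ceiling allows |kappa| <= 0.0075 1/m (radius >= ~133 m).
print(limited_curvature(20.0, 0.02))    # 0.0075 -> the requested turn is softened
print(limited_curvature(20.0, 0.005))   # 0.005  -> already within the limit
```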

IPC Classes

  • B60W 50/08 - Interaction between the driver and the control system
  • B60W 40/072 - Curvature of the road
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems

20.

Augmenting autonomous driving with remote viewer recommendation

      
Application Number 18077752
Grant Number 11899457
Status In Force
Filing Date 2022-12-08
First Publication Date 2023-10-05
Grant Date 2024-02-13
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Vaughn, Robert
  • Baron, Casey

Abstract

Autonomous vehicles are an exciting prospect for the future of driving. However, the decision-making of the AI controlling a vehicle has been a concern, particularly in light of high-profile accidents. We can alleviate some of that concern, introduce better decisions, and also train an AI to make better decisions by introducing a remote viewer's, e.g., a human's, reaction to a possibly complex environment surrounding a vehicle that includes a potential threat to the vehicle. One or more remote viewers may provide a recommended response to the threat that may be incorporated in whole or in part in how the vehicle reacts. Various ways to engage and utilize remote viewers are proposed to improve the likelihood of receiving useful recommendations, including modifying how the environment is presented to a remote viewer to best suit that viewer, e.g., perhaps presenting the threat as a game.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G06N 3/08 - Learning methods
  • G05D 1/02 - Control of position or course in two dimensions
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions

21.

DETECTING AN OPEN DOOR USING A SPARSE REPRESENTATION

      
Application Number 18295422
Status Pending
Filing Date 2023-04-04
First Publication Date 2023-10-05
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Lotan, Roy
  • Harouche, Shahar

Abstract

A computer-implemented method for navigating a host vehicle includes receiving an image frame from an image capture device, the image frame representing an environment of the host vehicle and including a representation of a target vehicle; analyzing the image frame to determine a sparse representation of a portion of the image frame; providing the sparse representation to an object detection network; receiving an identifier of a candidate region identified by the object detection network; based on the identifier, extracting the candidate region from the image frame; providing the candidate region to an open door detection network; determining a navigational action for the host vehicle in response to an indication from the open door detection network that the candidate region includes a representation of a door of the target vehicle in an open condition; and causing an actuator associated with the host vehicle to implement the navigational action.

IPC Classes

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/10 - Path keeping
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

22.

SYSTEMS AND METHODS FOR EVALUATING DOMAIN-SPECIFIC NAVIGATION SYSTEM CAPABILITIES

      
Application Number 18139041
Status Pending
Filing Date 2023-04-25
First Publication Date 2023-09-14
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Belman, Efim
  • Goldstein, Moshe

Abstract

Systems and methods evaluate navigation system capabilities. In one implementation, at least one processing device is programmed to acquire characteristics of one or more sensors included in the host vehicle; establish a testing domain, wherein the testing domain includes at least one mapped representation of a geographic region; and simulate operation of the one or more sensors relative to the testing domain. Based on the simulated operation of the one or more sensors, the at least one processing device may determine whether one or more regions exist within the geographic region where outputs of the one or more sensors are insufficient for ensuring that each navigational action implemented by the navigation system of the host vehicle will not result in an accident for which the host vehicle is at fault.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

23.

VEHICLE OPERATION SAFETY MODEL COMPLIANCE MEASUREMENT

      
Application Number 18197279
Status Pending
Filing Date 2023-05-15
First Publication Date 2023-09-07
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Guo, Ping
  • Zhang, Xinxin
  • Li, Fei
  • Wu, Xiangbin

Abstract

System and techniques for vehicle operation safety model (VOSM) compliance measurement are described herein. A subject vehicle is tested in a vehicle following scenario against VOSM parameter compliance. The test measures the subject vehicle's activity during phases of the following scenario in which a lead vehicle slows, and produces log data and calculations that form the basis of a VOSM compliance measurement.
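
The abstract's VOSM references Responsibility Sensitive Safety (RSS); as a point of reference, the published RSS minimum safe longitudinal distance for a following scenario can be computed as below. The formula follows the public RSS formulation; the parameter values are invented for the example.

```python
# Point of reference: the published RSS minimum safe longitudinal distance for a
# following scenario (parameter values below are invented for the example).

def rss_min_safe_distance(v_rear, v_front, rho=0.5,
                          a_accel_max=3.0, b_rear_min=4.0, b_front_max=8.0):
    """d_min = v_r*rho + 0.5*a*rho^2 + (v_r + rho*a)^2/(2*b_min) - v_f^2/(2*b_max), floored at 0."""
    d = (v_rear * rho
         + 0.5 * a_accel_max * rho ** 2
         + (v_rear + rho * a_accel_max) ** 2 / (2.0 * b_rear_min)
         - v_front ** 2 / (2.0 * b_front_max))
    return max(d, 0.0)

# Both vehicles at 20 m/s: the follower should keep roughly 43 m of headway here.
print(round(rss_min_safe_distance(20.0, 20.0), 1))
```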

IPC Classes

  • B60T 17/22 - Devices for monitoring or checking brake systems; Signal devices
  • B60T 8/171 - Detecting parameters used in the regulation; Measuring values used in the regulation

24.

MACHINE LEARNING-BASED TRAFFIC LIGHT RELEVANCY MAPPING

      
Application Number 18116084
Status Pending
Filing Date 2023-03-01
First Publication Date 2023-09-07
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Hayat, Adi
  • Barlev, Jonathan

Abstract

Systems and methods are provided for generating a crowd-sourced map for use in vehicle navigation. In one implementation, a system may include at least one processor configured to receive drive information collected from vehicles that traversed a junction; aggregate the received drive information to determine positions of traffic lights and spline representations for drivable paths; input the determined positions and the spline representations to a trained model configured to generate a traffic light relevancy mapping indicating a traffic light relevancy for traffic light to drivable path pairs of the junction; input an observed vehicle behavior to the at least one trained model to generate an updated traffic light relevancy mapping; store in the crowd-sourced map the indicators of traffic light relevancy for the traffic light to drivable path pairs; and transmit the crowd-sourced map to a vehicle for use in navigating the road segment.

IPC Classes

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

25.

UPDATING SOFTWARE ELEMENTS WITH DIFFERENT TRUST LEVELS

      
Application Number 18189302
Status Pending
Filing Date 2023-03-24
First Publication Date 2023-08-17
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Ben-Avi, Eran

Abstract

Techniques are disclosed for updating a trusted software and another software, which may include receiving a software update package that comprises a trusted updated software component, a trusted updated software booting metadata, another updated software component, and another updated software booting metadata. The trusted updated software component may belong to an updated version of the trusted software, and the trusted software may have a certain trust level. The other updated software component may belong to an updated version of the other software, and the other software may have a trust level that is lower than the certain trust level. At least a part of the trusted updated software booting metadata may comprise retrieval information for retrieving, during a booting process, at least a portion of the other software booting metadata.

IPC Classes

  • G06F 8/65 - Updates
  • G06F 21/54 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity, buffer overflow or preventing unwanted data erasure by adding security routines or objects to programs
  • G06F 21/57 - Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities

26.

DETERMINING ROAD LOCATION OF A TARGET VEHICLE BASED ON TRACKED TRAJECTORY

      
Application Number 18127533
Status Pending
Filing Date 2023-03-28
First Publication Date 2023-07-27
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Stein, Gideon

Abstract

Systems and methods are provided for navigating a host vehicle. In an embodiment, a processing device may be configured to receive images captured over a time period; analyze the images to identify a target vehicle; receive map information including a plurality of target trajectories; determine, based on analysis of the images, first and second estimated positions of the target vehicle within the time period; determine, based on the first and second estimated positions, a trajectory of the target vehicle over the time period; compare the determined trajectory to the plurality of target trajectories to identify a target trajectory traversed by the target vehicle; determine, based on the identified target trajectory, a position of the target vehicle; and determine a navigational action for the host vehicle based on the determined position.
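
A hypothetical sketch of the trajectory-matching step: score each mapped target trajectory by how close the observed target-vehicle positions lie to it and pick the closest. The polyline representation and the coarse vertex-distance metric are assumptions made for brevity.

```python
# Hypothetical sketch: identify which mapped target trajectory the observed
# target-vehicle positions most closely follow (average closest-point distance).
import math

def point_to_polyline(p, polyline):
    """Distance from point p to the nearest vertex of a polyline (coarse but simple)."""
    return min(math.dist(p, q) for q in polyline)

def closest_target_trajectory(observed_points, target_trajectories):
    def score(traj):
        return sum(point_to_polyline(p, traj) for p in observed_points) / len(observed_points)
    return min(target_trajectories, key=score)

# Invented map data: two lane-center trajectories through a junction.
through_lane = [(0, 0), (0, 10), (0, 20), (0, 30)]
turn_lane    = [(0, 0), (1, 8), (5, 14), (12, 18)]

observed = [(0.4, 9.5), (4.6, 13.8)]     # first and second estimated positions
print(closest_target_trajectory(observed, [through_lane, turn_lane]) is turn_lane)  # True
```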

IPC Classes

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G05D 1/02 - Control of position or course in two dimensions
  • G08G 1/16 - Anti-collision systems
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/772 - Determining representative reference patterns, e.g. averaging or distorting patterns; Generating dictionaries
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
  • G01C 21/36 - Input/output arrangements for on-board computers
  • G01C 21/32 - Structuring or formatting of map data

27.

SYSTEMS AND METHODS FOR COMMON SPEED MAPPING AND NAVIGATION

      
Application Number 18129293
Status Pending
Filing Date 2023-03-31
First Publication Date 2023-07-27
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Heilbron, Adina
  • Tzigelman, Shalom
  • Yuval, Amitai

Abstract

A system for collecting and distributing navigation information relative to a road segment is disclosed. In one embodiment, the system includes at least one processor programmed to receive drive information collected from each of a plurality of vehicles that traversed the road segment, wherein the drive information received from each of the plurality of vehicles includes indicators of speed traveled by one of the plurality of vehicles during a drive traversing the road segment; determine, based on the indicators of speed included in the drive information received from each of the plurality of vehicles, at least one aggregated common speed profile for the road segment; store the at least one aggregated common speed profile in an autonomous vehicle road navigation model associated with the road segment; and distribute the autonomous vehicle road navigation model to one or more autonomous vehicles for use in navigating along the road segment.
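
One plausible (hypothetical) way to form an aggregated common speed profile is to bin each drive's speed samples by distance along the segment and take a per-bin median, as sketched below; the sample data and bin size are invented.

```python
# Hypothetical sketch: build an aggregated common speed profile by binning each
# drive's (distance_along_segment_m, speed_mps) samples and taking the median.
import statistics
from collections import defaultdict

drives = [   # invented samples from three vehicles traversing the same segment
    [(0, 13.0), (50, 12.0), (100, 8.0)],
    [(0, 14.0), (50, 11.5), (100, 7.5)],
    [(0, 12.5), (50, 12.5), (100, 9.0)],
]

def common_speed_profile(drives, bin_size_m=50):
    bins = defaultdict(list)
    for drive in drives:
        for dist_m, speed in drive:
            bins[dist_m // bin_size_m * bin_size_m].append(speed)
    return {start: round(statistics.median(v), 1) for start, v in sorted(bins.items())}

print(common_speed_profile(drives))   # {0: 13.0, 50: 12.0, 100: 8.0}
```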

IPC Classes

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

28.

Display screen with animated graphical user interface

      
Application Number 29748233
Grant Number D0992597
Status In Force
Filing Date 2020-08-27
First Publication Date 2023-07-18
Grant Date 2023-07-18
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Walter, Samantha
  • Gofberg, Judith
  • Seri-Levi, Shira

29.

TRAFFIC LIGHT ORIENTED NETWORK

      
Application Number 18093219
Status Pending
Filing Date 2023-01-04
First Publication Date 2023-07-06
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Hendler, Abraham
  • Saban, Orit
  • Hochbaum, Tomer
  • Daybog, Itay
  • Hochman, Yuval

Abstract

A navigation system for a host vehicle may include at least one processor comprising circuitry and a memory. The memory may include instructions that when executed by the circuitry cause the at least one processor to receive from an image capture device associated with the host vehicle a captured image representative of an environment of the host vehicle, to identify a first segment of the captured image associated with a traffic light, to provide the first segment of the captured image to a first trained network, the first trained network being configured to generate a first output indicative of a state of the traffic light, to identify a second segment of the captured image that includes contextual information associated with the traffic light, to provide the second segment to a second trained network, the second trained network being configured to generate a second output indicative of a proposed navigational action for the host vehicle relative to the traffic light, to determine, based on both the first output from the first trained network and the second output from the second trained network, a planned navigational action for the host vehicle, and to cause the host vehicle to take the planned navigational action.

IPC Classes

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

30.

CROWDSOURCED TURN INDICATORS

      
Application Number 18093226
Status Pending
Filing Date 2023-01-04
First Publication Date 2023-07-06
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Nehushtan, Nimrod
  • Amir, Arbel
  • Eshet, Tomer
  • Viente, Kfir

Abstract

A server-based system for generating a map for storing a turn signal activation location along a road segment may include at least one processor comprising circuitry and a memory. The memory may include instructions that when executed by the circuitry cause the at least one processor to receive drive information from each of a plurality of vehicles that traversed a road segment, wherein the drive information includes turn signal activation information indicating a detected change in state of a turn signal of at least one target vehicle and a location where the detected change in state of the turn signal of the target vehicle occurred; aggregate the turn signal activation information from two or more of the plurality of vehicles to generate a refined location of a turn signal activation location associated with the road segment; store an indicator of the refined location of the turn signal activation location in a map; and distribute the map to one or more vehicles that later traverse the road segment.
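
A hypothetical sketch of the refinement step: cluster reported activation locations that fall within a small radius and use each cluster's mean as the refined location; the coordinates, radius, and clustering rule are assumptions.

```python
# Hypothetical sketch: refine a turn-signal activation location by averaging
# reports from different vehicles that fall within a small radius of each other.
import math

reports_m = [(102.0, 4.1), (104.5, 3.9), (98.7, 4.3), (240.0, 3.8)]  # (x, y) in metres

def refine_locations(reports, radius_m=10.0):
    clusters = []
    for p in reports:
        for c in clusters:
            cx, cy = sum(q[0] for q in c) / len(c), sum(q[1] for q in c) / len(c)
            if math.hypot(p[0] - cx, p[1] - cy) <= radius_m:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [(round(sum(q[0] for q in c) / len(c), 1),
             round(sum(q[1] for q in c) / len(c), 1), len(c)) for c in clusters]

# Three nearby reports merge into one refined location; the distant one stays separate.
print(refine_locations(reports_m))   # [(101.7, 4.1, 3), (240.0, 3.8, 1)]
```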

IPC Classes

  • B60Q 1/34 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating change of drive direction
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled

31.

CALCULATING VEHICLE SPEED FOR A ROAD CURVE

      
Application Number 18090927
Status Pending
Filing Date 2022-12-29
First Publication Date 2023-06-29
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Shalev-Shwartz, Shai
  • Molnar, Moran
  • Granot, Ilai
  • Weiser, Doron

Abstract

Systems and methods for navigating a host vehicle are disclosed. In one implementation, a system includes a processor configured to receive from a camera onboard the host vehicle a captured image representative of an environment of the host vehicle. The captured image is provided to a trained system. The trained system is configured to infer, from the captured image, an output indicating a presence of a curved road segment, wherein the curved road segment is associated with a road on which the host vehicle is traveling. The processor is configured to receive the output provided by the trained system. The output includes at least one speed value for the host vehicle. The at least one speed value output from the trained system is based on a proximity of the host vehicle to the curved road segment and based on at least one characteristic of the curved road segment represented in the captured image. The processor is configured to cause the host vehicle to take at least one navigational action based on the determined at least one speed value.

IPC Classes

  • B60W 30/14 - Cruise control
  • B60W 40/072 - Curvature of the road
  • B60W 40/105 - Speed
  • B60W 40/107 - Longitudinal acceleration
  • B60W 40/13 - Load or weight
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

32.

SYSTEMS AND METHODS FOR PERFORMING NEURAL NETWORK OPERATIONS

      
Application Number 18111661
Status Pending
Filing Date 2023-02-20
First Publication Date 2023-06-29
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Agam, Oren

Abstract

A method for retrieving neural network coefficients may include executing neural network operations and storing, in at least one data memory, one or more intermediate results of the neural network operations. The method may also include retrieving, in an iterative manner, subsets of neural network coefficients related to a particular layer of a neural network associated with at least one of the neural network processors. Different ones of the neural network processors may use at least one of the subsets of the neural network coefficients. The retrieving the subsets of neural network coefficients may include caching the subsets in coefficient cache memory. At least some of the subsets may be cached in the coefficient cache memory for up to a first duration, and at least some of the intermediate results may be stored in the at least one data memory for a duration that exceeds the first duration.

IPC Classes  ?

  • G06F 9/38 - Concurrent instruction execution, e.g. pipeline, look ahead
  • G06F 15/80 - Architectures of general purpose stored program computers comprising an array of processing units with common control, e.g. single instruction multiple data processors

33.

SYSTEM FOR MAPPING TRAFFIC LIGHTS AND ASSOCIATED TRAFFIC LIGHT CYCLE TIMES

      
Application Number 18175493
Status Pending
Filing Date 2023-02-27
First Publication Date 2023-06-29
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Huberman, David
  • Barzilay, Ouriel

Abstract

Systems and methods are provided for autonomous vehicle navigation. The systems and methods may map a lane mark, map a directional arrow, selectively harvest road information based on data quality, map road segment free spaces, map traffic lights and determine traffic light relevancy, and map traffic lights and associated traffic light cycle times.

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/18 - Propelling the vehicle
  • G01C 21/34 - Route searching; Route guidance
  • G01C 21/36 - Input/output arrangements for on-board computers
  • G08G 1/14 - Traffic control systems for road vehicles indicating individual free spaces in parking areas
  • G08G 1/00 - Traffic control systems for road vehicles
  • B60T 7/12 - Brake-action initiating means for initiation not subject to will of driver or passenger
  • B62D 6/00 - Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
  • G05D 1/02 - Control of position or course in two dimensions
  • G01C 21/30 - Map- or contour-matching
  • B60W 30/12 - Lane keeping
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G08G 1/07 - Controlling traffic signals
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

34.

Secure system that includes driving related systems

      
Application Number 18116982
Grant Number 11951998
Status In Force
Filing Date 2023-03-03
First Publication Date 2023-06-29
Grant Date 2024-04-09
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Smolyansky, Leonid
  • Arbeli, Yosi
  • Rushinek, Elchanan
  • Cohen, Shmuel

Abstract

A system that may include multiple driving related systems that are configured to perform driving related operations; a selection module; multiple fault collection and management units that are configured to monitor statuses of the multiple driving related systems and to report, to the selection module, at least one out of (a) an occurrence of at least one critical fault, (b) an absence of at least one critical fault, (c) an occurrence of at least one non-critical fault, and (d) an absence of at least one non-critical fault; and wherein the selection module is configured to respond to the report by performing at least one out of: (i) reset at least one entity out of the multiple fault collection and management units and the multiple driving related systems; and (ii) select data outputted from a driving related system.

IPC Classes  ?

  • B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
  • B60W 50/023 - Avoiding failures by using redundant parts
  • G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
  • G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time

35.

SYSTEMS AND METHODS FOR ANALYZING AND RESOLVING IMAGE BLOCKAGES

      
Application Number 18147304
Status Pending
Filing Date 2022-12-28
First Publication Date 2023-06-29
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Eldar, Avigdor
  • Springer, Ofer

Abstract

Systems and methods are provided for training and using a model to predict image blockages. In one implementation, a system may comprise at least one processor. The at least one processor may be programmed to obtain a plurality of training images, each of the plurality of training images being associated with a blockage indicator representing a presence of a blockage or an absence of a blockage; analyze intensities of pixels located at corresponding pixel coordinates of the plurality of training images; and cause the model to undergo at least one training process based on the plurality of training images, the blockage indicator associated with each of the plurality of training images, and the analysis of the intensities of the pixels.

IPC Classes  ?

  • G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/50 - Extraction of image or video features by summing image-intensity values; Projection analysis
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60Q 9/00 - Arrangement or adaptation of signal devices not provided for in one of main groups

36.

SYSTEMS AND METHODS FOR PROCESSING ATOMIC COMMANDS

      
Application Number 18110605
Status Pending
Filing Date 2023-02-16
First Publication Date 2023-06-22
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Agam, Oren
  • Atedgi, Liron
  • Galil, Eran
  • Kuch, Liron

Abstract

A method for executing atomic commands may include receiving, by an interface of an atomic command execution unit and from a plurality of requestors, a plurality of memory mapped atomic commands. The method may also include executing the plurality of memory mapped atomic commands to provide output values. The method may further include storing, in a first memory unit of the atomic command execution unit, requestor specific information. Different entries of a plurality of entries of the first memory unit may be allocated to different requestors of the plurality of requestors. The method may also include storing, in a second memory unit of the atomic command execution unit, the output values of the plurality of memory mapped atomic commands, and outputting, by the interface and to at least one of the plurality of requestors, at least one indication indicating a completion of at least one of the atomic commands.

IPC Classes  ?

  • G06F 9/46 - Multiprogramming arrangements
  • G06V 20/54 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats

37.

SYSTEMS AND METHODS FOR DECOMPRESSING NEURAL NETWORK COEFFICIENTS

      
Application Number 18109670
Status Pending
Filing Date 2023-02-14
First Publication Date 2023-06-22
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor Agam, Oren

Abstract

A method for decompressing data may include receiving a first sequence of bits and performing a plurality of iterations. Each of the plurality of iterations may include scanning bits of the first sequence, starting from a starting point, to search for at least one of a variable length codeword or a bypass indicator, the starting point being either a starting point of the first sequence or a starting point defined in a previous iteration. The method may also include, for at least one of the plurality of iterations, when a bypass indicator is found, outputting a neural network coefficient related value (NNCRV) that is non-compressed and follows the bypass indicator, and defining a starting point that follows the NNCRV as a starting point for a next iteration.
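
A minimal sketch of the scan-decode loop described above, assuming a toy prefix-free codebook, a '111' bypass indicator, and 8-bit non-compressed values; the actual codeword and value formats are not given in the abstract, so these choices are purely illustrative.

```python
# Illustrative bitstream layout (assumed, not from the patent):
#   - variable-length codewords from a small prefix-free codebook, or
#   - the bypass indicator '111' followed by an 8-bit non-compressed value.
CODEBOOK = {"0": 0, "10": 1, "110": -1}
BYPASS = "111"
RAW_BITS = 8

def decompress(bits: str) -> list[int]:
    values, pos = [], 0
    while pos < len(bits):
        if bits.startswith(BYPASS, pos):
            # Bypass: emit the non-compressed value that follows the indicator and
            # continue scanning from the bit right after it.
            raw = bits[pos + len(BYPASS): pos + len(BYPASS) + RAW_BITS]
            values.append(int(raw, 2))
            pos += len(BYPASS) + RAW_BITS
        else:
            # Otherwise match a prefix-free codeword starting at this position.
            for code, value in CODEBOOK.items():
                if bits.startswith(code, pos):
                    values.append(value)
                    pos += len(code)
                    break
            else:
                raise ValueError(f"no codeword found at bit offset {pos}")
    return values

# '10' -> 1, '111' + '00101010' -> bypassed raw value 42, '0' -> 0
print(decompress("10" + "111" + "00101010" + "0"))  # [1, 42, 0]
```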

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G06N 3/045 - Combinations of networks
  • G06N 3/0495 - Quantised networks; Sparse networks; Compressed networks
  • G05D 1/02 - Control of position or course in two dimensions

38.

SYSTEMS AND METHODS FOR MAP-BASED REAL-WORLD MODELING

      
Application Number 18112741
Status Pending
Filing Date 2023-02-22
First Publication Date 2023-06-22
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shenfeld, Moshe
  • Chapman, Ruth
  • Hanniel, Iddo
  • Hacohen, Yael
  • Golinsky, Ishay

Abstract

A system for correlating drive information from multiple road segments is disclosed. In one embodiment, the system includes memory and a processor configured to receive drive information from vehicles that traversed a first road segment and vehicles that traversed a second road segment. The processor is configured to correlate the drive information from the vehicles to provide a first road model segment representative of the first road segment and a second road model segment representative of the second road segment. The processor correlates the first road model segment with the second road model segment to provide a correlated road segment model if a drivable distance between a first point associated with the first road segment and a second point associated with the second road segment is less than or equal to a predetermined distance threshold, and stores the correlated road segment model as part of a sparse navigational map.
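
The correlation condition described above can be illustrated with a short sketch in which two polyline road model segments are joined when the gap between them is within a threshold; approximating the drivable distance by the straight-line distance between endpoint coordinates is a simplifying assumption for illustration.

```python
import math

def correlate_segments(first_segment, second_segment, max_drivable_distance_m=50.0):
    """Join two road model segments into one correlated model if the (approximate)
    drivable distance between their endpoints is within the threshold.

    Segments are lists of (x, y) points in metres; using the straight-line gap as
    a stand-in for drivable distance is a simplification.
    """
    first_end, second_start = first_segment[-1], second_segment[0]
    gap = math.dist(first_end, second_start)
    if gap <= max_drivable_distance_m:
        return {"correlated": True, "points": first_segment + second_segment}
    return {"correlated": False, "points": None}

segment_a = [(0.0, 0.0), (100.0, 0.0)]
segment_b = [(130.0, 5.0), (230.0, 5.0)]
print(correlate_segments(segment_a, segment_b))          # gap ~30.4 m -> correlated
print(correlate_segments(segment_a, segment_b, 10.0))    # threshold too small -> not correlated
```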

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

39.

SYSTEMS AND METHODS FOR DYNAMIC HEADLIGHT LEVELING

      
Application Number 18151790
Status Pending
Filing Date 2023-01-09
First Publication Date 2023-06-15
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Hershkovich, Shai
  • Shenfeld, Moshe
  • Caplan, Dmitri

Abstract

A system for navigating a host vehicle may include memory and at least one processor configured to receive a plurality of images acquired by a camera onboard the host vehicle; generate, based on analysis of the plurality of images, a road geometry model for a segment of road forward of the host vehicle; determine, based on analysis of at least one of the plurality of images, one or more indicators of an orientation of the host vehicle; and generate, based on the one or more indicators of orientation of the host vehicle and the road geometry model for the segment of road forward of the host vehicle, one or more output signals configured to cause a change in a pointing direction of a movable headlight onboard the host vehicle.

IPC Classes  ?

  • B60Q 1/08 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle

40.

Methods and devices for triggering vehicular actions based on passenger actions

      
Application Number 18077258
Grant Number 11932249
Status In Force
Filing Date 2022-12-08
First Publication Date 2023-06-08
Grant Date 2024-03-19
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Chao, Min-An
  • Kose Cihangir, Neslihan
  • Rosales, Rafael

Abstract

Autonomous driving system methods and devices which trigger vehicular actions based on the monitoring of one or more occupants of a vehicle are presented. The methods, and corresponding devices, may include identifying a plurality of features in a plurality of subsets of image data detailing the one or more occupants; tracking changes over time of the plurality of features over the plurality of subsets of image data; determining a state, from a plurality of states, of the one or more occupants based on the tracked changes; and triggering the vehicular action based on the determined state.

IPC Classes  ?

  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06T 7/50 - Depth or shape recovery
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 20/40 - Scenes; Scene-specific elements in video content
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions

41.

NAVIGATION SYSTEMS AND METHODS FOR DETERMINING OBJECT DIMENSIONS

      
Application Number 17616719
Status Pending
Filing Date 2020-12-31
First Publication Date 2023-06-08
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Shambik, Yaakov
  • Chitrit, Ohad
  • Sachter, Michael
  • Netser, Alon

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a navigation system for a host vehicle may comprise at least one processor. The processor may be programmed to receive from a camera onboard the host vehicle a plurality of captured images representative of an environment of the host vehicle. The processor may provide each of the plurality of captured images to a target object analysis module including at least one trained model configured to generate an output for each of the plurality of captured images. The processor may receive from the target object analysis module the generated output. The processor may further determine at least one navigational action to be taken by the host vehicle based on the output generated by the target object analysis module. The processor may cause the at least one navigational action to be taken by the host vehicle.

IPC Classes  ?

  • G01C 21/36 - Input/output arrangements for on-board computers
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

42.

SYSTEMS AND METHODS FOR NAVIGATING A VEHICLE

      
Application Number 18104048
Status Pending
Filing Date 2023-01-31
First Publication Date 2023-06-01
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shalev-Shwartz, Shai
  • Shammah, Shaked
  • Shashua, Amnon

Abstract

An autonomous system may selectively displace human driver control of a host vehicle. The system may receive an image representative of an environment of the host vehicle and detect an obstacle in the environment of the host vehicle based on analysis of the image. The system may monitor a driver input to a throttle, brake, and/or steering control associated with the host vehicle. The system may determine whether the driver input would result in the host vehicle navigating within a proximity buffer relative to the obstacle. If the driver input would not result in the host vehicle navigating within the proximity buffer, the system may allow the driver input to cause a corresponding change in one or more host vehicle motion control systems. If the driver input would result in the host vehicle navigating within the proximity buffer, the system may prevent the driver input from causing the corresponding change.
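
A highly simplified, one-dimensional sketch of the proximity-buffer check described above: the requested driver input is applied over a short prediction horizon and passed through only if the predicted gap to the obstacle stays outside the buffer. The constant-speed obstacle model, horizon, and buffer size are assumptions for illustration, not the patented logic.

```python
def allow_driver_input(distance_to_obstacle_m: float,
                       host_speed_mps: float,
                       obstacle_speed_mps: float,
                       driver_accel_mps2: float,
                       proximity_buffer_m: float = 10.0,
                       horizon_s: float = 2.0) -> bool:
    """Return True if the driver's throttle/brake input may be passed through.

    A deliberately simple 1-D prediction: apply the requested acceleration over a
    short horizon and check whether the gap to the obstacle would shrink below
    the proximity buffer.
    """
    predicted_host_travel = host_speed_mps * horizon_s + 0.5 * driver_accel_mps2 * horizon_s ** 2
    predicted_obstacle_travel = obstacle_speed_mps * horizon_s
    predicted_gap = distance_to_obstacle_m + predicted_obstacle_travel - predicted_host_travel
    return predicted_gap > proximity_buffer_m

# Driver requests mild acceleration toward a slower lead vehicle.
print(allow_driver_input(30.0, 20.0, 15.0, driver_accel_mps2=1.0))   # gap stays ~18 m -> allowed
print(allow_driver_input(12.0, 20.0, 15.0, driver_accel_mps2=1.0))   # gap would enter buffer -> blocked
```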

IPC Classes  ?

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 40/06 - Road conditions
  • B60W 40/105 - Speed
  • B60W 50/08 - Interaction between the driver and the control system
  • B60W 30/18 - Propelling the vehicle
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 50/12 - Limiting control by the driver depending on vehicle state, e.g. interlocking means for the control input for preventing unsafe operation
  • B60W 10/06 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units including control of combustion engines
  • B60W 30/165 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"

43.

SYSTEMS AND METHODS FOR VEHICLE SPEED AND LATERAL POSITION CONTROL

      
Application Number 18094774
Status Pending
Filing Date 2023-01-09
First Publication Date 2023-05-25
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Hershkovich, Shai
  • Shenfeld, Moshe
  • Caplan, Dmitri

Abstract

A system for navigating a host vehicle may include memory and at least one processor configured to receive a plurality of images acquired by a camera onboard the host vehicle; generate, based on analysis of the plurality of images, a road geometry model for a segment of road forward of the host vehicle; determine, based on analysis of at least one of the plurality of images, one or more indicators of an orientation of the host vehicle; and generate, based on the one or more indicators of orientation of the host vehicle and the road geometry model for the segment of road forward of the host vehicle, one or more output signals configured to cause a change in a pointing direction of a movable headlight onboard the host vehicle.

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/60 - Analysis of geometric attributes
  • B60W 30/18 - Propelling the vehicle

44.

SYSTEMS AND METHODS FOR LOCAL HORIZON AND OCCLUDED ROAD SEGMENT DETECTION

      
Application Number 18151945
Status Pending
Filing Date 2023-01-09
First Publication Date 2023-05-25
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Hershkovich, Shai
  • Shenfeld, Moshe
  • Caplan, Dmitri

Abstract

A system for navigating a host vehicle may include memory and at least one processor configured to receive a plurality of images acquired by a camera onboard the host vehicle; generate, based on analysis of the plurality of images, a road geometry model for a segment of road forward of the host vehicle; determine, based on analysis of at least one of the plurality of images, one or more indicators of an orientation of the host vehicle; and generate, based on the one or more indicators of orientation of the host vehicle and the road geometry model for the segment of road forward of the host vehicle, one or more output signals configured to cause a change in a pointing direction of a movable headlight onboard the host vehicle.

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G01C 21/30 - Map- or contour-matching
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

45.

Vehicle environment modeling with a camera

      
Application Number 18095626
Grant Number 11869253
Status In Force
Filing Date 2023-01-11
First Publication Date 2023-05-25
Grant Date 2024-01-09
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Stein, Gideon
  • Blumenthal, Itay
  • Moskowitz, Jeffrey
  • Shaag, Nadav
  • Carlebach, Natalie

Abstract

System and techniques for vehicle environment modeling with a camera are described herein. A device for modeling an environment comprises: a hardware sensor interface to obtain a sequence of unrectified images representative of a road environment, the sequence of unrectified images including a first unrectified image, a previous unrectified image, and a previous-previous unrectified image; and processing circuitry to: provide the first unrectified image, the previous unrectified image, and the previous-previous unrectified image to an artificial neural network (ANN) to produce a three-dimensional structure of a scene; determine a selected homography; and apply the selected homography to the three-dimensional structure of the scene to create a model of the road environment.

IPC Classes  ?

  • G06T 7/00 - Image analysis
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06T 7/50 - Depth or shape recovery
  • B60W 40/06 - Road conditions

46.

SYSTEMS AND METHODS FOR HARVESTING IMAGES FOR VEHICLE NAVIGATION

      
Application Number 17978258
Status Pending
Filing Date 2022-11-01
First Publication Date 2023-05-04
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Guberman, Yahel
  • Fridman, Ofer
  • Ben Shalom, Itai
  • Barahovsky, Ilia
  • Schwartz, Maxim
  • Gdalyahu, Yoram

Abstract

Systems and methods for harvesting images for vehicle navigation are disclosed. In one implementation, a system includes a processor configured to receive an image having been captured by a camera of a vehicle from an environment of the vehicle during a first time period; during a second time period after the first time period: use a first object model to identify a representation of a first object in the image and localize the image relative to a map database based on the representation of the first object; and store the image in the map database based on the localization; and during a third time period after the second time period, use a second object model to identify a representation of a second object in the image, wherein the second object model was not available to the system during the second time period.

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/60 - Type of objects
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

47.

Stereo-assist network for determining an object's location

      
Application Number 17974789
Grant Number 11858504
Status In Force
Filing Date 2022-10-27
First Publication Date 2023-05-04
Grant Date 2024-01-02
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shaag, Nadav
  • Springer, Ofer

Abstract

Systems and methods for navigating a host vehicle are disclosed. In one implementation, a system includes a processor configured to receive a first image acquired by a first camera and a second image acquired by a second camera onboard the host vehicle; identify a first representation of an object in the first image and a second representation of the object in the second image; input to a first trained model at least a portion of the first image; input to a second trained model at least a portion of the second image; receive the first signature encoding determined by the first trained model and the second signature encoding determined by the second trained model; input to a third trained model the first signature encoding and the second signature encoding; and receive an indicator of a location of the object determined by the third trained model.

IPC Classes  ?

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • B60W 30/095 - Predicting travel path or likelihood of collision

48.

CONTROL LOOP FOR NAVIGATING A VEHICLE

      
Application Number 17905302
Status Pending
Filing Date 2021-04-01
First Publication Date 2023-04-27
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Molnar, Moran
  • Selig, Ilan
  • Rojas, Ronen

Abstract

A system for navigating a vehicle may include a processor programmed to receive an output provided by a vehicle sensor, and determine a navigational maneuver for the vehicle along a road segment based on the output provided by the vehicle sensor. The processor may also be programmed to determine a yaw rate command and a speed command for implementing the navigational maneuver. The processor may also be programmed to determine a first vehicle steering angle based on the yaw rate and speed commands using a first control subsystem, and determine a second vehicle steering angle based on the yaw rate and speed commands using a second control subsystem. The processor may further be programmed to determine an overall steering command for the vehicle based on a combination of the first and second steering angles, and cause an actuator associated with the vehicle to implement the overall steering command.
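
One way to picture the two-subsystem steering arrangement described above is the sketch below, where a kinematic feedforward term and a yaw-rate feedback term are blended into a single steering command. The bicycle-model mapping, gains, and blend weight are illustrative assumptions, not the patented control law.

```python
import math

def steering_from_kinematics(yaw_rate_cmd, speed_cmd, wheelbase_m=2.8):
    """First control subsystem: kinematic bicycle-model mapping from yaw-rate and
    speed commands to a steering angle (radians)."""
    if speed_cmd <= 0.0:
        return 0.0
    return math.atan(wheelbase_m * yaw_rate_cmd / speed_cmd)

def steering_from_feedback(yaw_rate_cmd, measured_yaw_rate, gain=0.5):
    """Second control subsystem: proportional correction on the yaw-rate error."""
    return gain * (yaw_rate_cmd - measured_yaw_rate)

def overall_steering(yaw_rate_cmd, speed_cmd, measured_yaw_rate, blend=0.7):
    """Blend the two sub-system outputs into a single steering command that would
    be sent to the steering actuator."""
    feedforward = steering_from_kinematics(yaw_rate_cmd, speed_cmd)
    feedback = steering_from_feedback(yaw_rate_cmd, measured_yaw_rate)
    return blend * feedforward + (1.0 - blend) * feedback

# Example: a gentle left turn commanded at 15 m/s with a small yaw-rate error.
print(overall_steering(yaw_rate_cmd=0.1, speed_cmd=15.0, measured_yaw_rate=0.08))
```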

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 40/12 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to parameters of the vehicle itself
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion

49.

EGO MOTION-BASED ONLINE CALIBRATION BETWEEN COORDINATE SYSTEMS

      
Application Number 18086251
Status Pending
Filing Date 2022-12-21
First Publication Date 2023-04-20
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Molad, Assaf
  • Shenfeld, Moshe
  • Hochman, Amit
  • Shachor, Haim
  • Stern, Yotam
  • Sharon, Yuval

Abstract

The present disclosure relates to systems and methods for calibrating a multi-camera navigation system for a vehicle. In one implementation, at least one processing device may receive first and second image frames acquired by a first camera onboard the vehicle; receive first and second image frames acquired by a second camera onboard the vehicle; determine a first ego-motion signal, including an indication of a change in position of the first camera relative to capture times associated with the first and second image frames acquired by the first camera; determine a second ego-motion signal, including an indication of a change in position of the second camera relative to capture times associated with the first and second image frames acquired by the second camera; and determine a relative orientation between the first camera and the second camera based on the first ego-motion signal and the second ego-motion signal.

IPC Classes  ?

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/20 - Analysis of motion
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

50.

A MULTI-PART COMPARE AND EXCHANGE OPERATION

      
Application Number 17915821
Status Pending
Filing Date 2021-04-01
First Publication Date 2023-04-20
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Galil, Eran
  • Ouriel, Boaz

Abstract

A method for executing an atomic compare and exchange operation, the method may include processing a compare command and a conditional exchange command while considering hardware failures.
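
As a loose illustration only, the sketch below shows an atomic compare-and-exchange with a read-back verification step as one conceivable way to account for hardware write failures; the patent's actual fault-handling scheme is not described in the abstract, so everything here is assumed.

```python
import threading

_lock = threading.Lock()

def compare_and_exchange(memory: dict, key, expected, new_value, verify: bool = True):
    """Atomic compare-and-exchange: write new_value only if the current value
    equals expected. The optional read-back verification is one simple way a
    system might guard against a transient hardware write failure."""
    with _lock:                                   # atomicity for this sketch
        current = memory.get(key)
        if current != expected:
            return False, current                 # compare failed: report current value
        memory[key] = new_value
        if verify and memory.get(key) != new_value:
            raise RuntimeError("hardware failure suspected: write-back mismatch")
        return True, new_value

shared = {"counter": 7}
print(compare_and_exchange(shared, "counter", expected=7, new_value=8))   # (True, 8)
print(compare_and_exchange(shared, "counter", expected=7, new_value=9))   # (False, 8)
```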

IPC Classes  ?

  • G06F 11/14 - Error detection or correction of the data by redundancy in operation, e.g. by using different operation sequences leading to the same result
  • G06F 9/30 - Arrangements for executing machine instructions, e.g. instruction decode

51.

Computer-assisted driving method and apparatus including automatic mitigation of potential emergency

      
Application Number 18077110
Grant Number 11952003
Status In Force
Filing Date 2022-12-07
First Publication Date 2023-04-13
Grant Date 2024-04-09
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Tatourian, Igor
  • Moustafa, Hassnaa
  • Zage, David

Abstract

Apparatuses, storage media and methods associated with computer assisted driving (CAD), are disclosed herein. In some embodiments, an apparatus includes an autopilot engine to automatically pilot the vehicle out of a potential emergency situation, including to automatically pilot the vehicle for a period of time in view of a plurality of operational guardrails determined in real time for each of a plurality of timing windows; and a mitigation unit to conditionally activate the autopilot engine, including to activate the autopilot engine in response to analysis results indicative of the vehicle being potentially operated into the emergency situation manually. Other embodiments are also described and claimed.

IPC Classes  ?

  • B60W 50/12 - Limiting control by the driver depending on vehicle state, e.g. interlocking means for the control input for preventing unsafe operation
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

52.

SYSTEMS AND METHODS FOR DETECTING VEHICLE WHEEL SLIPS

      
Application Number 17756857
Status Pending
Filing Date 2021-06-22
First Publication Date 2023-04-06
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shenfeld, Moshe
  • Oskar, Gilad

Abstract

The present disclosure relates to systems and methods for identifying a wheel slip condition. In one implementation, a processor may receive a plurality of image frames acquired by an image capture device of a vehicle. The processor may also determine, based on analysis of the images, one or more indicators of a motion of the vehicle; and determine a predicted wheel rotation corresponding to the motion of the vehicle. The processor may further receive sensor outputs indicative of measured wheel rotation associated with a wheel; and compare the predicted wheel rotation to the measured wheel rotation for the wheel. The processor may additionally detect a wheel slip condition associated with the wheel based on a discrepancy between the predicted wheel rotation and the measured wheel rotation; and initiate at least one navigational action in response to the detected wheel slip condition associated with the wheel.
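
The discrepancy check described above can be sketched as follows, assuming an image-derived vehicle speed, a known wheel radius, and a wheel-speed sensor reporting RPM; the tolerance threshold and units are illustrative assumptions.

```python
import math

def detect_wheel_slip(visual_speed_mps: float,
                      measured_wheel_rpm: float,
                      wheel_radius_m: float = 0.32,
                      tolerance: float = 0.10) -> bool:
    """Compare wheel rotation predicted from image-based ego motion with the
    rotation actually measured by the wheel-speed sensor.

    A relative discrepancy above `tolerance` is reported as a slip condition.
    """
    predicted_rpm = (visual_speed_mps / (2.0 * math.pi * wheel_radius_m)) * 60.0
    if predicted_rpm == 0.0:
        return measured_wheel_rpm > 0.0
    discrepancy = abs(measured_wheel_rpm - predicted_rpm) / predicted_rpm
    return discrepancy > tolerance

# The camera-based motion estimate says ~10 m/s (~298 rpm expected).
print(detect_wheel_slip(visual_speed_mps=10.0, measured_wheel_rpm=400.0))  # True (slip)
print(detect_wheel_slip(visual_speed_mps=10.0, measured_wheel_rpm=300.0))  # False
```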

IPC Classes  ?

  • G01C 21/34 - Route searching; Route guidance
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • B60W 30/02 - Control of vehicle driving stability
  • B60W 30/18 - Propelling the vehicle
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 10/22 - Conjoint control of vehicle sub-units of different type or different function including control of suspension systems
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention

53.

FLOW CONTROL INTEGRITY

      
Application Number 17439125
Status Pending
Filing Date 2021-03-31
First Publication Date 2023-03-23
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Kipnis, Aviad

Abstract

A method for evaluating flow control integrity, the method may include detecting that a flow reached a flow change command or is about to reach the flow change command, wherein the flow change command belongs to a current software environment, wherein the current software environment is identified by a current environment identifier; retrieving a shadow environment identifier that is a last environment identifier stored in a shadow stack, wherein the shadow environment identifier identifies a software environment having an entry region that was a last entry region accessed by the flow, wherein the entry region comprises a shadow stack update instruction that was executed by the flow; comparing the shadow environment identifier to the current environment identifier; and detecting a potential attack when the shadow environment identifier differs from the current environment identifier.
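
A minimal sketch of the shadow-stack comparison described above, using a Python list as the shadow stack and string environment identifiers; the entry-region update and the mismatch check are shown in simplified form, and all identifiers are hypothetical.

```python
shadow_stack: list[str] = []

def enter_environment(environment_id: str) -> None:
    """Shadow-stack update executed in the entry region of a software environment."""
    shadow_stack.append(environment_id)

def check_flow_change(current_environment_id: str) -> None:
    """Executed when a flow reaches (or is about to reach) a flow-change command:
    compare the last recorded environment with the environment the flow claims
    to be in, and flag a potential attack on mismatch."""
    shadow_environment_id = shadow_stack[-1] if shadow_stack else None
    if shadow_environment_id != current_environment_id:
        raise RuntimeError(
            f"potential control-flow attack: shadow={shadow_environment_id!r}, "
            f"current={current_environment_id!r}"
        )

enter_environment("task_scheduler")
check_flow_change("task_scheduler")        # consistent -> no error
try:
    check_flow_change("crypto_service")    # flow claims a different environment
except RuntimeError as err:
    print(err)
```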

IPC Classes  ?

  • G06F 21/52 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity, buffer overflow or preventing unwanted data erasure
  • G06F 21/55 - Detecting local intrusion or implementing counter-measures

54.

AUTOMATIC TEST GENERATION FOR HIGHLY COMPLEX EXISTING SOFTWARE

      
Application Number 17612670
Status Pending
Filing Date 2021-03-11
First Publication Date 2023-03-23
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Gurvitz, Elya
  • Kimelman, Amit
  • Fabris, Simone

Abstract

Techniques are disclosed for the generation of automatic software tests for complex software systems, such as operating systems (OS) and/or systems that may be implemented as part of an autonomous vehicle (AV) or advanced driving assistance system (ADAS). The technique generates tests using a tool, such as a stressor, which stresses a particular system under test in multiple ways. For every run of the stressor, the functions of the system that are invoked during the test are captured. A check is then performed to determine if this set of functions corresponds to one of the test scenarios for which testing is desired. If the set of functions that were invoked matches the set of functions that defines the test, then the configuration of the stressor is stored, and this stressor configuration is considered as the test for a particular scenario.
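
The test-generation loop described above can be sketched as a search over stressor configurations, keeping a configuration as the stored test when the set of invoked functions matches the set that defines the scenario. The stressor stand-in, configuration fields, and function names below are hypothetical.

```python
import random

def run_stressor(config: dict) -> set[str]:
    """Stand-in for running the system under test with a given stressor
    configuration and capturing which functions were invoked."""
    invoked = {"scheduler_tick"}
    if config["spawn_threads"]:
        invoked |= {"thread_create", "mutex_lock"}
    if config["allocate_memory"]:
        invoked |= {"heap_alloc", "heap_free"}
    return invoked

def generate_test(target_functions: set[str], attempts: int = 100, seed: int = 0):
    """Search random stressor configurations until one exercises exactly the
    function set that defines the desired test scenario."""
    rng = random.Random(seed)
    for _ in range(attempts):
        config = {"spawn_threads": rng.choice([True, False]),
                  "allocate_memory": rng.choice([True, False])}
        if run_stressor(config) == target_functions:
            return config          # this configuration is stored as the test
    return None

scenario = {"scheduler_tick", "thread_create", "mutex_lock"}
print(generate_test(scenario))     # e.g. {'spawn_threads': True, 'allocate_memory': False}
```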

IPC Classes  ?

  • G06F 11/36 - Preventing errors by testing or debugging of software

55.

Safety system for a vehicle

      
Application Number 17941044
Grant Number 11814052
Status In Force
Filing Date 2022-09-09
First Publication Date 2023-03-16
Grant Date 2023-11-14
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Gonzalez Aguirre, David Israel
  • Alvarez, Ignacio
  • Elli, Maria Soledad
  • Felip Leon, Javier
  • Turek, Javier

Abstract

A safety system for a vehicle may include one or more processors configured to determine, based on a friction prediction model, one or more predictive friction coefficients between the ground and one or more tires of the ground vehicle using first ground condition data and second ground condition data. The first ground condition data represent conditions of the ground at or near the position of the ground vehicle, and the second ground condition data represent conditions of the ground in front of the ground vehicle with respect to a driving direction of the ground vehicle. The one or more processors are further configured to determine driving conditions of the ground vehicle using the determined one or more predictive friction coefficients.

IPC Classes  ?

  • B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
  • B60W 40/068 - Road friction coefficient
  • B60W 40/10 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to vehicle motion
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

56.

Acceleration of data processing for object detection

      
Application Number 17957161
Grant Number 11887377
Status In Force
Filing Date 2022-09-30
First Publication Date 2023-03-16
Grant Date 2024-01-30
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Chattopadhyay, Rita
  • Martinez-Canales, Monica Lucia
  • Wolak, Tomasz J.

Abstract

Spatial data may be divided along an axis of the second dimension into a first data segment and a second data segment, such that the first data segment is limited to data points of the spatial data with second dimension coordinates within a first range and the second data segment is limited to data points of the spatial data with second dimension coordinates within a second range. A first processing element may execute an object detection process on the first data segment to generate a first list of objects within the first data segment. A second processing element may execute the object detection process on the second data segment to generate a second list of objects within the second data segment. A first set of objects detected in the first data segment may be combined with a second set of objects detected in the second data segment.
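
A small sketch of the split-and-combine flow described above: the point data is divided along its second dimension, the same detection routine runs on each segment in parallel, and the per-segment results are merged. The placeholder detector and the thread pool are illustrative choices standing in for dedicated processing elements.

```python
from concurrent.futures import ThreadPoolExecutor

def detect_objects(points):
    """Placeholder detector: tags every point as a detected object; a real
    pipeline would run clustering or a neural network on the segment."""
    return [{"x": x, "y": y} for x, y in points]

def split_along_second_dimension(points, boundary_y):
    """Divide the data along the second-dimension axis into two segments,
    mirroring the coordinate-range split in the abstract."""
    first = [p for p in points if p[1] < boundary_y]
    second = [p for p in points if p[1] >= boundary_y]
    return first, second

points = [(1.0, 0.5), (2.0, 3.5), (0.5, 1.2), (4.0, 6.0)]
first_segment, second_segment = split_along_second_dimension(points, boundary_y=2.0)

# The same object-detection process runs on each segment on its own processing
# element (here: worker threads), and the two object lists are then combined.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(detect_objects, [first_segment, second_segment]))

print(results[0] + results[1])
```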

IPC Classes  ?

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

57.

Methods and apparatus to provide accident avoidance information to passengers of autonomous vehicles

      
Application Number 17947792
Grant Number 11807273
Status In Force
Filing Date 2022-09-19
First Publication Date 2023-03-09
Grant Date 2023-11-07
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Yurdana, Matt
  • Weast, John
  • Alvarez, Ignacio

Abstract

Methods and apparatus to provide accident avoidance information to passengers of autonomous vehicles are disclosed. An example apparatus includes a safety analyzer to determine an autonomously controlled vehicle is in a safe situation at a first point in time and in a dangerous situation at a second point in time. The example apparatus also includes a user interface generator to: generate user interface data to define content for a user interface to be displayed via a screen, the user interface to include a graphical representation of the vehicle, the user interface to graphically indicate the vehicle is in the safe situation at the first point in time; and modify the user interface data so that the user interface graphically indicates the vehicle is in the dangerous situation at the second point in time.

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/08 - Predicting or avoiding probable or impending collision
  • B60W 50/16 - Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention

58.

ALIGNING ROAD INFORMATION FOR NAVIGATION

      
Application Number 17975258
Status Pending
Filing Date 2022-10-27
First Publication Date 2023-03-09
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shapira, Dori
  • Tubis, Igor
  • Rokni, Uri
  • Ratner, Nir

Abstract

The present disclosure relates to systems and methods for aligning navigation information from a plurality of vehicles. In one implementation, at least one processing device may receive first navigational information from a first vehicle and second navigational information from a second vehicle. The first and second navigational information may be associated with a common road segment. The processor may divide the common road segment into a first road section and a second road section that join at a common point. The processor may then align the first and second navigational information by rotating at least a portion of the first navigational information or the second navigational information relative to the common point. The processor may store the aligned navigational information in association with the common road segment and send the aligned navigational information to vehicles for use in navigating along the common road segment.
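
The rotation-based alignment described above can be pictured with the sketch below, which rotates one trace about the common point so that its heading matches the other trace; using heading agreement as the alignment criterion is an assumption made for illustration.

```python
import math

def rotate_about(points, pivot, angle_rad):
    """Rotate a list of (x, y) points about a pivot point by angle_rad."""
    px, py = pivot
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    rotated = []
    for x, y in points:
        dx, dy = x - px, y - py
        rotated.append((px + dx * cos_a - dy * sin_a,
                        py + dx * sin_a + dy * cos_a))
    return rotated

def align_headings(first_trace, second_trace, common_point):
    """Rotate the second trace about the common point so that its initial heading
    matches the final heading of the first trace (one simple alignment criterion)."""
    heading_first = math.atan2(first_trace[-1][1] - first_trace[-2][1],
                               first_trace[-1][0] - first_trace[-2][0])
    heading_second = math.atan2(second_trace[1][1] - second_trace[0][1],
                                second_trace[1][0] - second_trace[0][0])
    return rotate_about(second_trace, common_point, heading_first - heading_second)

first_trace = [(0.0, 0.0), (10.0, 0.0)]        # heading along +x
second_trace = [(10.0, 0.0), (17.0, 7.0)]      # heading 45 degrees off
print(align_headings(first_trace, second_trace, common_point=(10.0, 0.0)))
```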

IPC Classes  ?

  • G01C 21/32 - Structuring or formatting of map data
  • G05D 1/02 - Control of position or course in two dimensions

59.

Vehicle operation safety model compliance measurement

      
Application Number 17789712
Grant Number 11697406
Status In Force
Filing Date 2021-11-19
First Publication Date 2023-02-16
Grant Date 2023-07-11
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Guo, Ping
  • Zhang, Xinxin
  • Li, Fei
  • Wu, Xiangbin

Abstract

System and techniques for vehicle operation safety model (VOSM) compliance measurement are described herein. A subject vehicle is tested in a vehicle following scenario against VOSM parameter compliance. The test measures the subject vehicle activity during phases of the following scenario in which a lead vehicle slows and produces log data and calculations that form the basis of a VOSM compliance measurement.

IPC Classes  ?

  • B60T 17/22 - Devices for monitoring or checking brake systems; Signal devices
  • B60T 8/17 - Using electrical or electronic regulation means to control braking
  • B60T 8/171 - Detecting parameters used in the regulation; Measuring values used in the regulation

60.

APPLYING A TWO DIMENSIONAL (2D) KERNEL ON AN INPUT FEATURE MAP

      
Application Number 17884948
Status Pending
Filing Date 2022-08-10
First Publication Date 2023-02-16
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Weisel, Orly
  • Fais, Yaniv
  • Hirsch, Shira

Abstract

A method, an integrated circuit, and a computer readable medium that stores instructions for reducing IO traffic from a global or remote memory unit to a buffer of a neural network unit by using overlap rows of an input feature map tile.
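
A sketch of how overlap rows can reduce traffic from a global or remote memory to a local buffer: consecutive tiles of the input feature map share kernel_rows - 1 rows, so only the rows beyond the overlap need to be fetched for each new tile. The tile size and 3x3 kernel below are assumed for illustration.

```python
def tile_row_ranges(num_rows, tile_rows, kernel_rows):
    """Yield (start, end) row ranges for input-feature-map tiles; consecutive
    tiles overlap by kernel_rows - 1 rows so a kernel of that height can slide
    across the tile boundary without missing input rows."""
    overlap = kernel_rows - 1
    assert tile_rows > overlap, "tiles must be taller than the overlap"
    start = 0
    while True:
        end = min(start + tile_rows, num_rows)
        yield start, end
        if end == num_rows:
            break
        start = end - overlap

# A 16-row feature map, 6-row tiles, 3x3 kernel: only the rows beyond the overlap
# need to be fetched again from the global/remote memory for each new tile.
loaded_up_to = 0
for start, end in tile_row_ranges(num_rows=16, tile_rows=6, kernel_rows=3):
    newly_fetched = max(0, end - max(start, loaded_up_to))
    print(f"tile rows [{start:2d}, {end:2d})  newly fetched rows: {newly_fetched}")
    loaded_up_to = end
```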

IPC Classes  ?

  • G06N 3/08 - Learning methods
  • G06N 3/063 - Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means

61.

Vehicle operation safety model test system

      
Application Number 17789426
Grant Number 11772666
Status In Force
Filing Date 2021-12-17
First Publication Date 2023-02-09
Grant Date 2023-10-03
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Zhu, Qianying
  • Zhang, Lidan
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Li, Fei

Abstract

System and techniques for test scenario verification, for a simulation of an autonomous vehicle safety action, are described. In an example, measuring performance of a test scenario used in testing an autonomous driving safety requirement includes: defining a test environment for a test scenario that tests compliance with a safety requirement including a minimum safe distance requirement; identifying test procedures to use in the test scenario that define actions for testing the minimum safe distance requirement; identifying test parameters to use with the identified test procedures, such as velocity, amount of braking, timing of braking, and rate of acceleration or deceleration; and creating the test scenario for use in an autonomous driving test simulator. Use of the test scenario includes applying the identified test procedures and the identified test parameters to identify a response of a test vehicle to the minimum safe distance requirement.

IPC Classes  ?

  • B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • B60W 30/165 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"

62.

TECHNOLOGIES FOR PROVIDING A COGNITIVE CAPACITY TEST FOR AUTONOMOUS DRIVING

      
Application Number 17860741
Status Pending
Filing Date 2022-07-08
First Publication Date 2023-02-02
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Swan, Johanna
  • Azizi, Shahrnaz
  • Baskaran, Rajashree
  • Ortiz, Melissa
  • Adenwala, Fatema
  • Yu, Mengjie

Abstract

Technologies for providing a cognitive capacity test for autonomous driving include a compute device. The compute device includes circuitry that is configured to display content to a user, prompt the user with a message to turn the user's attention to another activity that requires situational awareness, receive a user response, and analyze the user response to determine an accuracy of the user response and a response time, wherein the accuracy and response time are indicative of a cognitive capacity of the user to assume control of an autonomous vehicle when the autonomous vehicle encounters a situation that the vehicle is unable to navigate.

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • A61B 5/16 - Devices for psychotechnics; Testing reaction times
  • A61B 5/18 - Devices for psychotechnics; Testing reaction times for vehicle drivers
  • A61B 5/00 - Measuring for diagnostic purposes ; Identification of persons
  • A63F 13/44 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
  • A63F 13/803 - Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks

63.

Map tile optimization based on tile connectivity

      
Application Number 17635635
Grant Number 11768085
Status In Force
Filing Date 2021-03-23
First Publication Date 2023-01-05
Grant Date 2023-09-26
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Schwartz, Maxim
  • Goldman, Yehonatan
  • Cohen, Asaf
  • Fisher, Amiel

Abstract

A system for vehicle navigation may include a processor including a circuitry and a memory. The memory may include instructions that when executed by the circuitry cause the processor to receive navigational information associated with the vehicle including an indicator of a location of the vehicle, and determine target navigational map segments to retrieve from a map database. The map database may include stored navigational map segments each corresponding to a real-world area. The determination of the target navigational map segments may be based on the indicator of vehicle location and on map segment connectivity information associated with the stored navigational map segments. The instructions may also cause the processor to initiate downloading of the target navigational map segments from the map database, and cause the vehicle to navigate along a target trajectory included in one or more of the target navigational map segments downloaded from the map database.
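
The connectivity-based selection described above can be sketched as a short graph traversal: starting from the tile containing the vehicle, download only tiles reachable within a few hops of the connectivity graph rather than every tile that is merely geographically nearby. The adjacency table, tile names, and hop limit below are hypothetical.

```python
from collections import deque

# Hypothetical map-segment connectivity information: segment id -> reachable segments.
CONNECTIVITY = {
    "tile_A": ["tile_B"],
    "tile_B": ["tile_C", "tile_D"],
    "tile_C": [],
    "tile_D": ["tile_E"],
    "tile_E": [],
}

def target_map_segments(current_tile: str, max_hops: int = 2) -> set[str]:
    """Select which navigational map segments to download: the tile containing
    the vehicle plus every tile reachable within max_hops via the connectivity
    graph."""
    selected = {current_tile}
    queue = deque([(current_tile, 0)])
    while queue:
        tile, hops = queue.popleft()
        if hops == max_hops:
            continue
        for neighbour in CONNECTIVITY.get(tile, []):
            if neighbour not in selected:
                selected.add(neighbour)
                queue.append((neighbour, hops + 1))
    return selected

print(target_map_segments("tile_A"))   # {'tile_A', 'tile_B', 'tile_C', 'tile_D'}
```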

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G01C 21/36 - Input/output arrangements for on-board computers

64.

SYSTEMS AND METHODS FOR PREDICTING BLIND SPOT INCURSIONS

      
Application Number 17642894
Status Pending
Filing Date 2020-09-17
First Publication Date 2023-01-05
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Elimaleh, Yaniv
  • Walter, Samantha

Abstract

Systems and methods are provided for predicting blind spot incursions for a host vehicle. In one implementation, a navigation system for a host vehicle may comprise a processor. The processor may be programmed to receive, from an image capture device located on a rear of the host vehicle, at least one image representative of an environment of the host vehicle. The processor may be programmed to analyze the at least one image to identify an object in the environment of the host vehicle and to determine kinematic information associated with the object. The processor may further be programmed to predict, based on the kinematic information, that the object will travel in a region outside of a field of view of the image capture device and perform a control action based on the prediction.
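
A one-dimensional sketch of the prediction step described above: given the object's position and closing speed estimated from the rear-camera images, estimate when it will leave the camera's field of view and flag an imminent blind-spot incursion. The field-of-view limit and prediction horizon are assumed values.

```python
def predict_blind_spot_incursion(rel_position_m: float,
                                 rel_speed_mps: float,
                                 fov_limit_m: float = 3.0,
                                 horizon_s: float = 3.0) -> bool:
    """Predict whether a trailing object will leave the rear camera's field of
    view (and thus enter a blind spot) within the prediction horizon.

    rel_position_m: distance of the object behind the host (positive = behind).
    rel_speed_mps: closing speed (positive = object approaching the host).
    fov_limit_m: distance below which the object is no longer visible to the
                 rear-facing camera (an assumed camera property).
    """
    if rel_speed_mps <= 0.0:
        return False                     # object is not closing in
    time_to_fov_exit = (rel_position_m - fov_limit_m) / rel_speed_mps
    return 0.0 <= time_to_fov_exit <= horizon_s

# A motorcycle 10 m behind, closing at 4 m/s, exits the rear FOV in ~1.75 s.
print(predict_blind_spot_incursion(rel_position_m=10.0, rel_speed_mps=4.0))   # True
print(predict_blind_spot_incursion(rel_position_m=40.0, rel_speed_mps=4.0))   # False (too far)
```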

IPC Classes  ?

  • G08G 1/16 - Anti-collision systems
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • B60W 40/04 - Traffic conditions
  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • B60W 30/18 - Propelling the vehicle
  • B60W 30/095 - Predicting travel path or likelihood of collision

65.

SYSTEMS AND METHODS FOR MONITORING TRAFFIC LANE CONGESTION

      
Application Number 17642897
Status Pending
Filing Date 2020-09-17
First Publication Date 2023-01-05
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor Walter, Samantha

Abstract

Systems and methods are provided for predicting blind spot incursions for a host vehicle. In one implementation, a navigation system for a host vehicle may comprise a processor. The processor may be programmed to receive, from an image capture device located on a rear of the host vehicle, at least one image representative of an environment of the host vehicle. The processor may be programmed to analyze the at least one image to identify an object in the environment of the host vehicle and to determine kinematic information associated with the object. The processor may further be programmed to predict, based on the kinematic information, that the object will travel in a region outside of a field of view of the image capture device and perform a control action based on the prediction.

IPC Classes  ?

  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G08G 1/065 - Traffic control systems for road vehicles by counting the vehicles in a section of the road or in a parking area, i.e. comparing incoming count with outgoing count
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

66.

DUAL SENSOR READOUT CHANNEL TO ALLOW FOR FREQUENCY DETECTION

      
Application Number 17898610
Status Pending
Filing Date 2022-08-30
First Publication Date 2023-01-05
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor Bowers, Gabriel

Abstract

The present disclosure relates to navigation and to systems and methods for using a dual sensor readout channel to allow for frequency detection. In one implementation, at least one processing device may receive a plurality of images acquired by a camera onboard a host vehicle, wherein the plurality of images are received via a first channel and via a second channel, and wherein the first channel is associated with a first frame capture rate, and the second channel is associated with a second frame capture rate different from the first frame capture rate. The processing device may use images received via the first channel to detect flickering and non-flickering light sources in an environment of the host vehicle; and provide, based on images received via the second channel, images for display on one or more human-viewable displays.

IPC Classes  ?

  • H04N 5/235 - Circuitry for compensating for variation in the brightness of the object
  • G01C 21/36 - Input/output arrangements for on-board computers
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

67.

SYSTEMS AND METHODS FOR MONITORING LANE MARK QUALITY

      
Application Number 17901040
Status Pending
Filing Date 2022-09-01
First Publication Date 2022-12-29
Owner MOBILEYE VISION TECHNOLOGIES, LTD. (Israel)
Inventor
  • Rabani, Meital
  • Ben-Hamo, Michal
  • Schwartz, Maxim
  • Rosenberg-Biton, Efrat
  • Shay, Yizhar
  • Rappel-Kroyzer, Or

Abstract

A host vehicle-based feature harvester is disclosed. In one implementation, the feature harvester includes memory and a processor configured to receive a plurality of images captured by a camera onboard the host vehicle, the plurality of images being representative of an environment of the host vehicle; analyze at least one image from the plurality of images to identify a representation of a lane mark; select at least one sample area of the representation of the lane mark, wherein the at least one sample area is associated with an image location of at least a portion of the representation of the lane mark; determine a location identifier of the at least one sample area; determine a surface quality indicator associated with the at least one sample area; and cause transmission of the location identifier and the surface quality indicator to an entity remotely-located relative to the host vehicle.

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06T 7/00 - Image analysis
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

68.

SYSTEMS AND METHODS FOR DETERMINING ROAD SAFETY

      
Application Number 17662523
Status Pending
Filing Date 2022-05-09
First Publication Date 2022-12-15
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Bolless, Eiran
  • Karavany, Ido
  • Neuhof, Bitya
  • Rappel-Kroyzer, Or
  • Shpigelman, Shahar
  • Ben-Ami, Hila
  • Aviad, Efrat

Abstract

A system for determining safety of a road segment may include at least one processor programmed to receive, from a first vehicle, first navigation information associated with the road segment. The first navigation information may include information collected by a first sensor of the first vehicle from an environment of the first vehicle. The at least one processor may also be programmed to receive, from a second vehicle, second navigation information associated with the road segment. The second navigation information may include information collected by a second sensor of the second vehicle from an environment of the second vehicle. The at least one processor may further be programmed to determine, based on the first navigation information and the second navigation information, a score representative of the safety of the road segment, and transmit, to a third vehicle, the score representative of the safety of the road segment.
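
One plausible way to fold multi-vehicle reports into a single segment score is sketched below; the event categories, the normalization by distance, and the 1/(1+x) mapping are invented for illustration only.

    def road_safety_score(reports):
        """Combine per-vehicle navigation reports for a road segment into one score
        in [0, 1], higher meaning safer. Each report carries counts of harsh events
        and the distance driven on the segment (km)."""
        events = sum(r["harsh_braking"] + r["harsh_steering"] + r["near_collisions"]
                     for r in reports)
        km = sum(r["distance_km"] for r in reports) or 1.0
        return 1.0 / (1.0 + events / km)   # decreases monotonically with event rate

    score = road_safety_score([
        {"harsh_braking": 2, "harsh_steering": 1, "near_collisions": 0, "distance_km": 5.0},
        {"harsh_braking": 0, "harsh_steering": 0, "near_collisions": 1, "distance_km": 4.0},
    ])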

IPC Classes  ?

69.

CROWD-SOURCED 3D POINTS AND POINT CLOUD ALIGNMENT

      
Application Number 17824264
Status Pending
Filing Date 2022-05-25
First Publication Date 2022-12-01
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Wachtel, Gideon
  • Taieb, Yoav
  • Guberman, Yahel
  • Fridman, Ofer
  • Cohen Maslaton, Raz
  • Shenfeld, Moshe
  • Segel, Ori
  • Springer, Ofer

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a host vehicle-based sparse map feature harvester system may include at least one processor programmed to receive a plurality of images captured by a camera onboard the host vehicle as the host vehicle travels along a road segment in a first direction, wherein the plurality of images are representative of an environment of the host vehicle; detect one or more semantic features represented in one or more of the plurality of images, the one or more semantic features each being associated with a predetermined object type classification; identify at least one position descriptor associated with each of the detected one or more semantic features; identify three-dimensional feature points associated with one or more detected objects represented in at least one of the plurality of images; receive position information, for each of the plurality of images, wherein the position information is indicative of a position of the camera when each of the plurality of images was captured; and cause transmission of drive information for the road segment to an entity remotely-located relative to the host vehicle, wherein the drive information includes the identified at least one position descriptor associated with each of the detected one or more semantic features, the identified three-dimensional feature points, and the position information.
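
The drive information described above is essentially a structured payload. The sketch below shows one possible shape for it; the class and field names are hypothetical and chosen only to mirror the elements listed in the abstract.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class SemanticFeature:
        object_type: str                              # e.g. "speed_limit_sign"
        position_descriptor: Tuple[float, float]      # e.g. image coordinates

    @dataclass
    class DriveInformation:
        """Per-drive payload sent to the remotely located map-building entity."""
        road_segment_id: str
        semantic_features: List[SemanticFeature] = field(default_factory=list)
        feature_points_3d: List[Tuple[float, float, float]] = field(default_factory=list)
        camera_positions: List[Tuple[float, float, float]] = field(default_factory=list)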

IPC Classes  ?

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G01S 19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G08G 1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
  • B60R 1/22 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

70.

BLINKING TRAFFIC LIGHT DETECTION

      
Application Number 17885300
Status Pending
Filing Date 2022-08-10
First Publication Date 2022-12-01
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Oren, Gal
  • Levy, Sagi

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a system for navigating a vehicle may include at least one processor configured to receive a first image frame, detect in the first image frame a representation of a traffic light, and determine a color state associated with lamps included on the traffic light. The at least one processor may receive an additional image frame that includes a representation of the at least one traffic light, and determine, based on a comparison of the first image frame and the additional image frame, whether the at least one traffic light includes a blinking lamp. If the at least one traffic light includes a blinking lamp, the processor may cause the vehicle to implement a navigational action relative to the traffic light in accordance with the determination and also based on a detected color state for the blinking lamp.
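
A blinking lamp can be distinguished from a steady one by how often its tracked state toggles across frames. The sketch below is a minimal illustration of that comparison step; the function name and toggle threshold are assumptions.

    def is_blinking(lamp_states, min_toggles=2):
        """Decide whether a tracked lamp is blinking from its per-frame on/off
        states (True = lit): a steady lamp never toggles, a blinking lamp
        alternates repeatedly over the observation window."""
        toggles = sum(1 for a, b in zip(lamp_states, lamp_states[1:]) if a != b)
        return toggles >= min_toggles

    print(is_blinking([True, False, True, False, True, False]))   # True
    print(is_blinking([True, True, True, True, True, True]))      # False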

IPC Classes  ?

  • G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

71.

TRAFFIC LIGHT RELEVANCY

      
Application Number 17885350
Status Pending
Filing Date 2022-08-10
First Publication Date 2022-12-01
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Huberman, David
  • Barlev, Jonathan

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a system for navigating a vehicle may include at least one processor configured to receive a first image frame, detect in the first image frame a representation of a traffic light, and determine a color state associated with lamps included on the traffic light. The at least one processor may receive an additional image frame that includes a representation of the at least one traffic light, and determine, based on a comparison of the first image frame and the additional image frame, whether the at least one traffic light includes a blinking lamp. If the at least one traffic light includes a blinking lamp, the processor may cause the vehicle to implement a navigational action relative to the traffic light in accordance with the determination and also based on a detected color state for the blinking lamp.

IPC Classes  ?

  • G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

72.

SYSTEMS AND METHODS FOR MANAGING STORAGE SPACE

      
Application Number 17755064
Status Pending
Filing Date 2020-10-23
First Publication Date 2022-12-01
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Barros, Yanor
  • Kronstein, Dror
  • Zhalkin, Vladislav

Abstract

A system for managing storage space for a computer may include at least one processor programmed to determine a maximum data space for a computing task. The at least one processor may also be programmed to create a file having a maximum size equal to or greater than the maximum data space. The at least one processor may further be programmed to create a virtual device linked to the file and mount a filesystem inside the virtual device. The at least one processor may also be programmed to mount the virtual device. The at least one processor may further be programmed to determine that the computing task is completed. The at least one processor may further be programmed to unmount the virtual device.
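
On Linux, the file-backed virtual device described above maps naturally onto a loop device. The sketch below shows that workflow with standard tools (losetup, mkfs.ext4, mount); it requires root privileges, and the function names and the choice of ext4 are assumptions, not the claimed implementation.

    import subprocess

    def run(*cmd):
        subprocess.run(cmd, check=True)

    def provision_task_storage(backing_file, size_bytes, mount_point):
        """Create a fixed-size backing file, attach it to a loop (virtual) device,
        create a filesystem inside it, and mount it for the computing task."""
        with open(backing_file, "wb") as f:
            f.truncate(size_bytes)            # hard upper bound = maximum data space
        loop_dev = subprocess.run(["losetup", "--find", "--show", backing_file],
                                  check=True, capture_output=True,
                                  text=True).stdout.strip()
        run("mkfs.ext4", "-q", loop_dev)
        run("mount", loop_dev, mount_point)
        return loop_dev

    def release_task_storage(loop_dev, mount_point):
        """Unmount and detach once the computing task has completed."""
        run("umount", mount_point)
        run("losetup", "--detach", loop_dev)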

IPC Classes  ?

  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
  • G06F 16/188 - Virtual file systems

73.

TRAFFIC SIGN RELEVANCY

      
Application Number 17885373
Status Pending
Filing Date 2022-08-10
First Publication Date 2022-12-01
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Heilbron, Adina
  • Wolff, Daniel
  • Shaashua, Barak
  • Huberman, David
  • Barlev, Jonathan
  • Gdalyahu, Yoram

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a system for navigating a vehicle may include at least one processor configured to receive a first image frame, detect in the first image frame a representation of a traffic light, and determine a color state associated with lamps included on the traffic light. The at least one processor may receive an additional image frame that includes a representation of the at least one traffic light, and determine, based on a comparison of the first image frame and the additional image frame, whether the at least one traffic light includes a blinking lamp. If the at least one traffic light includes a blinking lamp, the processor may cause the vehicle to implement a navigational action relative to the traffic light in accordance with the determination and also based on a detected color state for the blinking lamp.

IPC Classes  ?

  • G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

74.

ON THE FLY CONFIGURATION OF A PROCESSING CIRCUIT

      
Application Number 17747428
Status Pending
Filing Date 2022-05-18
First Publication Date 2022-11-24
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Srebnik, Daniel

Abstract

A method for on-the-fly updating of a processing circuit, the method includes monitoring, by multiple coroutines and during a monitoring period, a progress of multiple suspend-update-resume sequences executed by the processing circuit, wherein at least some of the multiple suspend-update-resume sequences partially overlap and are not mutually synchronized, and wherein each suspend-update-resume sequence comprises on-the-fly updates; and determining, by a merged coroutine, timings of the multiple suspend-update-resume sequences, wherein the determining comprises performing multiple calculation iterations, wherein a calculation iteration of the multiple calculation iterations comprises calculating, in an iterative manner, a timing of a next suspend-update-resume sequence to be executed out of the multiple suspend-update-resume sequences, and wherein the calculating is responsive to timing offsets between different suspend-update-resume sequences.

IPC Classes  ?

  • G06F 9/448 - Execution paradigms, e.g. implementations of programming paradigms
  • G06F 9/38 - Concurrent instruction execution, e.g. pipeline, look ahead
  • G06F 11/30 - Monitoring

75.

FAST CONFIGURATION OF A PROCESSING CIRCUIT

      
Application Number 17747452
Status Pending
Filing Date 2022-05-18
First Publication Date 2022-11-24
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Srebnik, Daniel
  • Dubinski, Yan

Abstract

A method for configuring a processing circuit, the method may include (i) receiving, by the processing circuit, a compressed configuration information data structure (CCDS) that comprises multiple segments, wherein the CCDS was generated by a size-preserving compression process that maintains a size of the segments; (ii) decompressing the CCDS, by the processing circuit, to provide decompressed configuration information, wherein the decompressing comprises: searching for headers, wherein a header comprises sequence parameters, wherein the sequence parameters comprise at least one out of a length, an address field, and a type; and (iii) configuring the processing circuit using the decompressed configuration information, wherein the configuring is executed based on the headers.
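
The abstract does not specify the on-the-wire layout, so the sketch below assumes a hypothetical header format (a 16-bit marker followed by length, address, and type fields) purely to illustrate the "search for headers, then configure" step.

    import struct

    HEADER_MAGIC = 0xC0DE          # assumed marker value; not part of the disclosure
    HEADER_FMT = "<HHIB"           # magic, length, address, type (little endian)
    HEADER_SIZE = struct.calcsize(HEADER_FMT)

    def parse_headers(blob):
        """Scan a decompressed configuration blob and yield (length, address, type,
        payload) for every header found."""
        offset = 0
        while offset + HEADER_SIZE <= len(blob):
            magic, length, address, seg_type = struct.unpack_from(HEADER_FMT, blob, offset)
            if magic != HEADER_MAGIC:
                offset += 1        # not a header; keep searching byte by byte
                continue
            payload = blob[offset + HEADER_SIZE: offset + HEADER_SIZE + length]
            yield length, address, seg_type, payload
            offset += HEADER_SIZE + length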

IPC Classes  ?

76.

Systems and Methods for Selectively Decelerating a Vehicle

      
Application Number 17805981
Status Pending
Filing Date 2022-06-08
First Publication Date 2022-11-24
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Kario, Jack
  • Navon, Michael
  • Aloni, David

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a system for a host vehicle includes at least one processor programmed to determine, based on an output of at least one sensor of the host vehicle, one or more target dynamics of a target vehicle; determine, based on one or more host dynamics of the host vehicle and the target dynamics, a time to collision; determine, based on the time to collision and the host dynamics, a host deceleration for the host vehicle to avoid the collision; determine, based on the time to collision and the target dynamics, a target deceleration for the target vehicle to avoid the collision; determine, based on the host deceleration and the target deceleration, a host deceleration threshold; and determine, based on a speed of the host vehicle and the host deceleration threshold, to brake the host vehicle.
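
Under a constant-speed, constant-deceleration model the quantities in the abstract reduce to a few formulas; the sketch below works them through. The threshold value and function names are illustrative assumptions.

    def time_to_collision(gap_m, host_speed_mps, target_speed_mps):
        """Constant-speed time to collision; infinite when the gap is opening."""
        closing = host_speed_mps - target_speed_mps
        return gap_m / closing if closing > 0 else float("inf")

    def required_host_deceleration(gap_m, host_speed_mps, target_speed_mps):
        """Minimum constant host deceleration (m/s^2) that sheds the closing speed
        before the relative displacement consumes the gap, assuming the target
        holds its speed:  a >= closing_speed^2 / (2 * gap)."""
        closing = host_speed_mps - target_speed_mps
        return 0.0 if closing <= 0 else closing ** 2 / (2.0 * gap_m)

    def should_brake(gap_m, host_speed_mps, target_speed_mps, decel_threshold=3.5):
        """Brake once avoiding the collision would demand more deceleration than
        the chosen comfort threshold."""
        return required_host_deceleration(gap_m, host_speed_mps,
                                          target_speed_mps) >= decel_threshold

    # Example: closing at 10 m/s on a 20 m gap -> TTC = 2 s, needed decel = 2.5 m/s^2.
    print(time_to_collision(20.0, 25.0, 15.0),
          required_host_deceleration(20.0, 25.0, 15.0))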

IPC Classes  ?

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 40/04 - Traffic conditions
  • B60W 40/105 - Speed
  • B60W 40/12 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to parameters of the vehicle itself
  • B60W 50/16 - Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems

77.

TRANSPOSED CONVOLUTION ON DOWNSAMPLED DATA

      
Application Number 17739814
Status Pending
Filing Date 2022-05-09
First Publication Date 2022-11-17
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Fais, Yaniv
  • Binenfeld, Ariel
  • Levy, Liran

Abstract

The present subject matter provides technical solutions to technical problems associated with preserving the spatial dimensions of images used in CNNs. Transposed convolutional layers may be used to provide improved spatial dimension preservation or reconstruction. In contrast with image interpolation, transposed convolutional layers may use a set of weights to reconstruct input images. When CNNs are used for ADAS and AV applications, the transposed convolutional layers may be trained jointly with the convolutional layers during the CNN training process. This provides the ability to use a lower-dimensional representation of input images while preserving the spatial dimensions of the images for use in ADAS and AV systems.
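
A minimal PyTorch sketch of the idea: a strided convolution downsamples the input and a jointly trained transposed convolution restores the original spatial size. The layer sizes are arbitrary and the module name is hypothetical.

    import torch
    import torch.nn as nn

    class DownUp(nn.Module):
        """Downsample with a strided convolution, then restore the spatial size with
        a transposed convolution whose weights are learned jointly with the rest of
        the network (unlike fixed bilinear interpolation)."""
        def __init__(self, channels=16):
            super().__init__()
            self.down = nn.Conv2d(3, channels, kernel_size=3, stride=2, padding=1)
            self.up = nn.ConvTranspose2d(channels, 3, kernel_size=4, stride=2, padding=1)

        def forward(self, x):
            return self.up(torch.relu(self.down(x)))

    x = torch.randn(1, 3, 128, 256)       # a downscaled road image
    y = DownUp()(x)
    print(x.shape, y.shape)               # spatial dimensions match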

IPC Classes  ?

  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06N 3/04 - Architecture, e.g. interconnection topology

78.

APPLYING A CONVOLUTION KERNEL ON INPUT DATA

      
Application Number 17739834
Status Pending
Filing Date 2022-05-09
First Publication Date 2022-11-17
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Weisel, Orly
  • Fais, Yaniv
  • Hirsch, Shira
  • Srebnik, Daniel

Abstract

A method for neural network convolution, the method may include receiving input data that is 3D input data and comprises input data segments associated with different input data depth values; receiving a convolution kernel that is a 3D convolution kernel and comprises kernel segments associated with different kernel depth values; and performing multiple 3D convolution iterations, wherein each 3D convolution iteration comprises: determining whether the 3D convolution iteration is of a first type or of a second type; executing the 3D convolution iteration of the first type when determining that the 3D convolution iteration is of the first type; and executing the 3D convolution iteration of the second type when determining that the 3D convolution iteration is of the second type.

IPC Classes  ?

79.

Vehicle environment modeling with a camera

      
Application Number 17841937
Grant Number 11568653
Status In Force
Filing Date 2022-06-16
First Publication Date 2022-11-17
Grant Date 2023-01-31
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Stein, Gideon
  • Blumenthal, Itay
  • Moskowitz, Jeffrey
  • Shaag, Nadav
  • Carlebach, Natalie

Abstract

System and techniques for vehicle environment modeling with a camera are described herein. A device for modeling an environment comprises: a hardware sensor interface to obtain a sequence of unrectified images representative of a road environment, the sequence of unrectified images including a first unrectified image, a previous unrectified image, and a previous-previous unrectified image; and processing circuitry to: provide the first unrectified image, the previous unrectified image, and the previous-previous unrectified image to an artificial neural network (ANN) to produce a three-dimensional structure of a scene; determine a selected homography; and apply the selected homography to the three-dimensional structure of the scene to create a model of the road environment.

IPC Classes  ?

  • G06T 7/00 - Image analysis
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • B60W 40/06 - Road conditions
  • G06T 7/50 - Depth or shape recovery

80.

TEMPORARY RULE SUSPENSION FOR AUTONOMOUS NAVIGATION

      
Application Number 17861533
Status Pending
Filing Date 2022-07-11
First Publication Date 2022-11-17
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shalev-Shwartz, Shai
  • Shashua, Amnon
  • Shammah, Shaked

Abstract

A navigation system for a host vehicle is provided. The system may comprise at least one processing device comprising circuitry and a memory. The memory includes instructions that when executed by the circuitry cause the at least one processing device to: receive a plurality of images acquired by a camera, the plurality of images being representative of an environment of the host vehicle; analyze the plurality of images to identify a presence, in the environment of the host vehicle, of a navigation rule suspension condition; temporarily suspend at least one navigational rule in response to identification of the navigation rule suspension condition; and cause at least one navigational change of the host vehicle unconstrained by the temporarily suspended at least one navigational rule.

IPC Classes  ?

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G05D 1/02 - Control of position or course in two dimensions
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G08G 1/0968 - Systems involving transmission of navigation instructions to the vehicle
  • B60W 30/095 - Predicting travel path or likelihood of collision
  • B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
  • B60W 30/18 - Propelling the vehicle
  • B60W 50/10 - Interpretation of driver requests or demands
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

81.

Anonymous collection of data from a group of entitled members

      
Application Number 17878363
Grant Number 11888826
Status In Force
Filing Date 2022-08-01
First Publication Date 2022-11-17
Grant Date 2024-01-30
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Kipnis, Aviad

Abstract

A method for collecting data from a group of entitled members. The method may include receiving, by a collection unit, a message and a message signature; validating, by the collection unit, whether the message was received from any of the entitled members of the group, without identifying the entitled member that sent the message; wherein the validating comprises applying a second plurality of mathematical operations on a first group of secrets, a second group of secrets, and a first part of the message signature; and rejecting, by the collection unit, the message when validating that the message was not received from any entitled member of the group.

IPC Classes  ?

  • H04L 9/40 - Network security protocols
  • H04L 9/08 - Key distribution
  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system

82.

CROSS FIELD OF VIEW FOR AUTONOMOUS VEHICLE SYSTEMS

      
Application Number 17811735
Status Pending
Filing Date 2022-07-11
First Publication Date 2022-11-10
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Stein, Gideon
  • Eytan, Ori
  • Belman, Efim
  • Katiee, Moshe

Abstract

An imaging system is provided for a vehicle. In one implementation, the imaging system includes an imaging module, a first camera coupled to the imaging module, a second camera coupled to the imaging module, and a mounting assembly configured to attach the imaging module to the vehicle such that the first and second cameras face outward with respect to the vehicle. The first camera has a first field of view and a first optical axis, and the second camera has a second field of view and a second optical axis. The first optical axis crosses the second optical axis in at least one crossing point of a crossing plane. The first camera is focused a first horizontal distance beyond the crossing point of the crossing plane and the second camera is focused a second horizontal distance beyond the crossing point of the crossing plane.

IPC Classes  ?

83.

MULTI-FRAME IMAGE SEGMENTATION

      
Application Number 17731794
Status Pending
Filing Date 2022-04-28
First Publication Date 2022-11-03
Owner MOBILEYE VISION TECHNOLOGIES, LTD. (Israel)
Inventor
  • Bar Zvi, Asaf
  • Zabari, Nir

Abstract

Systems and methods for identifying objects in an environment of a host vehicle are disclosed. In one implementation, a system includes a processor configured to receive images representative of the environment of the host vehicle; assign first pixel descriptor values to a plurality of pixels associated with a first image and second pixel descriptor values to a plurality of pixels associated with a second image; identify object representations in the first image and the second image based on the first pixel descriptor values and the second pixel descriptor values, respectively; determine a first object descriptor and a second object descriptor based on the first pixel descriptor values and the second pixel descriptor values, respectively; and based on a comparison of the first object descriptor and the second object descriptor, output an indication that the object representations in the first image and the second image represent a common object.
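
The per-object comparison can be pictured as aggregating pixel descriptors over each object mask and measuring descriptor similarity; the aggregation by mean and the cosine threshold below are assumptions for illustration.

    import numpy as np

    def object_descriptor(pixel_descriptors, mask):
        """Aggregate per-pixel descriptors (H x W x D) over a boolean object mask
        (H x W) into a single object descriptor vector."""
        return pixel_descriptors[mask].mean(axis=0)

    def same_object(desc_a, desc_b, threshold=0.9):
        """Treat two object representations from different frames as the same
        object when their descriptors are sufficiently similar (cosine)."""
        cos = np.dot(desc_a, desc_b) / (np.linalg.norm(desc_a) * np.linalg.norm(desc_b))
        return cos >= threshold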

IPC Classes  ?

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
  • G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

84.

QUANTITATIVE DRIVING EVALUATION AND VEHICLE SAFETY RESTRICTIONS

      
Application Number 17762761
Status Pending
Filing Date 2019-12-27
First Publication Date 2022-10-27
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Alvarez, Ignacio J.
  • Buerkle, Cornelius
  • Ji, Haitao
  • Li, Fei
  • Oboril, Fabian
  • Quast, Johannes
  • Scholl, Kay-Ulrich
  • Weast, John Charles
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Zhang, Zhiyuan
  • Zhu, Qianying

Abstract

One or more processors may be configured to determine one or more prospective routes of a first (ego) vehicle being at least partially controlled by a human driver; receive first sensor data representing one or more attributes of a second vehicle; determine a danger probability of the one or more prospective routes of the first vehicle using at least the one or more attributes of the second vehicle from the first sensor data; and, if each of the one or more prospective routes of the first vehicle has a danger probability outside of a predetermined range, send a signal representing a safety intervention. Whenever a safety intervention signal is sent, the one or more processors may be configured to increment or decrement a counter.

IPC Classes  ?

  • G08G 1/0968 - Systems involving transmission of navigation instructions to the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G08G 1/01 - Detecting movement of traffic to be counted or controlled
  • G08G 1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages

85.

LIDAR and rem localization

      
Application Number 17809632
Grant Number 11573090
Status In Force
Filing Date 2022-06-29
First Publication Date 2022-10-20
Grant Date 2023-02-07
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Rosenblum, Kevin
  • Ziv, Alon
  • Dagan, Erez

Abstract

A navigation system for a host vehicle may include a processor programmed to: receive, from an entity remotely located relative to the host vehicle, a sparse map associated with at least one road segment to be traversed by the host vehicle; receive point cloud information from a LIDAR system onboard the host vehicle, the point cloud information being representative of distances to various objects in an environment of the host vehicle; compare the received point cloud information with at least one of the plurality of mapped navigational landmarks in the sparse map to provide a LIDAR-based localization of the host vehicle relative to at least one target trajectory; determine a navigational action for the host vehicle based on the LIDAR-based localization of the host vehicle relative to the at least one target trajectory; and cause the at least one navigational action to be taken by the host vehicle.

IPC Classes  ?

  • G01C 21/30 - Map- or contour-matching
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

86.

Navigation based on partially occluded pedestrians

      
Application Number 17635599
Grant Number 11680801
Status In Force
Filing Date 2020-12-31
First Publication Date 2022-10-20
Grant Date 2023-06-20
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Benou, Ariel
  • Aloni, David
  • Kirzhner, Dmitry
  • Kario, Jack

Abstract

Systems and methods are provided for navigating a host vehicle. In an embodiment, a processing device may be configured to receive a captured image acquired by a camera onboard the host vehicle; provide the captured image to an analysis module configured to generate an output including an indicator of a contact position of the occluded pedestrian with the ground surface, the analysis module including a trained model trained based on a plurality of training images having been modified to occlude a region where a training pedestrian contacts a training ground surface; receive from the analysis module the generated output, including the indicator of the contact position of the occluded pedestrian with the ground surface; and cause at least one navigational action by the host vehicle based on the indicator of the contact position of the occluded pedestrian with the ground surface.
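
The training-set modification can be sketched as masking the lower part of each training pedestrian while keeping the original ground-contact point as the label; the fraction occluded and the helper name are illustrative assumptions.

    import numpy as np

    def occlude_ground_contact(image, bbox, occlude_fraction=0.3):
        """Blank out the bottom of a training pedestrian's bounding box so a model
        must infer the ground-contact position from the visible upper body.
        `bbox` is (x0, y0, x1, y1) in pixels; the label remains the original
        contact point (x_center, y1)."""
        x0, y0, x1, y1 = map(int, bbox)
        cut = int(y1 - occlude_fraction * (y1 - y0))
        occluded = image.copy()
        occluded[cut:y1, x0:x1] = 0               # simulate an occluding object
        label = ((x0 + x1) / 2.0, float(y1))      # ground-contact position
        return occluded, label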

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G06T 7/50 - Depth or shape recovery
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

87.

EGO MOTION CORRECTION OF LIDAR OUTPUT

      
Application Number 17809589
Status Pending
Filing Date 2022-06-29
First Publication Date 2022-10-20
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Rosenblum, Kevin
  • Dagan, Erez
  • Boublil, David
  • Shaag, Nadav
  • Neuhof, David
  • Moskowitz, Jeffrey
  • Topel, Gal

Abstract

A navigation system for a host vehicle may include a processor programmed to determine at least one indicator of ego motion of the host vehicle. A processor may be also programmed to receive, from a LIDAR system, a first point cloud including a first representation of at least a portion of an object and a second point cloud including a second representation of the at least a portion of the object. The processor may further be programmed to determine a velocity of the object based on the at least one indicator of ego motion of the host vehicle, and based on a comparison of the first point cloud, including the first representation of the at least a portion of the object, and the second point cloud, including the second representation of the at least a portion of the object.
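
Ignoring rotation over the short interval between sweeps, the velocity estimate amounts to differencing the object's position after adding back the host's own motion; the function name and the centroid-based representation are assumptions.

    import numpy as np

    def object_velocity(centroid_t0, centroid_t1, ego_translation, dt):
        """Ground-frame object velocity from two LIDAR detections taken dt seconds
        apart. Centroids are expressed in the host frame at each time, so the
        host's translation over the interval (expressed in the earlier frame) is
        added back before differencing; host rotation is neglected here."""
        p0 = np.asarray(centroid_t0, dtype=float)
        p1 = np.asarray(centroid_t1, dtype=float) + np.asarray(ego_translation, dtype=float)
        return (p1 - p0) / dt

    # Example: the host advanced 1.5 m between sweeps (dt = 0.1 s). An object that
    # stays fixed in the host frame is in fact moving at the host's speed (15 m/s).
    print(object_velocity([10.0, 0.0, 0.0], [10.0, 0.0, 0.0], [1.5, 0.0, 0.0], 0.1))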

IPC Classes  ?

  • G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

88.

Pseudo lidar

      
Application Number 17809641
Grant Number 11734848
Status In Force
Filing Date 2022-06-29
First Publication Date 2022-10-13
Grant Date 2023-08-22
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Springer, Ofer
  • Neuhof, David
  • Moskowitz, Jeffrey
  • Topel, Gal
  • Shaag, Nadav
  • Stern, Yotam
  • Lotan, Roy
  • Harouche, Shahar
  • Einy, Daniel

Abstract

A navigation system for a host vehicle may include a processor programmed to: receive from a center camera onboard the host vehicle a captured center image including a representation of at least a portion of an environment of the host vehicle, receive from a left surround camera onboard the host vehicle a captured left surround image including a representation of at least a portion of the environment of the host vehicle, and receive from a right surround camera onboard the host vehicle a captured right surround image including a representation of at least a portion of the environment of the host vehicle; provide the center image, the left surround image, and the right surround image to an analysis module configured to generate an output relative to the at least one captured center image; and cause a navigational action by the host vehicle based on the generated output.

IPC Classes  ?

  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G01C 21/30 - Map- or contour-matching
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G01B 11/22 - Measuring arrangements characterised by the use of optical techniques for measuring depth
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

89.

DRIVING SAFETY SYSTEMS

      
Application Number 17642739
Status Pending
Filing Date 2019-12-27
First Publication Date 2022-10-13
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Alvarez, Ignacio J.
  • Geissler, Florian
  • Ji, Haitao
  • Li, Fei
  • Oboril, Fabian
  • Rosales, Rafael
  • Scholl, Kay-Ulrich
  • Wu, Xiangbin
  • Zhang, Xinxin
  • Zhang, Zhiyuan
  • Zhu, Qianying

Abstract

A safety system (200) for a vehicle (100) is provided. The safety system (200) may include one or more processors (102). The one or more processors (102) may be configured to control a vehicle (100) to operate in accordance with the predefined stored driving model parameters, to detect vehicle operation data during the operation of the vehicle (100), to determine whether to change predefined driving model parameters based on the detected vehicle operation data and the driving model parameters, to change the driving model parameters to changed driving model parameters, and to control the vehicle (100) to operate in accordance with the changed driving model parameters.

IPC Classes  ?

  • B60W 50/00 - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

90.

LIDAR-Camera Fusion Where LIDAR and Camera Validly See Different Things

      
Application Number 17809612
Status Pending
Filing Date 2022-06-29
First Publication Date 2022-10-13
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Blau, Chaim
  • Springer, Ofer
  • Rosenblum, Kevin
  • Ziv, Alon
  • Dagan, Erez
  • Boublil, David
  • Shaag, Nadav
  • Neuhof, David
  • Moskowitz, Jeffrey
  • Topel, Gal
  • Stern, Yotam

Abstract

A navigation system for a host vehicle may include a processor programmed to: receive from a camera onboard the host vehicle at least one captured image representative of an environment of the host vehicle, wherein the camera is positioned at a first location relative to the host vehicle; receive point cloud information from a LIDAR system onboard the host vehicle, wherein the LIDAR system is positioned at a second location relative to the host vehicle; analyze the at least one captured image and the received point cloud information to detect one or more objects in the shared field of view region; determine whether a vantage point difference between the first location of the camera and the second location of the LIDAR system accounts for the one or more detected objects being represented in only one of the at least one captured image or the received point cloud information.

IPC Classes  ?

  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G01C 21/30 - Map- or contour-matching
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
  • B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems

91.

SYSTEMS AND METHODS FOR DETECTING TRAFFIC LIGHTS

      
Application Number 17848124
Status Pending
Filing Date 2022-06-23
First Publication Date 2022-10-13
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Belman, Efim
  • Bowers, Gabriel

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a navigation system for a host vehicle may comprise at least one processor. The processor may be programmed to receive from a first camera at least a first captured image representative of an environment of the host vehicle. The processor may be programmed to receive from a second camera at least a second captured image representative of the environment of the host vehicle. Both the first captured image and the second captured image include a representation of a traffic light, and the second camera is configured to operate in a primary mode in which at least one operational parameter of the second camera is tuned to detect at least one feature of the traffic light. The processor may be further programmed to cause at least one navigational action by the vehicle based on analysis of the representation of the traffic light.

IPC Classes  ?

  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 10/88 - Image or video recognition using optical means, e.g. reference filters, holographic masks, frequency domain filters or spatial domain filters
  • G06V 10/143 - Sensing or illuminating at different wavelengths
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles

92.

SYSTEMS AND METHODS FOR DETECTING LOW-HEIGHT OBJECTS IN A ROADWAY

      
Application Number 17806976
Status Pending
Filing Date 2022-06-15
First Publication Date 2022-10-06
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Stein, Gideon
  • Shashua, Amnon

Abstract

Systems and methods use cameras to provide autonomous navigation features. In one implementation, a driver-assist object detection system is provided for a vehicle. One or more processing devices associated with the system receive at least two images from a plurality of captured images via a data interface. The device(s) analyze the first image and at least a second image to determine a reference plane corresponding to the roadway the vehicle is traveling on. The processing device(s) locate a target object in the first two images, and determine a difference in a size of at least one dimension of the target object between the two images. The system may use the difference in size to determine a height of the object. Further, the system may cause a change in at least a directional course of the vehicle if the determined height exceeds a predetermined threshold.
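
The size-change reasoning follows from similar triangles: the image width of a static object scales inversely with range, so the scale change over a known travel distance gives the range, and the range converts image height to physical height. The sketch below works one such case through; the function name and the pinhole-camera simplification are assumptions.

    def object_height_from_scaling(w1_px, w2_px, h2_px, travel_m, focal_px):
        """Estimate a static object's physical height from how its image size grows
        as the vehicle approaches: w1_px/w2_px are image widths in the earlier and
        later frames, h2_px the image height in the later frame, travel_m the
        distance driven between frames, focal_px the focal length in pixels."""
        r = w2_px / w1_px                       # scale change; > 1 when closing in
        if r <= 1.0:
            raise ValueError("object is not getting closer")
        range_before = r * travel_m / (r - 1.0)   # range at the earlier frame
        range_now = range_before - travel_m       # range at the later frame
        return h2_px * range_now / focal_px

    # A 20 px wide object grows to 25 px after driving 5 m with a 1000 px focal
    # length: range drops from 25 m to 20 m, so a 10 px image height is ~0.2 m.
    print(object_height_from_scaling(20.0, 25.0, 10.0, 5.0, 1000.0))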

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
  • B60W 30/18 - Propelling the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • B60T 7/12 - Brake-action initiating means for initiation not subject to will of driver or passenger
  • B62D 6/00 - Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
  • B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
  • B60K 31/00 - Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operat
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • B60W 30/14 - Cruise control
  • G01S 19/42 - Determining position
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • G08G 1/095 - Traffic lights
  • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume

93.

Adversarial Approach to Usage of Lidar Supervision to Image Depth Estimation

      
Application Number 17698344
Status Pending
Filing Date 2022-03-18
First Publication Date 2022-09-29
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Shaag, Nadav

Abstract

Techniques are disclosed for improving upon the usage of Light Detection and Ranging (LIDAR) supervision to perform image depth estimation. The techniques use a generator and adversary network to generate respective models that “compete” against one another to enable the generator model to output a desired output image that compensates for a LIDAR image having a structured or lined data pattern. The techniques described herein may be suitable for use by vehicles and/or other agents operating in a particular environment as part of machine vision algorithms that are implemented to perform autonomous and/or semi-autonomous functions.

IPC Classes  ?

  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06N 3/08 - Learning methods
  • G01B 11/22 - Measuring arrangements characterised by the use of optical techniques for measuring depth
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

94.

CHIMNEY AND FLANGE DESIGN FOR CAMERA MODULE

      
Application Number 17617186
Status Pending
Filing Date 2020-06-23
First Publication Date 2022-09-22
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor Eytan, Ori

Abstract

A camera module assembly can include a chimney and a lens assembly. The chimney can include a distal portion having a substantially spherical profile. The lens assembly can include a lens barrel, an optical device, and a flange extending radially from the lens barrel, where the flange can be securable to the distal portion of the chimney.

IPC Classes  ?

  • A61B 1/05 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
  • G02B 7/00 - Mountings, adjusting means, or light-tight connections, for optical elements

95.

Priority based management of access to shared resources

      
Application Number 17680871
Grant Number 11868801
Status In Force
Filing Date 2022-02-25
First Publication Date 2022-09-15
Grant Date 2024-01-09
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Shulman, Boris
  • Richter, Itamar
  • Keret, Galit

Abstract

A system, computer readable medium and a method that may include performing multiple iterations of: determining, by each active initiator of the multiple initiators, a number of pending access requests generated by the active initiator, wherein each access request is a request to access a shared resource out of the shared resources; determining, by each active initiator, a priority level to be assigned to all pending access requests generated by the active initiator, wherein the determining is based on the number of pending access requests generated by the active initiator, a number of active initiators out of the multiple initiators, and a number of access requests serviceable by the shared resource; for each active initiator, informing an arbitration hardware of a network on chip about the priority level to be assigned to all pending access requests generated by the active initiator; and managing access to the shared resources, by the arbitration hardware, based on the priority level to be assigned to all pending access requests generated by each active initiator.
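
One simple way to realize "priority from backlog, initiator count, and service capacity" is to compare each initiator's backlog against its fair share of the resource's capacity; the formula and level count below are invented for illustration.

    def request_priority(pending, active_initiators, serviceable, levels=4):
        """Toy priority for all pending requests of one initiator: the further its
        backlog exceeds a fair share of the shared resource's service capacity,
        the higher the priority (0 = lowest, levels-1 = highest)."""
        fair_share = max(serviceable / max(active_initiators, 1), 1.0)
        pressure = pending / fair_share
        return min(int(pressure), levels - 1)

    # 4 active initiators share a resource that can service 8 requests at a time;
    # an initiator with 6 pending requests sits at three times its fair share.
    print(request_priority(pending=6, active_initiators=4, serviceable=8))   # -> 3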

IPC Classes  ?

  • G06F 9/46 - Multiprogramming arrangements
  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G06N 3/08 - Learning methods

96.

SIGN BACKSIDE MAPPING AND NAVIGATION

      
Application Number 17824355
Status Pending
Filing Date 2022-05-25
First Publication Date 2022-09-15
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Shenfeld, Moshe
  • Wachtel, Gideon
  • Segel, Ori

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a host vehicle-based sparse map feature harvester system may include at least one processor programmed to receive a first image captured by a forward-facing camera and a second image captured by a rearward-facing camera as the host vehicle travels along a road segment in a first direction; detect a semantic feature represented in the first image and a semantic feature represented in the second image, the semantic features being associated with predetermined object type classifications; identify position descriptors associated with the first semantic feature and the second semantic feature; receive position information indicative of positions of the forward-facing and rearward-facing cameras when the first and second images were captured; and cause transmission of drive information including the position descriptors and the position information to a remotely-located entity.

IPC Classes  ?

  • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
  • G05D 1/02 - Control of position or course in two dimensions
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G01C 21/30 - Map- or contour-matching
  • B60W 30/095 - Predicting travel path or likelihood of collision

97.

Systems and methods for navigating lane merges and lane splits

      
Application Number 17824508
Grant Number 11960293
Status In Force
Filing Date 2022-05-25
First Publication Date 2022-09-08
Grant Date 2024-04-16
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Eagelberg, Dor
  • Stein, Gideon
  • Taieb, Yoav
  • Gdalyahu, Yoram
  • Fridman, Ofer

Abstract

Systems and methods are provided for navigating an autonomous vehicle. In one implementation, a system includes a processing device programmed to receive a plurality of images representative of an environment of the host vehicle. The environment includes a road on which the host vehicle is traveling. The at least one processing device is further programmed to analyze the images to identify a target vehicle traveling in a lane of the road different from a lane in which the host vehicle is traveling; analyze the images to identify a lane mark associated with the lane in which the target vehicle is traveling; detect lane mark characteristics of the identified lane mark; use the detected lane mark characteristics to determine a type of the identified lane mark; determine a characteristic of the target vehicle; and determine a navigational action for the host vehicle based on the determined lane mark type and the determined characteristic of the target vehicle.

IPC Classes  ?

  • G05D 1/02 - Control of position or course in two dimensions
  • B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
  • G01C 21/34 - Route searching; Route guidance
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
  • G06N 3/045 - Combinations of networks
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

98.

Generating a Navigational Map

      
Application Number 17825758
Status Pending
Filing Date 2022-05-26
First Publication Date 2022-09-08
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Reuveni, Nadav
  • Lifshitz, Ben-Tzion
  • Eliassaf, Ofer
  • Fridman, Ofer
  • Guberman, Yahel

Abstract

Systems and methods are provided for vehicle navigation. In one implementation, a host vehicle-based sparse map feature harvester system may include at least one processor programmed to receive a plurality of images captured by a camera onboard the host vehicle as the host vehicle travels along a road segment in a first direction, wherein the plurality of images are representative of an environment of the host vehicle; detect one or more semantic features represented in one or more of the plurality of images, the one or more semantic features each being associated with a predetermined object type classification; identify at least one position descriptor associated with each of the detected one or more semantic features; identify three-dimensional feature points associated with one or more detected objects represented in at least one of the plurality of images; receive position information, for each of the plurality of images, wherein the position information is indicative of a position of the camera when each of the plurality of images was captured; and cause transmission of drive information for the road segment to an entity remotely-located relative to the host vehicle, wherein the drive information includes the identified at least one position descriptor associated with each of the detected one or more semantic features, the identified three-dimensional feature points, and the position information.

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

99.

FULLY ALIGNED JUNCTIONS

      
Application Number 17824305
Status Pending
Filing Date 2022-05-25
First Publication Date 2022-09-08
Owner MOBILEYE VISION TECHNOLOGIES LTD. (Israel)
Inventor
  • Guberman, Yahel
  • Springer, Ofer
  • Fridman, Ofer
  • Taieb, Yoav

Abstract

Systems and methods for creating maps used in navigating autonomous vehicles are disclosed. In one implementation at least one processor is programmed to receive drive information from each of a plurality of vehicles that traverse different entrance-exit combinations of a road junction; for each of the entrance-exit combinations, align three-dimensional feature points in the drive information to generate a plurality of aligned three-dimensional feature point groups, one for each entrance-exit combination of the road junction; correlate one or more three-dimensional feature points in each of the plurality of aligned three-dimensional feature point groups with one or more three-dimensional feature points included in every other aligned three-dimensional feature point group from among the plurality of aligned three-dimensional feature point groups; and generate a sparse map based on the correlation, the sparse map including a target trajectory associated with each of the entrance-exit combinations.
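
Aligning three-dimensional feature points from different drives is, at its core, a rigid registration problem. The sketch below shows a least-squares rigid alignment (Kabsch) for one pair of corresponding point sets; applying it per entrance-exit combination, and how correlated points tie the groups together, is an assumption layered on top of the abstract.

    import numpy as np

    def rigid_align(src, dst):
        """Least-squares rotation R and translation t mapping corresponding 3D
        feature points src onto dst (both N x 3), via the Kabsch algorithm."""
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        H = src_c.T @ dst_c
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dst.mean(axis=0) - R @ src.mean(axis=0)
        return R, t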

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

100.

VARIABLE HEADER AND OBJECT PAYLOAD

      
Application Number 17656024
Status Pending
Filing Date 2022-03-23
First Publication Date 2022-09-08
Owner Mobileye Vision Technologies Ltd. (Israel)
Inventor
  • Goldman, Yehonatan
  • Fisher, Amiel

Abstract

A system for navigating a host vehicle includes at least one electronic horizon processor to determine an electronic horizon for the host vehicle based on localization of the host vehicle relative to a map, generate a navigation information packet including information associated with the determined electronic horizon, and output the generated navigation information packet to one or more navigation system processors configured to cause the host vehicle to execute at least one navigational maneuver based on the information included in the navigation information packet.

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups