Magic Leap, Inc.

United States of America


1-100 of 1,959 patents for Magic Leap, Inc.
United States - USPTO
Date
New (last 4 weeks) 22
2024 April (MTD) 18
2024 March 28
2024 February 30
2024 January 24
IPC Class
G02B 27/01 - Head-up displays 939
G06T 19/00 - Manipulating 3D models or images for computer graphics 617
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 564
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups 380
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems 240
Status
Pending 386
Registered / In Force 1,573

1.

MIXED REALITY SPATIAL AUDIO

      
Application Number 18389698
Status Pending
Filing Date 2023-12-19
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Schmidt, Brian Lloyd
  • Tajik, Jehangir
  • Jot, Jean-Marc

Abstract

A method of presenting an audio signal to a user of a mixed reality environment is disclosed. According to examples of the method, an audio event associated with the mixed reality environment is detected. The audio event is associated with a first audio signal. A location of the user with respect to the mixed reality environment is determined. A first acoustic region associated with the location of the user is identified. A first acoustic parameter associated with the first acoustic region is determined. A transfer function is determined using the first acoustic parameter. The transfer function is applied to the first audio signal to produce a second audio signal, which is then presented to the user.
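
The processing chain this abstract describes (audio event → user location → acoustic region → acoustic parameter → transfer function → output signal) can be sketched as below. The region table, the helper names, and the simple gain-style stand-in for the transfer function are illustrative assumptions, not the patented implementation.

```python
# Hypothetical region map: region id -> (bounds on a 1-D position, decay parameter)
ACOUSTIC_REGIONS = {
    "hall":  ((0.0, 10.0), 0.8),
    "booth": ((10.0, 20.0), 0.2),
}

def identify_region(user_x: float) -> str:
    # Identify the acoustic region containing the user's location.
    for region, ((lo, hi), _) in ACOUSTIC_REGIONS.items():
        if lo <= user_x < hi:
            return region
    raise ValueError("user is outside all known acoustic regions")

def transfer_function(decay: float):
    # Toy stand-in: scale each sample by the region's decay parameter.
    return lambda signal: [s * decay for s in signal]

def present_audio(first_signal, user_x):
    # Detect region, look up its acoustic parameter, derive and apply
    # the transfer function to produce the "second audio signal".
    region = identify_region(user_x)
    _, decay = ACOUSTIC_REGIONS[region]
    return transfer_function(decay)(first_signal)
```

A real system would derive a reverberation or filtering response from the acoustic parameter rather than a flat gain; the flow of data, not the DSP, is the point of the sketch.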

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

2.

PATCH TRACKING IMAGE SENSOR

      
Application Number 18403673
Status Pending
Filing Date 2024-01-03
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Ilic, Alexander
  • Fonseka, Erik

Abstract

An image sensor suitable for use in an augmented reality system to provide low latency image analysis with low power consumption. The augmented reality system can be compact, and may be small enough to be packaged within a wearable device such as a set of goggles or mounted on a frame resembling ordinary eyeglasses. The image sensor may receive information about a region of an imaging array associated with a movable object and selectively output imaging information for that region. The region may be updated dynamically as the image sensor and/or the object moves. Such an image sensor provides a small amount of data from which object information used in rendering an augmented reality scene can be developed. The amount of data may be further reduced by configuring the image sensor to output indications of pixels for which the measured intensity of incident light changes.
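
The selective-readout idea in this abstract, outputting only the patch of the imaging array around a tracked object and re-centring that patch as the object moves, can be sketched as follows. The patch size, the NumPy frame, and the helper names are assumptions for illustration only.

```python
import numpy as np

def read_patch(frame: np.ndarray, center, half: int) -> np.ndarray:
    # Output imaging information only for the region around `center`,
    # instead of the full array.
    r, c = center
    return frame[max(r - half, 0):r + half + 1,
                 max(c - half, 0):c + half + 1]

def update_center(old_center, motion):
    # Dynamically shift the region as the tracked object (or sensor) moves.
    dr, dc = motion
    return (old_center[0] + dr, old_center[1] + dc)

frame = np.arange(100).reshape(10, 10)   # stand-in 10x10 imaging array
center = (4, 4)
patch = read_patch(frame, center, half=1)  # 3x3 patch, not the full frame
center = update_center(center, (1, 0))     # object moved down one row
```

The data reduction the abstract mentions falls out directly: downstream processing sees a 3x3 patch rather than the full 10x10 array.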

IPC Classes

  • H04N 25/50 - Control of the SSIS exposure
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • H04N 25/702 - SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout

3.

METHOD AND SYSTEM FOR REDUCING LINE SEPARATION ARTIFACTS IN INTERLACED IMAGE PROJECTION

      
Application Number 18400601
Status Pending
Filing Date 2023-12-29
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Vlaskamp, Björn Nicolaas Servatius
  • Miller, Samuel A.
  • Clarke, Aaron M.

Abstract

An image display system includes an optical subsystem configured to emit a modulated light beam, and a scanning mirror for generating a reflected light beam that is scanned according to randomly selected or pseudo-randomly selected scan patterns to generate multiple image fields of a multiple interlaced scan image. A plurality of different scan patterns can be cycled through, randomly or pseudo-randomly selected, for the different image fields to reduce artifacts that may be observed while viewing a projected image.
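
Cycling pseudo-randomly through a pool of scan patterns per image field, as the abstract describes, can be sketched minimally as below; the placeholder 4-line interlace orderings are invented, not the patented waveforms.

```python
import random

# Pool of candidate line orderings for one image field (illustrative).
SCAN_PATTERNS = [
    [0, 2, 1, 3],
    [1, 3, 0, 2],
    [2, 0, 3, 1],
]

def pattern_for_field(rng: random.Random):
    # Pseudo-randomly select a scan pattern for the next image field, so
    # that line-separation artifacts do not repeat at a fixed position.
    return SCAN_PATTERNS[rng.randrange(len(SCAN_PATTERNS))]

rng = random.Random(0)  # seeded for reproducibility
chosen = [pattern_for_field(rng) for _ in range(8)]  # 8 image fields
```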

IPC Classes

  • G02B 26/10 - Scanning systems
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • H04N 9/31 - Projection devices for colour picture display

4.

METHOD OF FABRICATING DIFFRACTION GRATINGS

      
Application Number 18400891
Status Pending
Filing Date 2023-12-29
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Yang, Shuqiang
  • Luo, Kang
  • Singh, Vikramjit
  • Xu, Frank Y.

Abstract

A method of fabricating a blazed diffraction grating comprises providing a master template substrate and imprinting periodically repeating lines on the master template substrate in a plurality of master template regions. In some embodiments, the periodically repeating lines in different ones of the master template regions extend in different directions. The method additionally comprises using at least one of the master template regions as a master template to imprint at least one blazed diffraction grating pattern on a grating substrate. In some embodiments, the method further comprises coating the periodically repeating lines with a material having a greater hardness than the material that forms the lines.

IPC Classes

5.

LIGHTWEIGHT CROSS REALITY DEVICE WITH PASSIVE DEPTH EXTRACTION

      
Application Number 18542122
Status Pending
Filing Date 2023-12-15
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Ilic, Alexander
  • Velasquez, Miguel Andres Granados
  • Victorio, Javier

Abstract

A wearable display system including multiple cameras and a processor is disclosed. A greyscale camera and a color camera can be arranged to provide a central view field associated with both cameras and a peripheral view field associated with one of the two cameras. One or both of the two cameras may be plenoptic cameras. The wearable display system may acquire light field information using the at least one plenoptic camera and create a world model using the light field information and depth information stereoscopically determined from images acquired by the greyscale camera and the color camera. The wearable display system can track head pose using the at least one plenoptic camera and the world model. The wearable display system can track objects in the central and peripheral view fields using the one or two plenoptic cameras when the objects satisfy a depth criterion.

IPC Classes

  • H04N 25/531 - Control of the integration time by controlling rolling shutters in CMOS SSIS
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 23/951 - Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
  • H04N 23/957 - Light-field or plenoptic cameras or camera modules
  • H04N 25/46 - Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by combining or binning pixels

6.

ENHANCED POSE DETERMINATION FOR DISPLAY DEVICE

      
Application Number 18378086
Status Pending
Filing Date 2023-10-08
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Faro, Joao Antonio Pereira
  • Velasquez, Miguel Andres Granados
  • Kasper, Dominik Michael
  • Swaminathan, Ashwin
  • Mohan, Anush
  • Singhal, Prateek

Abstract

To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points that will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.

IPC Classes

  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06V 40/18 - Eye characteristics, e.g. of the iris

7.

CROSS REALITY SYSTEM FOR LARGE SCALE ENVIRONMENTS

      
Application Number 18396682
Status Pending
Filing Date 2023-12-26
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Gomez Gonzalez, Javier Victorio
  • Velasquez, Miguel Andres Granados
  • Prasad, Mukta
  • Kasper, Dominik Michael
  • Guendelman, Eran
  • Lin, Keng-Sheng

Abstract

A cross reality system enables portable devices to access stored maps and efficiently and accurately render virtual content specified in relation to those maps. The system may process images acquired with a portable device to quickly and accurately localize the portable device to the persisted maps by constraining the result of localization based on the estimated direction of gravity of a persisted map and the coordinate frame in which data in a localization request is posed. The system may actively align the data in the localization request with an estimated direction of gravity during the localization processing, and/or a portable device may establish a coordinate frame in which the data in the localization request is posed aligned with an estimated direction of gravity such that the subsequently acquired data for inclusion in a localization request, when posed in that coordinate frame, is passively aligned with the estimated direction of gravity.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

8.

MONOVISION DISPLAY FOR WEARABLE DEVICE

      
Application Number 18400278
Status Pending
Filing Date 2023-12-29
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Vlaskamp, Bjorn Nicolaas Servatius
  • Shultz, Jason Allen
  • Welch, William Hudson
  • Wu, Bing

Abstract

A wearable device includes a left optical stack having a left eyepiece configured to receive left virtual image light, a left accommodating lens, and a left compensating lens. The wearable device also includes a right optical stack having a right eyepiece configured to receive right virtual image light, a right accommodating lens, and a right compensating lens. An optical power of the left accommodating lens is equal in magnitude to an optical power of the left compensating lens, an optical power of the right accommodating lens is equal in magnitude to an optical power of the right compensating lens, and the optical power of the left accommodating lens and the optical power of the right accommodating lens differ by an offset amount.
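
The abstract states three relations between the four lens powers, which can be checked numerically. The diopter values below are invented examples chosen to satisfy the stated relations, not values from the patent.

```python
# Illustrative optical powers (diopters), chosen so that:
#   |left accommodating|  == |left compensating|
#   |right accommodating| == |right compensating|
#   left accommodating and right accommodating differ by the offset
left_accommodating = +1.5
left_compensating = -1.5
right_accommodating = +1.0
right_compensating = -1.0
offset = 0.5

assert abs(left_accommodating) == abs(left_compensating)
assert abs(right_accommodating) == abs(right_compensating)
assert abs(left_accommodating - right_accommodating) == offset
```

The offset between the two accommodating lenses is what produces the monovision effect: each eye is focused at a slightly different virtual distance.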

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups

9.

PERIOCULAR TEST FOR MIXED REALITY CALIBRATION

      
Application Number 18401020
Status Pending
Filing Date 2023-12-29
First Publication Date 2024-04-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Kaehler, Adrian
  • Bradski, Gary
  • Badrinarayanan, Vijay

Abstract

A wearable device can include an inward-facing imaging system configured to acquire images of a user's periocular region. The wearable device can determine a relative position between the wearable device and the user's face based on the images acquired by the inward-facing imaging system. The relative position may be used to determine whether the user is wearing the wearable device, whether the wearable device fits the user, or whether an adjustment to a rendering location of a virtual object should be made to compensate for a deviation of the wearable device from its normal resting position.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06T 3/20 - Linear translation of a whole image or part thereof, e.g. panning
  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • G06V 40/19 - Sensors therefor

10.

METHODS, SYSTEMS, AND COMPUTER PROGRAM PRODUCTS FOR IMPLEMENTING CROSS-PLATFORM MIXED-REALITY APPLICATIONS WITH A SCRIPTING FRAMEWORK

      
Application Number 18402485
Status Pending
Filing Date 2024-01-02
First Publication Date 2024-04-25
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Grozdanov, Nikolay Ivanov
  • Piascik, Konrad
  • Zolotarev, Leonid
  • Caswell, Timothy Dean

Abstract

Disclosed are methods and systems for a scripting framework and implementations therefor for mixed reality software applications of heterogeneous systems. These methods or systems create a mixed-reality software application that executes across heterogeneous platforms on a server-side instance of a scripting framework and manage a change in the mixed-reality software application using the server-side instance of the scripting framework. Moreover, the change in the mixed-reality software application may be managed using a client-side instance of the scripting framework, and the mixed-reality software application may be interactively executed on a mixed-reality device.

IPC Classes

  • H04L 67/131 - Protocols for games, networked simulations or virtual reality
  • G06F 8/38 - Creation or generation of source code for implementing user interfaces
  • G06F 8/76 - Adapting program code to run in a different environment; Porting
  • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
  • G06F 40/166 - Editing, e.g. inserting or deleting
  • G06F 40/211 - Syntactic parsing, e.g. based on context-free grammar [CFG] or unification grammars
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04L 65/75 - Media network packet handling
  • H04L 67/133 - Protocols for remote procedure calls [RPC]

11.

SCENE UNDERSTANDING USING OCCUPANCY GRIDS

      
Application Number 18275468
Status Pending
Filing Date 2022-02-03
First Publication Date 2024-04-18
Owner Magic Leap, Inc. (USA)
Inventor
  • Ramnath, Divya
  • Dong, Shiyu
  • Choudhary, Siddharth
  • Mahendran, Siddharth
  • Kannan, Arumugam Kalai
  • Singhal, Prateek
  • Gupta, Khushi

Abstract

This document describes scene understanding for cross reality systems using occupancy grids. In one aspect, a method includes recognizing one or more objects in a model of a physical environment generated using images of the physical environment. For each object, a bounding box is fit around the object. An occupancy grid that includes multiple cells is generated within the bounding box around the object. A value is assigned to each cell of the occupancy grid based on whether the cell includes a portion of the object. An object representation that includes information describing the occupancy grid for the object is generated. The object representations are sent to one or more devices.
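
The per-object grid construction in this abstract can be sketched as follows, in 2-D rather than 3-D for brevity; the cell count and helper names are assumptions.

```python
import numpy as np

def occupancy_grid(points: np.ndarray, cells: int = 4) -> np.ndarray:
    # Fit an axis-aligned bounding box around the object's points.
    lo, hi = points.min(axis=0), points.max(axis=0)
    grid = np.zeros((cells, cells), dtype=np.uint8)
    # Map each point to a cell index inside the bounding box, then mark
    # every cell that contains at least a portion of the object.
    idx = np.floor((points - lo) / (hi - lo + 1e-9) * cells).astype(int)
    idx = np.clip(idx, 0, cells - 1)
    grid[idx[:, 0], idx[:, 1]] = 1
    return grid

pts = np.array([[0.0, 0.0], [1.0, 1.0]])  # toy object with two points
grid = occupancy_grid(pts)
```

The resulting compact grid, rather than the raw geometry, is what would be packaged into the object representation sent to devices.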

IPC Classes

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06V 20/64 - Three-dimensional objects

12.

SYSTEMS AND METHODS FOR OPERATING A DISPLAY SYSTEM BASED ON USER PERCEPTIBILITY

      
Application Number 18395195
Status Pending
Filing Date 2023-12-22
First Publication Date 2024-04-18
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Yeoh, Ivan Li Chuen
  • Miller, Samuel A.
  • Selker, Edwin Joseph
  • Carlson, Adam Charles
  • Vlaskamp, Bjorn Nicolaas Servatius
  • Greco, Paul M.

Abstract

Systems and methods are disclosed for operating a head-mounted display system based on user perceptibility. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes by presenting the content with different amounts of wavefront divergence. Some embodiments include obtaining an image captured by an imaging device of the display system. Whether a threshold measure or more of motion blur is exhibited in one or more regions of the image is determined. Based on a determination that the threshold measure or more of motion blur is exhibited in one or more regions of the image, one or more operating parameters of the wearable display are adjusted. Example operating parameter adjustments include adjusting the depth plane on which content is presented (e.g., by switching from a first depth plane to a second depth plane), adjusting a rendering quality, and adjusting power characteristics of the system.
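
The threshold-and-adjust logic might look like the following sketch, where the per-region blur metric, the threshold value, and the depth-plane indices are all placeholder assumptions.

```python
BLUR_THRESHOLD = 0.6  # illustrative threshold measure of motion blur

def adjust_parameters(region_blur_scores, current_depth_plane: int) -> int:
    # If any image region exhibits the threshold measure or more of
    # motion blur, adjust an operating parameter; here, switch from the
    # first depth plane to the second as in the abstract's example.
    if any(score >= BLUR_THRESHOLD for score in region_blur_scores):
        return 1 if current_depth_plane == 0 else current_depth_plane
    return current_depth_plane

plane = adjust_parameters([0.2, 0.7, 0.1], current_depth_plane=0)
```

Rendering quality or power characteristics could be adjusted by the same gate in place of the depth-plane switch.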

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 40/19 - Sensors therefor
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

13.

AUGMENTED REALITY DISPLAY WITH WAVEGUIDE CONFIGURED TO CAPTURE IMAGES OF EYE AND/OR ENVIRONMENT

      
Application Number 18493429
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-04-18
Owner Magic Leap, Inc. (USA)
Inventor
  • Sinay, Asif
  • Freedman, Barak
  • Klug, Michael Anthony
  • Oh, Chulwoo
  • Meitav, Nizan

Abstract

Head mounted display systems configured to project light to an eye of a user to display augmented reality image content in a vision field of the user are disclosed. In embodiments, the system includes a frame configured to be supported on a head of the user, an image projector configured to project images into the user's eye, a camera coupled to the frame, a waveguide optically coupled to the camera, an optical coupling optical element, an out-coupling element configured to direct light emitted from the waveguide to the camera, and a first light source configured to direct light to the user's eye through the waveguide. Electronics control the camera to capture images periodically and further control the first light source to pulse in time with the camera such that light emitted by the light source has a reduced intensity when the camera is not capturing images.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

14.

AMBIENT LIGHT MANAGEMENT SYSTEMS AND METHODS FOR WEARABLE DEVICES

      
Application Number 18542491
Status Pending
Filing Date 2023-12-15
First Publication Date 2024-04-18
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Mathur, Vaibhav
  • Haddock, Joshua Naaman
  • Diehl, Ted
  • Oh, Chulwoo
  • Carlisle, Clinton

Abstract

Techniques are described for operating an optical system. In some embodiments, light associated with a world object is received at the optical system. Virtual image light is projected onto an eyepiece of the optical system. A portion of a system field of view of the optical system to be at least partially dimmed is determined based on information detected by the optical system. A plurality of spatially-resolved dimming values for the portion of the system field of view may be determined based on the detected information. The detected information may include light information, gaze information, and/or image information. A dimmer of the optical system may be adjusted to reduce an intensity of light associated with the world object in the portion of the system field of view according to the plurality of dimming values.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02F 1/1333 - Constructional arrangements
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/13363 - Birefringent elements, e.g. for optical compensation
  • G02F 1/1343 - Electrodes
  • G02F 1/1368 - Active matrix addressed cells in which the switching element is a three-electrode device

15.

Handheld accessory with cameras

      
Application Number 29717184
Grant Number D1022010
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-04-09
Grant Date 2024-04-09
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta

16.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING REGISTRATION BETWEEN A DISPLAY AND EYES OF USER

      
Application Number 18525487
Status Pending
Filing Date 2023-11-30
First Publication Date 2024-04-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Xu, Yan
  • Cazamias, Jordan Alexander
  • Peng, Rose Mei

Abstract

A display system may include a wearable display for rendering three-dimensional virtual image content which appears to be located in an environment of a user of the display. The relative positions of the display and one or more eyes of the user may not be in desired positions to receive, or register, image information outputted by the display. For example, the display-to-eye alignment may vary for different users and/or may change over time (e.g., as a user moves or as the display becomes displaced). The wearable device may determine a relative position and/or alignment between the display and the user's eyes by determining whether features of the eye are at certain vertical positions relative to the display. Based on the relative positions, the wearable device may determine if it is properly fitted to the user, and position render camera(s) accordingly to present virtual image content.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

17.

CAMERA INTRINSIC CALIBRATION

      
Application Number 18530565
Status Pending
Filing Date 2023-12-06
First Publication Date 2024-04-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Jia, Zhiheng
  • Grossmann, Etienne Gregoire
  • Zheng, Hao
  • Dominguez, Daniel Roger
  • Tekolste, Robert D.

Abstract

Embodiments provide image display systems and methods for camera calibration using a two-sided diffractive optical element (DOE). More specifically, embodiments are directed to determining intrinsic parameters of a camera using a single image obtained using a two-sided DOE. The two-sided DOE has a first pattern on a first surface and a second pattern on a second surface. Each of the first and second patterns may be formed by repeating sub-patterns that are aligned when tiled on each surface. The patterns on the two-sided DOE are formed such that the brightness of the central intensity peak on the image of the pattern formed by the DOE is reduced to a predetermined amount.

IPC Classes

  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G02B 27/42 - Diffraction optics
  • G06V 10/14 - Optical characteristics of the device performing the acquisition or on the illumination arrangements
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

18.

EYE CENTER OF ROTATION DETERMINATION, DEPTH PLANE SELECTION, AND RENDER CAMERA POSITIONING IN DISPLAY SYSTEMS

      
Application Number 18532448
Status Pending
Filing Date 2023-12-07
First Publication Date 2024-04-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Miller, Samuel A.
  • Agarwal, Lomesh
  • Edwin, Lionel Ernest
  • Yeoh, Ivan Li Chuen
  • Farmer, Daniel
  • Prokushkin, Sergey Fyodorovich
  • Munk, Yonatan
  • Selker, Edwin Joseph
  • Stuart, Bradley Vincent
  • Sommers, Jeffrey Scott

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system that images the user's eye and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.

IPC Classes

  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for determining or recording eye movement
  • A61B 3/11 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for measuring interpupillary distance or diameter of pupils
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G02B 30/40 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

19.

MISCALIBRATION DETECTION FOR VIRTUAL REALITY AND AUGMENTED REALITY SYSTEMS

      
Application Number 18265085
Status Pending
Filing Date 2021-11-29
First Publication Date 2024-03-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Sokol, Gil
  • Bouhnik, Moshe
  • Gupta, Ankur
  • Gadot Kabasu, David
  • Zampogiannis, Konstantinos

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing miscalibration detection. One of the methods includes receiving sensor data from each of multiple sensors of a device in a system configured to provide augmented reality or mixed reality output to a user. Feature values are determined based on the sensor data for a predetermined set of features. The determined feature values are processed using a miscalibration detection model that has been trained, based on examples of captured sensor data from one or more devices, to predict whether a miscalibration condition of one or more of the multiple sensors has occurred. Based on the output of the miscalibration detection model, the system determines whether to initiate recalibration of extrinsic parameters for at least one of the multiple sensors or to bypass recalibration.
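
The recalibrate-or-bypass gating described in the abstract can be sketched as below. The linear stand-in model, its weights, the placeholder features, and the threshold are assumptions in place of the trained miscalibration detection model.

```python
MODEL_WEIGHTS = [0.5, 0.3, 0.2]  # one weight per sensor (illustrative)
MISCAL_THRESHOLD = 0.5           # decision boundary (illustrative)

def feature_values(sensor_data):
    # Determine feature values from each sensor's data; here simply the
    # mean reading per sensor, standing in for the predetermined features.
    return [sum(readings) / len(readings) for readings in sensor_data]

def miscalibration_score(features):
    # Stand-in for the trained model's output.
    return sum(w * f for w, f in zip(MODEL_WEIGHTS, features))

def should_recalibrate(sensor_data) -> bool:
    # Initiate recalibration of extrinsic parameters, or bypass it.
    return miscalibration_score(feature_values(sensor_data)) >= MISCAL_THRESHOLD

decision = should_recalibrate([[1.0, 1.0], [0.0, 0.2], [0.0, 0.0]])
```

In the patented system the score would come from a model trained on captured sensor data, but the surrounding control flow is the same shape.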

IPC Classes

  • H04N 13/327 - Calibration thereof
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations
  • H04N 13/246 - Calibration of cameras
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

20.

AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS FOR OCULOMETRIC ASSESSMENTS

      
Application Number 18524254
Status Pending
Filing Date 2023-11-30
First Publication Date 2024-03-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Farmer, Daniel
  • Liston, Dorion Bryce

Abstract

Example techniques are disclosed for increasing the sensitivity of an augmented or virtual reality display system to collecting eye-tracking data for detecting physiological conditions, such as neural processes. An example method includes accessing eye-tracking information associated with a control population and an experimental population, the eye-tracking information reflecting, for each user of the control population and the experimental population, eye-tracking metrics associated with the user; scaling the eye-tracking information based on the eye-tracking information associated with the control population; and determining a sensitivity measure reflecting a distance measure between the control population and experimental population. The sensitivity measure may be utilized to modify physical or operational parameters for the display system and/or the protocol for performing a test.
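
One plausible reading of the scaling and distance steps is a z-score against the control population's statistics followed by a distance between population means; both choices are assumptions, since the abstract does not specify the scaling or the distance measure.

```python
import statistics

def scale(values, control):
    # Scale metrics using the control population's mean and std deviation.
    mu = statistics.mean(control)
    sd = statistics.pstdev(control) or 1.0  # guard against zero spread
    return [(v - mu) / sd for v in values]

def sensitivity(control, experimental):
    # Distance between the scaled population means, used as the
    # sensitivity measure for the eye-tracking metric.
    scaled_ctrl = scale(control, control)
    scaled_exp = scale(experimental, control)
    return abs(statistics.mean(scaled_exp) - statistics.mean(scaled_ctrl))

s = sensitivity(control=[1.0, 2.0, 3.0], experimental=[4.0, 5.0, 6.0])
```

A larger value of `s` for a given metric would suggest that metric separates the two populations well, guiding which display or protocol parameters to keep.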

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays

21.

SURFACE APPROPRIATE COLLISIONS

      
Application Number 18531583
Status Pending
Filing Date 2023-12-06
First Publication Date 2024-03-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Tajik, Anastasia Andreyevna
  • Leider, Colby Nelson
  • Perry, Omer

Abstract

Disclosed herein are systems and methods for presenting an audio signal associated with presentation of a virtual object colliding with a surface. The virtual object and the surface may be associated with a mixed reality environment. Generation of the audio signal may be based on at least one of an audio stream from a microphone and a video stream from a sensor. In some embodiments, the collision between the virtual object and the surface is associated with a footstep on the surface.

IPC Classes

  • G10K 15/02 - Synthesis of acoustic waves
  • G06V 20/40 - Scenes; Scene-specific elements in video content
  • G10L 25/57 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use for comparison or discrimination for processing of video signals
  • H04R 1/08 - Mouthpieces; Attachments therefor
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

22.

Packaging insert

      
Application Number 29716277
Grant Number D1019391
Status In Force
Filing Date 2019-12-09
First Publication Date 2024-03-26
Grant Date 2024-03-26
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Hoit, Sarah
  • Palmer, James William
  • Gamez Castillejos, Daniel Marcelo
  • Palmer, Christopher G.

23.

TUNABLE CYLINDRICAL LENSES AND HEAD-MOUNTED DISPLAY INCLUDING THE SAME

      
Application Number 18510932
Status Pending
Filing Date 2023-11-16
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Russell, Andrew Ian
  • Haddock, Joshua Naaman

Abstract

Systems include three optical elements arranged along an optical axis each having a different cylinder axis and a variable cylinder refractive power. Collectively, the three elements form a compound optical element having an overall spherical refractive power (SPH), cylinder refractive power (CYL), and cylinder axis (Axis) that can be varied according to a prescription (Rx).

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/02 - Viewing or reading apparatus

24.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING REGISTRATION BETWEEN A DISPLAY AND A USER'S EYES

      
Application Number 18521613
Status Pending
Filing Date 2023-11-28
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Nienstedt, Zachary C.
  • Yeoh, Ivan Li Chuen
  • Miller, Samuel A.
  • Xu, Yan
  • Cazamias, Jordan Alexander

Abstract

A wearable device may include a head-mounted display (HMD) for rendering a three-dimensional (3D) virtual object which appears to be located in an ambient environment of a user of the display. The relative positions of the HMD and one or more eyes of the user may not be in desired positions to receive, or register, image information outputted by the HMD. For example, the HMD-to-eye alignment may vary for different users and may change over time (e.g., as a user moves around and/or the HMD slips or is otherwise displaced). The wearable device may determine a relative position or alignment between the HMD and the user's eyes. Based on the relative positions, the wearable device may determine if it is properly fitted to the user, may provide feedback on the quality of the fit to the user, and may take actions to reduce or minimize effects of any misalignment.

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • A61B 3/11 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for measuring interpupillary distance or diameter of pupils
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for determining or recording eye movement
  • G02B 27/01 - Head-up displays
  • G02B 30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
  • G02B 30/40 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/16 - Sound input; Sound output
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

25.

VIRTUAL LOCATION SELECTION FOR VIRTUAL CONTENT

      
Application Number 18523382
Status Pending
Filing Date 2023-11-29
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Warren, Silas
  • Khan, Omar
  • Miller, Samuel A.
  • Arora, Tushar

Abstract

A method for placing content in an augmented reality system. A notification is received regarding availability of new content to display in the augmented reality system. A confirmation is received that indicates acceptance of the new content. Three-dimensional information that describes the physical environment is provided, to an external computing device, to enable the external computing device to be used for selecting an assigned location in the physical environment for the new content. Location information is received, from the external computing device, that indicates the assigned location. Based on the location information, a display location is determined on a display system of the augmented reality system at which to display the new content so that the new content appears to the user to be displayed as an overlay at the assigned location in the physical environment. The new content is displayed on the display system at the display location.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

26.

MATCHING CONTENT TO A SPATIAL 3D ENVIRONMENT

      
Application Number 18523763
Status Pending
Filing Date 2023-11-29
First Publication Date 2024-03-21
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Bastov, Denys
  • Ng-Thow-Hing, Victor
  • Reinhardt, Benjamin Zaaron
  • Zolotarev, Leonid
  • Pellet, Yannick
  • Marchenko, Aleksei
  • Meaney, Brian Everett
  • Shelton, Marc Coleman
  • Geiman, Megan Ann
  • Gotcher, John A.
  • Bogue, Matthew Schon
  • Balasubramanyam, Shivakumar
  • Ruediger, Jeffrey Edward
  • Lundmark, David Charles

Abstract

Systems and methods for matching content elements to surfaces in a spatially organized 3D environment. The method includes receiving content, identifying one or more elements in the content, determining one or more surfaces, matching the one or more elements to the one or more surfaces, and displaying the one or more elements as virtual content onto the one or more surfaces.
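
The matching step in the abstract can be sketched as a greedy assignment that pairs each content element with the best-scoring unoccupied surface. The scoring function, field names, and sample values below are illustrative assumptions, not details from the patent.

```python
def match_elements_to_surfaces(elements, surfaces):
    # Greedily pair each content element with the best-scoring free
    # surface; lower score means a better fit.
    def score(element, surface):
        # Compare aspect ratios and penalize surfaces smaller than the
        # element's preferred display area.
        aspect_diff = abs(element["aspect"] - surface["aspect"])
        area_penalty = max(0.0, element["area"] - surface["area"])
        return aspect_diff + area_penalty

    assignments, free = {}, list(surfaces)
    for element in elements:
        if not free:
            break
        best = min(free, key=lambda s: score(element, s))
        assignments[element["id"]] = best["id"]
        free.remove(best)
    return assignments

elements = [{"id": "video", "aspect": 16 / 9, "area": 2.0},
            {"id": "text", "aspect": 3 / 4, "area": 0.5}]
surfaces = [{"id": "wall", "aspect": 16 / 9, "area": 6.0},
            {"id": "door", "aspect": 3 / 4, "area": 1.6}]
print(match_elements_to_surfaces(elements, surfaces))
```

A production system would likely solve this as a global assignment problem rather than greedily, but the greedy version shows the element-to-surface pairing the abstract describes.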

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

27.

SECURE EXCHANGE OF CRYPTOGRAPHICALLY SIGNED RECORDS

      
Application Number 18525698
Status Pending
Filing Date 2023-11-30
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor Kaehler, Adrian

Abstract

Systems and methods for securely exchanging cryptographically signed records are disclosed. In one aspect, after receiving a content request, a sender device can send a record to a receiver device (e.g., an agent device) making the request. The record can be sent via a short range link in a decentralized (e.g., peer-to-peer) manner while the devices may not be in communication with a centralized processing platform. The record can comprise a sender signature created using the sender device's private key. The receiver device can verify the authenticity of the sender signature using the sender device's public key. After adding a cryptography-based receiver signature, the receiver device can redeem the record with the platform. Upon successful verification of the record, the platform can perform as instructed by a content of the record (e.g., modifying or updating a user account).
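
The two-signature redemption flow described above can be sketched as follows. The toy `sign` function is a symmetric hash-based stand-in so the example stays self-contained; the patent's scheme uses asymmetric key pairs, where a signature is created with a private key and verified with the corresponding public key, so verifiers never hold a secret.

```python
import hashlib

def sign(secret_key: bytes, message: bytes) -> str:
    # Toy "signature": hash of key + message. A real implementation would
    # use an asymmetric scheme (e.g. Ed25519) so verification needs only
    # the signer's public key.
    return hashlib.sha256(secret_key + message).hexdigest()

def make_record(content: bytes, sender_key: bytes) -> dict:
    # Sender creates the record and attaches its signature over the content.
    return {"content": content, "sender_sig": sign(sender_key, content)}

def countersign(record: dict, receiver_key: bytes, sender_key: bytes) -> dict:
    # Receiver verifies the sender's signature before adding its own.
    if record["sender_sig"] != sign(sender_key, record["content"]):
        raise ValueError("bad sender signature")
    record["receiver_sig"] = sign(receiver_key, record["content"])
    return record

def platform_verify(record: dict, sender_key: bytes, receiver_key: bytes) -> bool:
    # The platform redeems the record only if both signatures check out.
    return (record["sender_sig"] == sign(sender_key, record["content"])
            and record["receiver_sig"] == sign(receiver_key, record["content"]))

record = make_record(b"credit account 42 by 10 units", b"sender-secret")
record = countersign(record, b"receiver-secret", b"sender-secret")
print(platform_verify(record, b"sender-secret", b"receiver-secret"))  # True
```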

IPC Classes

  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • H04L 9/14 - Arrangements for secret or secure communications; Network security protocols using a plurality of keys or algorithms
  • H04L 9/30 - Public key, i.e. encryption algorithm being computationally infeasible to invert and users' encryption keys not requiring secrecy
  • H04L 9/40 - Network security protocols

28.

Wearable accessory with cameras

      
Application Number 29717289
Grant Number D1018624
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-03-19
Grant Date 2024-03-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta

29.

METHOD AND SYSTEM FOR INTEGRATION OF REFRACTIVE OPTICS WITH A DIFFRACTIVE EYEPIECE WAVEGUIDE DISPLAY

      
Application Number 18513308
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar
  • Singh, Vikramjit
  • Yang, Shuqiang
  • Xu, Frank Y.

Abstract

A method of fabricating an optical element includes providing a substrate, forming a castable material coupled to the substrate, and casting the castable material using a mold. The method also includes curing the castable material and removing the mold. The optical element comprises a planar region and a clear aperture adjacent the planar region and characterized by an optical power.

IPC Classes

30.

SYSTEMS AND METHODS FOR ARTIFICIAL INTELLIGENCE-BASED VIRTUAL AND AUGMENTED REALITY

      
Application Number 18513312
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Rabinovich, Andrew
  • Monos, John

Abstract

Examples of the disclosure describe systems and methods for generating and displaying a virtual companion. In an example method, a first input from an environment of a user is received at a first time via a first sensor. An occurrence of an event in the environment is determined based on the first input. A second input from the user is received via a second sensor, and an emotional reaction of the user is identified based on the second input. An association is determined between the emotional reaction and the event. A view of the environment is presented at a second time later than the first time via a display. A stimulus is presented at the second time via a virtual companion displayed via the display, wherein the stimulus is determined based on the determined association between the emotional reaction and the event.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 21/53 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity, buffer overflow or preventing unwanted data erasure by executing in a restricted environment, e.g. sandbox or secure virtual machine
  • G06T 15/00 - 3D [Three Dimensional] image rendering

31.

WEARABLE SYSTEM SPEECH PROCESSING

      
Application Number 18510376
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor Leider, Colby Nelson

Abstract

A method of processing an acoustic signal is disclosed. According to one or more embodiments, a first acoustic signal is received via a first microphone. The first acoustic signal is associated with a first speech of a user of a wearable headgear unit. A first sensor input is received via a sensor, and a control parameter is determined based on the first sensor input. The control parameter is applied to one or more of the first acoustic signal, the wearable headgear unit, and the first microphone. Determining the control parameter comprises determining, based on the first sensor input, a relationship between the first speech and the first acoustic signal.

IPC Classes

  • G10L 21/0208 - Noise filtering
  • G02B 27/01 - Head-up displays
  • G10L 15/18 - Speech classification or search using natural language modelling
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 15/26 - Speech to text systems

32.

DETERMINING INPUT FOR SPEECH PROCESSING ENGINE

      
Application Number 18506866
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Sheeder, Anthony Robert
  • Leider, Colby Nelson

Abstract

A method of presenting a signal to a speech processing engine is disclosed. According to an example of the method, an audio signal is received via a microphone. A portion of the audio signal is identified, and a probability is determined that the portion comprises speech directed by a user of the speech processing engine as input to the speech processing engine. In accordance with a determination that the probability exceeds a threshold, the portion of the audio signal is presented as input to the speech processing engine. In accordance with a determination that the probability does not exceed the threshold, the portion of the audio signal is not presented as input to the speech processing engine.
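
The threshold test in the abstract reduces to a simple gate that forwards only likely user-directed speech to the engine. The segment structure, probabilities, and threshold value below are illustrative assumptions.

```python
def gate_speech_segments(segments, threshold=0.5):
    # Present to the speech processing engine only the segments whose
    # probability of being user-directed speech exceeds the threshold.
    return [seg["audio"] for seg in segments if seg["p_directed"] > threshold]

segments = [
    {"audio": "hey device, set a timer", "p_directed": 0.94},
    {"audio": "(background conversation)", "p_directed": 0.12},
    {"audio": "for five minutes", "p_directed": 0.81},
]
print(gate_speech_segments(segments))  # only the two user-directed segments
```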

IPC Classes

  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G10L 15/14 - Speech classification or search using statistical models, e.g. Hidden Markov Models [HMM]
  • G10L 15/25 - Speech recognition using non-acoustical features using position of the lips, movement of the lips or face analysis
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications

33.

INTERAURAL TIME DIFFERENCE CROSSFADER FOR BINAURAL AUDIO RENDERING

      
Application Number 18510472
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Dicker, Samuel Charles
  • Barbhaiya, Harsh Mayur

Abstract

Examples of the disclosure describe systems and methods for presenting an audio signal to a user of a wearable head device. In an example, a received first input audio signal is processed to generate a left output audio signal and a right output audio signal presented to ears of the user. Processing the first input audio signal comprises applying a delay process to the first input audio signal to generate a left audio signal and a right audio signal; adjusting gains of the left audio signal and the right audio signal; applying head-related transfer functions (HRTFs) to the left and right audio signals to generate the left and right output audio signals. Applying the delay process to the first input audio signal comprises applying an interaural time delay (ITD) to the first input audio signal, the ITD determined based on the source location.
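
The source-location-dependent ITD in the delay process can be approximated with the classic Woodworth spherical-head formula. The head radius, sample rate, and whole-sample rounding below are simplifying assumptions; an actual crossfader interpolates to realize fractional-sample delays.

```python
import math

HEAD_RADIUS_M = 0.0875      # assumed average head radius
SPEED_OF_SOUND_M_S = 343.0

def interaural_time_difference(azimuth_deg: float) -> float:
    # Woodworth spherical-head approximation of the far-field ITD, in
    # seconds, for a source at the given azimuth (0 deg = straight ahead).
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (math.sin(theta) + theta)

def itd_in_samples(azimuth_deg: float, sample_rate: int = 48000) -> int:
    # Whole-sample delay for the far ear's copy of the signal; a real
    # crossfader would use fractional-sample delays instead of rounding.
    return round(abs(interaural_time_difference(azimuth_deg)) * sample_rate)

print(itd_in_samples(0))   # 0: no interaural delay straight ahead
print(itd_in_samples(90))  # maximum delay with the source at the side
```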

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 5/033 - Headphones for stereophonic communication
  • H04R 5/04 - Circuit arrangements
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

34.

CROSS REALITY SYSTEM WITH PRIORITIZATION OF GEOLOCATION INFORMATION FOR LOCALIZATION

      
Application Number 18510623
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhao, Xuan
  • Moore, Christian Ivan Robert
  • Lin, Sen
  • Shahrokni, Ali
  • Swaminathan, Ashwin

Abstract

A cross reality system enables any of multiple devices to efficiently access previously stored maps. Both stored maps and tracking maps used by portable devices may have any of multiple types of location metadata associated with them. The location metadata may be used to select a set of candidate maps for operations, such as localization or map merge, that involve finding a match between a location defined by location information from a portable device and any of a number of previously stored maps. The types of location metadata may be prioritized for use in selecting the subset. To aid in selection of candidate maps, a universe of stored maps may be indexed based on geo-location information. A cross reality platform may update that index as it interacts with devices that supply geo-location information in connection with location information and may propagate that geo-location information to devices that do not supply it.
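
The prioritized candidate-map selection can be sketched as a walk over metadata types in priority order. The metadata type names, matching rule, and limit below are hypothetical; a real index would use spatial queries rather than exact value matches.

```python
# Priority order over location metadata types, highest first; the type
# names are illustrative, not from the patent.
METADATA_PRIORITY = ["wifi_fingerprint", "geo_location", "tracking_map_overlap"]

def select_candidate_maps(device_metadata, stored_maps, limit=2):
    # Walk metadata types in priority order, keeping stored maps that
    # share a value with the device, until the candidate limit is reached.
    candidates = []
    for meta_type in METADATA_PRIORITY:
        value = device_metadata.get(meta_type)
        if value is None:
            continue
        for stored in stored_maps:
            if stored not in candidates and stored["metadata"].get(meta_type) == value:
                candidates.append(stored)
            if len(candidates) >= limit:
                return candidates
    return candidates

stored_maps = [
    {"id": "office", "metadata": {"geo_location": "cell_17"}},
    {"id": "lobby", "metadata": {"wifi_fingerprint": "ap_42"}},
    {"id": "park", "metadata": {"geo_location": "cell_03"}},
]
device = {"wifi_fingerprint": "ap_42", "geo_location": "cell_17"}
print([m["id"] for m in select_candidate_maps(device, stored_maps)])
```

Note the higher-priority metadata type contributes its match first, which is the ordering behavior the abstract attributes to prioritization.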

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 9/54 - Interprogram communication
  • G06F 16/29 - Geographical information databases
  • G06F 16/907 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

35.

SESSION MANAGER

      
Application Number 18513443
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Bailey, Richard St. Clair
  • Pothapragada, Siddartha
  • Mori, Koichi
  • Stolzenberg, Karen
  • Niles, Savannah
  • Noriega-Padilla, Domingo
  • Heiner, Cole Parker

Abstract

Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04L 67/131 - Protocols for games, networked simulations or virtual reality

36.

EYEPIECES FOR USE IN WEARABLE DISPLAY SYSTEMS

      
Application Number 18514500
Status Pending
Filing Date 2023-11-20
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Lin, Dianmin
  • St. Hilaire, Pierre

Abstract

An example head-mounted display device includes a light projector and an eyepiece. The eyepiece is arranged to receive light from the light projector and direct the light to a user during use of the wearable display system. The eyepiece includes a waveguide having an edge positioned to receive light from the display light source module and couple the light into the waveguide. The waveguide includes a first surface and a second surface opposite the first surface. The waveguide includes several different regions, each having different grating structures configured to diffract light according to different sets of grating vectors.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

37.

AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH CORRELATED IN-COUPLING AND OUT-COUPLING OPTICAL REGIONS FOR EFFICIENT LIGHT UTILIZATION

      
Application Number 18516321
Status Pending
Filing Date 2023-11-21
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor Schowengerdt, Brian T.

Abstract

Augmented reality and virtual reality display systems and devices are configured for efficient use of projected light. In some aspects, a display system includes a light projection system and a head-mounted display configured to project light into an eye of the user to display virtual image content. The head-mounted display includes at least one waveguide comprising a plurality of in-coupling regions each configured to receive, from the light projection system, light corresponding to a portion of the user's field of view and to in-couple the light into the waveguide; and a plurality of out-coupling regions configured to out-couple the light out of the waveguide to display the virtual content, wherein each of the out-coupling regions is configured to receive light from different ones of the in-coupling regions. In some implementations, each in-coupling region has a one-to-one correspondence with a unique corresponding out-coupling region.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

38.

EYE TRACKING USING ALTERNATE SAMPLING

      
Application Number 18516469
Status Pending
Filing Date 2023-11-21
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor Russell, Andrew Ian

Abstract

An eye tracking system can include a first camera configured to capture a first plurality of visual data of a right eye at a first sampling rate. The system can include a second camera configured to capture a second plurality of visual data of a left eye at a second sampling rate. The second plurality of visual data can be captured during different sampling times than the first plurality of visual data. The system can estimate, based on at least some visual data of the first and second plurality of visual data, visual data of at least one of the right or left eye at a sampling time during which visual data of an eye for which the visual data is being estimated are not being captured. Eye movements of the eye based on at least some of the estimated visual data and at least some visual data of the first or second plurality of visual data can be determined.
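
Estimating one eye's visual data at the other eye's sampling times can be as simple as linear interpolation between that eye's bracketing captures. The timestamps and gaze values below are made up for illustration; a real tracker might use a motion model rather than straight lines.

```python
def estimate_between(prev_sample, next_sample, t):
    # Linearly interpolate one eye's measurement at time t from the two
    # captures of that eye that bracket t.
    (t0, x0), (t1, x1) = prev_sample, next_sample
    w = (t - t0) / (t1 - t0)
    return x0 + w * (x1 - x0)

# Right eye sampled at even milliseconds, left eye at odd milliseconds
# (hypothetical timestamps and gaze x-coordinates).
right = [(0, 10.0), (2, 14.0), (4, 18.0)]
# Estimate the right eye at t = 1 and t = 3, the times when only the
# left eye was actually captured.
print(estimate_between(right[0], right[1], 1))  # 12.0
print(estimate_between(right[1], right[2], 3))  # 16.0
```

Interleaving the two cameras this way effectively doubles the combined temporal resolution without raising either camera's sampling rate.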

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06T 7/20 - Analysis of motion

39.

POLYCHROMATIC LIGHT OUT-COUPLING APPARATUS, NEAR-EYE DISPLAYS COMPRISING THE SAME, AND METHOD OF OUT-COUPLING POLYCHROMATIC LIGHT

      
Application Number 18517915
Status Pending
Filing Date 2023-11-22
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Kimmel, Jyrki
  • Jarvenpaa, Toni
  • Eskolin, Peter
  • Salmimaa, Marja

Abstract

The present invention provides an apparatus (3) comprising a first out-coupling diffractive optical element (10) and a second out-coupling diffractive optical element (20). Each of the first and second out-coupling diffractive optical elements comprises a first region (12a, 22a) having a first repeated diffraction spacing, d1, and a second region (12b, 22b) adjacent to the first region having a second repeated diffraction spacing, d2, different from the first spacing, d1. The first region (12a) of the first out-coupling diffractive optical element (10) is superposed on and aligned with the second region (22b) of the second out-coupling diffractive optical element (20). The second region (12b) of the first out-coupling diffractive optical element (10) is superposed on and aligned with the first region (22a) of the second out-coupling diffractive optical element (20).
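
The effect of the two diffraction spacings d1 and d2 can be illustrated with the grating equation for first-order diffraction at normal incidence: each spacing redirects a given wavelength at a different angle, which is why interleaving the regions is useful. The wavelength and spacing values below are hypothetical.

```python
import math

def first_order_angle_deg(wavelength_nm: float, spacing_nm: float) -> float:
    # Grating equation for first-order (m = 1) diffraction of normally
    # incident light: sin(theta) = wavelength / spacing.
    s = wavelength_nm / spacing_nm
    if abs(s) > 1:
        raise ValueError("evanescent order: no propagating diffracted beam")
    return math.degrees(math.asin(s))

# Two hypothetical spacings d1 and d2 out-coupling green light (532 nm):
print(first_order_angle_deg(532, 1100))  # ~28.9 degrees
print(first_order_angle_deg(532, 800))   # ~41.7 degrees
```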

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/18 - Diffracting gratings
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

40.

BUNDLE ADJUSTMENT USING EPIPOLAR CONSTRAINTS

      
Application Number 18266756
Status Pending
Filing Date 2021-12-03
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Souiai, Mohamed
  • Gupta, Ankur

Abstract

Methods, systems, and apparatus for performing bundle adjustment using epipolar constraints. A method includes receiving image data from a headset for a particular pose. The image data includes a first image from a first camera of the headset and a second image from a second camera of the headset. The method includes identifying at least one key point in a three-dimensional model of an environment at least partly represented in the first image and the second image and performing bundle adjustment. Bundle adjustment is performed by jointly optimizing a reprojection error for the at least one key point and an epipolar error for the at least one key point. Results of the bundle adjustment are used to perform at least one of (i) updating the three-dimensional model, (ii) determining a position of the headset at the particular pose, or (iii) determining extrinsic parameters of the first camera and second camera.
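
The jointly optimized objective can be sketched as a weighted sum of the two residuals. The essential matrix, points, and weight below are illustrative; a real solver would minimize this cost over camera poses and key-point positions, not just evaluate it.

```python
def reprojection_error(projected, observed):
    # Squared pixel distance between a key point's projection and its
    # observed image location.
    return (projected[0] - observed[0]) ** 2 + (projected[1] - observed[1]) ** 2

def epipolar_error(x1, x2, E):
    # Epipolar residual x2^T E x1 for homogeneous image points; it is zero
    # when the two observations are consistent with the camera geometry.
    Ex1 = [sum(E[r][c] * x1[c] for c in range(3)) for r in range(3)]
    return sum(x2[r] * Ex1[r] for r in range(3))

def joint_cost(projected, observed, x1, x2, E, weight=1.0):
    # The jointly optimized objective: reprojection error plus the
    # weighted squared epipolar residual.
    return reprojection_error(projected, observed) + weight * epipolar_error(x1, x2, E) ** 2

# Essential matrix for a pure sideways translation t = (1, 0, 0) with no
# rotation: E = [t]_x, the cross-product matrix of t.
E = [[0.0, 0.0, 0.0],
     [0.0, 0.0, -1.0],
     [0.0, 1.0, 0.0]]
# Under horizontal translation, a correspondence with equal y-coordinates
# satisfies the epipolar constraint exactly.
print(epipolar_error([0.2, 0.1, 1.0], [0.5, 0.1, 1.0], E))  # 0.0
```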

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

41.

EYE CENTER OF ROTATION DETERMINATION WITH ONE OR MORE EYE TRACKING CAMERAS

      
Application Number 18387745
Status Pending
Filing Date 2023-11-07
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Cohen, David
  • Joseph, Elad
  • Ferens, Ron Nisim
  • Preter, Eyal
  • Bar-On, Eitan Shmuel
  • Yahav, Giora

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system, possibly comprising a plurality of cameras, that images the user's eye and glints thereon, and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye using cornea data derived from the glint images. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/292 - Multi-camera tracking
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

42.

SYSTEM FOR PROVIDING ILLUMINATION OF THE EYE

      
Application Number 18497518
Status Pending
Filing Date 2023-10-30
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Meitav, Nizan
  • Yaras, Fahri
  • Jurbergs, David Carl

Abstract

A thin transparent layer can be integrated in a head mounted display device and disposed in front of the eye of a wearer. The thin transparent layer may be configured to output light such that light is directed onto the eye to create reflections therefrom that can be used, for example, for glint based tracking. The thin transparent layer can be configured to reduce obstructions in the field of view of the user.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

43.

PLENOPTIC CAMERA MEASUREMENT AND CALIBRATION OF HEAD-MOUNTED DISPLAYS

      
Application Number 18505437
Status Pending
Filing Date 2023-11-09
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor Schuck, III, Miller Harry

Abstract

A method for measuring performance of a head-mounted display module, the method including arranging the head-mounted display module relative to a plenoptic camera assembly so that an exit pupil of the head-mounted display module coincides with a pupil of the plenoptic camera assembly; emitting light from the head-mounted display module while the head-mounted display module is arranged relative to the plenoptic camera assembly; filtering the light at the exit pupil of the head-mounted display module; acquiring, with the plenoptic camera assembly, one or more light field images projected from the head-mounted display module with the filtered light; and determining information about the performance of the head-mounted display module based on the acquired light field images.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/20 - Filters
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising

44.

BIASED TOTAL THICKNESS VARIATIONS IN WAVEGUIDE DISPLAY SUBSTRATES

      
Application Number 18505762
Status Pending
Filing Date 2023-11-09
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Peroz, Christophe
  • Liu, Victor Kai

Abstract

A plurality of waveguide display substrates, each waveguide display substrate having a cylindrical portion having a diameter and a planar surface, a curved portion opposite the planar surface defining a nonlinear change in thickness across the substrate and having a maximum height D with respect to the cylindrical portion, and a wedge portion between the cylindrical portion and the curved portion defining a linear change in thickness across the substrate and having a maximum height W with respect to the cylindrical portion. A target maximum height Dt of the curved portion is 10⁻⁷ to 10⁻⁶ times the diameter, D is between about 70% and about 130% of Dt, and W is less than about 30% of Dt.
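
The claimed tolerances can be checked numerically. Since the abstract only bounds the target height Dt (10⁻⁷ to 10⁻⁶ of the diameter), the mid-range choice of Dt below is an illustrative assumption.

```python
def within_spec(diameter, d_height, w_height):
    # Check a substrate against the claimed tolerances: D must fall
    # within 70-130% of the target curved-portion height Dt, and the
    # wedge height W must stay under 30% of Dt.
    dt = diameter * 5e-7  # pick Dt mid-range (1e-7 to 1e-6 of diameter)
    return (0.7 * dt <= d_height <= 1.3 * dt) and (w_height < 0.3 * dt)

# A 150 mm wafer: Dt = 75 nm, so D may range 52.5-97.5 nm and W < 22.5 nm.
print(within_spec(150e-3, 75e-9, 10e-9))   # True
print(within_spec(150e-3, 120e-9, 10e-9))  # False: curved portion too tall
```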

IPC Classes

  • G02B 6/13 - Integrated optical circuits characterised by the manufacturing method
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light

45.

FRAME-BY-FRAME RENDERING FOR AUGMENTED OR VIRTUAL REALITY SYSTEMS

      
Application Number 18506947
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-03-07
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Miller, Samuel A.

Abstract

One embodiment is directed to a user display device comprising a housing frame mountable on the head of the user, a lens mountable on the housing frame, and a projection subsystem coupled to the housing frame to determine a location of appearance of a display object in a field of view of the user based at least in part on at least one of a detection of a head movement of the user and a prediction of a head movement of the user, and to project the display object to the user based on the determined location of appearance of the display object.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 1/60 - Memory management
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

46.

DYNAMICALLY ACTUABLE DIFFRACTIVE OPTICAL ELEMENT

      
Application Number 18388104
Status Pending
Filing Date 2023-11-08
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Yeoh, Ivan Li Chuen
  • Edwin, Lionel Ernest

Abstract

A dynamically actuable lens includes a substrate having a surface and a metasurface diffractive optical element (DOE) formed on the surface. The metasurface DOE includes a plurality of raised portions and defines a plurality of recesses between adjacent raised portions. The dynamically actuable lens also includes a movable cover overlying the metasurface DOE and comprising a hydrophilic material, a quantity of a fluid disposed on the movable cover, and a drive mechanism coupled to the movable cover. The drive mechanism is configured to move the movable cover toward the metasurface DOE to displace a portion of the quantity of the fluid into the plurality of recesses, thereby rendering the metasurface DOE in an “off” state, and move the movable cover away from the metasurface DOE, causing the portion of the quantity of the fluid to retract from the plurality of recesses, thereby rendering the metasurface DOE in an “on” state.

IPC Classes  ?

  • G02B 27/42 - Diffraction optics
  • G02B 5/18 - Diffracting gratings
  • G02B 26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • G02B 27/01 - Head-up displays

47.

AUGMENTED REALITY SYSTEM AND METHOD FOR SPECTROSCOPIC ANALYSIS

      
Application Number 18388421
Status Pending
Filing Date 2023-11-09
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Kaehler, Adrian
  • Harrises, Christopher M.
  • Baerenrodt, Eric
  • Baerenrodt, Mark
  • Robaina, Natasja U.
  • Samec, Nicole Elizabeth
  • Powers, Tammy Sherri
  • Yeoh, Ivan Li Chuen
  • Wright, Adam Carl

Abstract

Wearable spectroscopy systems and methods for identifying one or more characteristics of a target object are described. Spectroscopy systems may include a light source configured to emit light in an irradiated field of view and an electromagnetic radiation detector configured to receive reflected light from a target object irradiated by the light source. One or more processors of the systems may identify a characteristic of the target object based on a determined level of light absorption by the target object. Some systems and methods may include one or more corrections for scattered and/or ambient light such as applying an ambient light correction, passing the reflected light through an anti-scatter grid, or using a time-dependent variation in the emitted light.

IPC Classes  ?

  • G01J 3/02 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours - Details
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G01J 3/42 - Absorption spectrometry; Double-beam spectrometry; Flicker spectrometry; Reflection spectrometry
  • G01N 21/25 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
  • G01N 21/27 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G09G 5/37 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns

48.

DISPLAY SYSTEM WITH LOW-LATENCY PUPIL TRACKER

      
Application Number 18500868
Status Pending
Filing Date 2023-11-02
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system aligns the location of its exit pupil with the location of a viewer's pupil by changing the location of the portion of a light source that outputs light. The light source may include an array of pixels that output light, thereby allowing an image to be displayed on the light source. The display system includes a camera that captures images of the eye and negatives of the images are displayed by the light source. In the negative image, the dark pupil of the eye is a bright spot which, when displayed by the light source, defines the exit pupil of the display system. The location of the pupil of the eye may be tracked by capturing the images of the eye, and the location of the exit pupil of the display system may be adjusted by displaying negatives of the captured images using the light source.
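
The negative-image idea can be pictured in a few lines: for an 8-bit grayscale eye image, the displayed frame is the per-pixel complement, so the dark pupil becomes the brightest region. This is a minimal sketch under assumed names and bit depth; the patent does not specify this exact pipeline.

```python
def negate_frame(frame):
    """Per-pixel complement of an 8-bit grayscale frame: dark pupil pixels
    (near 0) become bright (near 255), defining the display's exit pupil."""
    return [[255 - px for px in row] for row in frame]

def brightest_pixel(frame):
    """Locate the peak of a frame, i.e. the tracked pupil in the negative."""
    return max(
        ((r, c) for r, row in enumerate(frame) for c, _ in enumerate(row)),
        key=lambda rc: frame[rc[0]][rc[1]],
    )

eye = [[200, 190, 210],
       [180,  20, 195],   # dark pupil at row 1, column 1
       [205, 185, 200]]
neg = negate_frame(eye)
print(brightest_pixel(neg))  # (1, 1): the pupil location drives the exit pupil
```

Because the negation is purely per-pixel, no explicit pupil-detection step is needed between camera capture and display, which is what keeps the tracker low-latency.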

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays

49.

EYEPIECE IMAGING ASSEMBLIES FOR A HEAD MOUNTED DISPLAY

      
Application Number 18259044
Status Pending
Filing Date 2021-12-22
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Jia, Zhiheng
  • Cohen, David
  • Edwin, Lionel Ernest
  • Schabacker, Charles Robert

Abstract

A head mounted display can include a frame, an eyepiece, an image injection device, a sensor array, a reflector, and an off-axis optical element. The frame can be configured to be supported on the head of the user. The eyepiece can be coupled to the frame and configured to be disposed in front of an eye of the user. The eyepiece can include a plurality of layers. The image injection device can be configured to provide image content to the eyepiece for viewing by the user. The sensor array can be integrated in or on the eyepiece. The reflector can be disposed in or on the eyepiece and configured to reflect light received from an object for imaging by the sensor array. The off-axis optical element can be disposed in or on the eyepiece. The off-axis optical element can be configured to receive light reflected from the reflector and direct at least a portion of the light toward the sensor array.

IPC Classes  ?

  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • G02B 5/28 - Interference filters
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays

50.

RENDERING LOCATION SPECIFIC VIRTUAL CONTENT IN ANY LOCATION

      
Application Number 18460873
Status Pending
Filing Date 2023-09-05
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Brodsky, Jonathan
  • Busto, Javier Antonio
  • Smith, Martin Wilkins

Abstract

Augmented reality systems and methods for creating, saving and rendering designs comprising multiple items of virtual content in a three-dimensional (3D) environment of a user. The designs may be saved as a scene, which is built by a user from pre-built sub-components, built components, and/or previously saved scenes. Location information, expressed as a saved scene anchor and position relative to the saved scene anchor for each item of virtual content, may also be saved. Upon opening the scene, the saved scene anchor node may be correlated to a location within the mixed reality environment of the user for whom the scene is opened. The virtual items of the scene may be positioned with the same relationship to that location as they have to the saved scene anchor node. That location may be selected automatically and/or by user input.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

51.

Set of accessory boxes

      
Application Number 29716337
Grant Number D1015871
Status In Force
Filing Date 2019-12-09
First Publication Date 2024-02-27
Grant Date 2024-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Hoit, Sarah
  • Palmer, James William
  • Gamez Castillejos, Daniel Marcelo
  • Palmer, Christopher G.

52.

Mobile device accessory with cameras

      
Application Number 29717206
Grant Number D1016119
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-02-27
Grant Date 2024-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

53.

CALIBRATION FOR VIRTUAL OR AUGMENTED REALITY SYSTEMS

      
Application Number 18266937
Status Pending
Filing Date 2021-12-21
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Gupta, Ankur
  • Souiai, Mohamed

Abstract

Techniques for addressing deformations in a virtual or augmented reality headset are described. In some implementations, cameras in a headset can obtain image data at different times as the headset moves through a series of poses of the headset. One or more miscalibration conditions for the headset that have occurred as the headset moved through the series of poses can be detected. The series of poses can be divided into groups of poses based on the one or more miscalibration conditions, and bundle adjustment for the groups of poses can be performed using a separate set of camera calibration data. The bundle adjustment for the poses in each group is performed using a same set of calibration data for the group. The camera calibration data for each group is estimated jointly with bundle adjustment estimation for the poses in the group.
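
The grouping step can be illustrated as splitting the pose sequence at the indices where a miscalibration condition was detected, so each group gets its own calibration estimate. This is a hypothetical sketch; the function names and the event representation are assumptions.

```python
def group_poses(poses, event_indices):
    """Split a pose sequence into groups at each detected miscalibration
    event; bundle adjustment then uses one calibration set per group."""
    groups, start = [], 0
    for idx in sorted(event_indices):
        groups.append(poses[start:idx])
        start = idx
    groups.append(poses[start:])
    return [g for g in groups if g]   # drop empty groups at the boundaries

poses = ["p0", "p1", "p2", "p3", "p4", "p5"]
# Miscalibration detected entering poses 2 and 4 (e.g. a headset drop)
print(group_poses(poses, [2, 4]))  # [['p0', 'p1'], ['p2', 'p3'], ['p4', 'p5']]
```

Within each group the calibration is assumed constant, which is what allows it to be estimated jointly with that group's bundle adjustment.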

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

54.

CROSS REALITY SYSTEM WITH ACCURATE SHARED MAPS

      
Application Number 18457314
Status Pending
Filing Date 2023-08-28
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Velasquez, Miguel Andres Granados
  • Gomez Gonzalez, Javier Victorio
  • Prasad, Mukta
  • Guendelman, Eran
  • Shahrokni, Ali
  • Swaminathan, Ashwin

Abstract

A cross reality system enables any of multiple devices to efficiently and accurately access previously persisted maps of very large scale environments and render virtual content specified in relation to those maps. The cross reality system may build a persisted map, which may be in canonical form, by merging tracking maps from the multiple devices. A map merge process determines mergibility of a tracking map with a canonical map and merges a tracking map with a canonical map in accordance with mergibility criteria, such as, when a gravity direction of the tracking map aligns with a gravity direction of the canonical map. Refraining from merging maps if the orientation of the tracking map with respect to gravity is not preserved avoids distortions in persisted maps and enables the multiple devices, which may use the maps to determine their locations, to present more realistic and immersive experiences for their users.
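
One way to picture the gravity-based mergibility criterion is to compare the two maps' gravity directions and accept the merge only when they agree within a small angular tolerance. This is a minimal sketch; the tolerance value and the function names are assumptions, not from the patent.

```python
import math

def gravity_aligned(g_tracking, g_canonical, tol_deg=2.0):
    """Return True when the tracking map's gravity direction is within
    tol_deg degrees of the canonical map's, a precondition for merging."""
    dot = sum(a * b for a, b in zip(g_tracking, g_canonical))
    n1 = math.sqrt(sum(a * a for a in g_tracking))
    n2 = math.sqrt(sum(a * a for a in g_canonical))
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for acos
    return math.degrees(math.acos(cos_angle)) <= tol_deg

print(gravity_aligned((0.0, -1.0, 0.0), (0.0, -1.0, 0.0)))   # True: identical
print(gravity_aligned((0.0, -1.0, 0.0), (0.2, -0.98, 0.0)))  # False: ~11° tilt
```

Clamping the cosine before `acos` guards against floating-point values marginally outside [-1, 1] when the two directions are nearly identical.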

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

55.

TUNABLE ATTENUATION OF LIGHT TRANSMISSION ARTIFACTS IN WEARABLE DISPLAYS

      
Application Number 18497659
Status Pending
Filing Date 2023-10-30
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Manly, David
  • Mathur, Vaibhav
  • Haddock, Joshua Naaman
  • Messer, Kevin
  • Carlisle, Clinton

Abstract

A method for displaying an image using a wearable display system including directing display light from a display towards a user through an eyepiece to project images in the user's field of view, determining a relative location between an ambient light source and the eyepiece, and adjusting an attenuation of ambient light from the ambient light source through the eyepiece depending on the relative location between the ambient light source and the eyepiece.

IPC Classes  ?

  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
  • G02F 1/13363 - Birefringent elements, e.g. for optical compensation
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • G02F 1/1347 - Arrangement of liquid crystal layers or cells in which the final condition of one light beam is achieved by the addition of the effects of two or more layers or cells
  • G02B 27/01 - Head-up displays
  • G02B 25/00 - Eyepieces; Magnifying glasses

56.

SINGLE PUPIL RGB LIGHT SOURCE

      
Application Number 18260708
Status Pending
Filing Date 2022-01-07
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Hall, Heidi Leising
  • Trisnadi, Jahja L.

Abstract

Embodiments of this disclosure include systems and methods for displays. In embodiments, a display system includes a light source configured to emit a first light, a lens configured to receive the first light, and an image generator configured to receive the first light and emit a second light. The display system further includes a plurality of waveguides, where at least two of the plurality of waveguides include an in-coupling grating configured to selectively couple the second light. In some embodiments, the light source can comprise a single pupil light source having a reflector and a micro-LED array disposed in the reflector.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

57.

LIDAR SIMULTANEOUS LOCALIZATION AND MAPPING

      
Application Number 18264572
Status Pending
Filing Date 2022-02-11
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhou, Lipu
  • Swaminathan, Ashwin
  • Agarwal, Lomesh

Abstract

Disclosed herein are systems and methods for mapping environment information. In some embodiments, the systems and methods are configured for mapping information in a mixed reality environment. In some embodiments, the system is configured to perform a method including scanning an environment including capturing, with a sensor, a plurality of points of the environment; tracking a plane of the environment; updating observations associated with the environment by inserting a keyframe into the observations; determining whether the plane is coplanar with a second plane of the environment; in accordance with a determination that the plane is coplanar with the second plane, performing planar bundle adjustment on the observations associated with the environment; and in accordance with a determination that the plane is not coplanar with the second plane, performing planar bundle adjustment on a portion of the observations associated with the environment.
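
The coplanarity test that gates full versus partial planar bundle adjustment can be sketched as checking that two planes, each given by a unit normal n and offset d in n·x = d, have (anti)parallel normals and matching offsets. This is a hypothetical sketch; the tolerances and the plane parameterization are assumptions.

```python
def coplanar(plane_a, plane_b, cos_tol=0.999, dist_tol=0.01):
    """plane = (nx, ny, nz, d) with unit normal n and offset d in n.x = d.
    Planes are coplanar when normals are (anti)parallel and offsets match."""
    (ax, ay, az, ad), (bx, by, bz, bd) = plane_a, plane_b
    dot = ax * bx + ay * by + az * bz
    if dot < 0:                      # flip one plane so the normals agree
        bx, by, bz, bd, dot = -bx, -by, -bz, -bd, -dot
    return dot >= cos_tol and abs(ad - bd) <= dist_tol

floor_a = (0.0, 0.0, 1.0, 0.0)
floor_b = (0.0, 0.0, -1.0, 0.0)     # same floor, opposite orientation
wall = (1.0, 0.0, 0.0, 2.0)
print(coplanar(floor_a, floor_b))   # True
print(coplanar(floor_a, wall))      # False
```

The sign flip matters because the same physical surface can be observed from either side, yielding opposite normals for one plane.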

IPC Classes  ?

  • G06T 7/579 - Depth or shape recovery from multiple images from motion
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

58.

SYSTEMS AND METHODS FOR SIGN LANGUAGE RECOGNITION

      
Application Number 18357531
Status Pending
Filing Date 2023-07-24
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Browy, Eric
  • Woods, Michael Janusz
  • Rabinovich, Andrew

Abstract

A sensory eyewear system for a mixed reality device can facilitate a user's interactions with other people or with the environment. As one example, the sensory eyewear system can recognize and interpret a sign language, and present the translated information to a user of the mixed reality device. The wearable system can also recognize text in the user's environment, modify the text (e.g., by changing the content or display characteristics of the text), and render the modified text to occlude the original text.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 40/58 - Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

59.

METHODS FOR REFINING RGBD CAMERA POSES

      
Application Number 18384627
Status Pending
Filing Date 2023-10-27
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor Wei, Xiaolin

Abstract

A method for refining poses includes receiving a plurality of poses and computing a relative pose set by determining a first set of relative poses between image frame pairs for a first subset of the image frame pairs having a temporal separation between image frames of the image frame pairs less than a threshold, and determining a second set of relative poses between image frame pairs for a second subset of the image frame pairs having a temporal separation between image frames of the image frame pairs greater than the threshold.
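
The thresholded partition of image-frame pairs described above can be sketched as follows; the names and the timestamp representation are assumptions for illustration, not from the patent.

```python
def partition_pairs(timestamps, pairs, threshold):
    """Split (i, j) frame-index pairs into those whose temporal separation
    is below the threshold and those at or above it."""
    near, far = [], []
    for i, j in pairs:
        sep = abs(timestamps[j] - timestamps[i])
        (near if sep < threshold else far).append((i, j))
    return near, far

ts = [0.0, 0.1, 0.2, 5.0]               # capture times in seconds
pairs = [(0, 1), (1, 2), (0, 3)]
print(partition_pairs(ts, pairs, 1.0))  # ([(0, 1), (1, 2)], [(0, 3)])
```

Treating temporally near and far pairs separately reflects that short-baseline pairs tend to have small relative motion, while long-separation pairs constrain drift.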

IPC Classes  ?

  • H04N 23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

60.

Head mounted audio-visual display system connection cable

      
Application Number 29709698
Grant Number D1015305
Status In Force
Filing Date 2019-10-16
First Publication Date 2024-02-20
Grant Date 2024-02-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Lundmark, David Charles
  • Sommers, Jeffrey Scott

61.

PROCESSING SECURE CONTENT ON A VIRTUAL REALITY SYSTEM

      
Application Number 18380099
Status Pending
Filing Date 2023-10-13
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Taylor, Robert Blake
  • Pastouchenko, Dmitry
  • Plourde, Frederic

Abstract

Described herein are techniques and technologies to identify encrypted content within a field of view of a user of a VR/AR system and process the encrypted content appropriately. The user of the VR/AR technology may have protected content in a field of view of the user. Encrypted content is mapped to one or more protected surfaces on a display device. Content mapped to a protected surface may be rendered on the display device but prevented from being replicated from the display device.

IPC Classes  ?

  • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
  • G06F 3/14 - Digital output to display device
  • G06F 21/60 - Protecting data
  • G06F 21/84 - Protecting input, output or interconnection devices output devices, e.g. displays or monitors

62.

IMAGING MODIFICATION, DISPLAY AND VISUALIZATION USING AUGMENTED AND VIRTUAL REALITY EYEWEAR

      
Application Number 18475688
Status Pending
Filing Date 2023-09-27
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Harrises, Christopher M.
  • Abovitz, Rony
  • Baerenrodt, Mark
  • Schmidt, Brian Lloyd

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display augmented reality image content to the user. The display system can include one or more user sensors configured to sense the user and can include one or more environmental sensors configured to sense surroundings of the user. The display system can also include processing electronics in communication with the display, the one or more user sensors, and the one or more environmental sensors. The processing electronics can be configured to sense a situation involving user focus, determine user intent for the situation, and alter user perception of a real or virtual object within the vision field of the user based at least in part on the user intent and/or sensed situation involving user focus. The processing electronics can be configured to at least one of enhance or de-emphasize the user perception of the real or virtual object within the vision field of the user.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

63.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18493633
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor Browy, Eric C.

Abstract

Disclosed herein are systems and methods for distributed computing and/or networking for mixed reality systems. A method may include capturing an image via a camera of a head-wearable device. Inertial data may be captured via an inertial measurement unit of the head-wearable device. A position of the head-wearable device can be estimated based on the image and the inertial data via one or more processors of the head-wearable device. The image can be transmitted to a remote server. A neural network can be trained based on the image via the remote server. A trained neural network can be transmitted to the head-wearable device.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/14 - Digital output to display device
  • H04B 7/155 - Ground-based stations
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/18 - Eye characteristics, e.g. of the iris

64.

PAIRING WITH COMPANION DEVICE

      
Application Number 18493113
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Nitin
  • Kaehler, Adrian

Abstract

This disclosure describes techniques for device authentication and/or pairing. A display system can comprise a head mountable display, computer memory, and processor(s). In response to receiving a request to authenticate a connection between the display system and a companion device (e.g., controller or other computer device), first data may be determined, the first data based at least partly on audio data spoken by a user. The first data may be sent to an authentication device configured to compare the first data to second data received from the companion device, the second data based at least partly on the audio data. Based at least partly on a correspondence between the first and second data, the authentication device can send a confirmation to the display system to permit communication between the display system and companion device.
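
The comparison step can be pictured as both devices deriving a digest from the same spoken audio and the authentication device confirming only when the digests correspond. This is a hypothetical sketch using SHA-256; the patent does not specify a hash function or this exact protocol.

```python
import hashlib

def derive_data(audio_bytes):
    """Digest of audio-derived bytes; each device computes this locally
    from the utterance it captured."""
    return hashlib.sha256(audio_bytes).hexdigest()

def authenticate(first_data, second_data):
    """Authentication device: permit pairing only on exact correspondence
    between the display system's and the companion device's data."""
    return first_data == second_data

spoken = b"pair me with my controller"
headset_data = derive_data(spoken)        # from the display system
companion_data = derive_data(spoken)      # from the companion device
print(authenticate(headset_data, companion_data))                    # True
print(authenticate(headset_data, derive_data(b"other utterance")))   # False
```

In a production pairing protocol a constant-time comparison such as `hmac.compare_digest` would be preferable to `==` to avoid timing side channels.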

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G02B 27/01 - Head-up displays
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • H04W 8/00 - Network data management
  • H04M 1/60 - Substation equipment, e.g. for use by subscribers including speech amplifiers
  • G06F 21/44 - Program or device authentication
  • H04L 9/40 - Network security protocols
  • H04W 12/50 - Secure pairing of devices
  • H04W 12/065 - Continuous authentication
  • H04W 12/069 - Authentication using certificates or pre-shared keys
  • H04W 12/77 - Graphical identity
  • G06V 20/80 - Recognising image objects characterised by unique random patterns

65.

SYSTEMS AND METHODS FOR CROSS-APPLICATION AUTHORING, TRANSFER, AND EVALUATION OF RIGGING CONTROL SYSTEMS FOR VIRTUAL CHARACTERS

      
Application Number 18493439
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Wedig, Geoffrey
  • Bancroft, James Jonathan

Abstract

Various examples of cross-application systems and methods for authoring, transferring, and evaluating rigging control systems for virtual characters are disclosed. Embodiments of a method include the steps or processes of creating, in a first application which implements a first rigging control protocol, a rigging control system description; writing the rigging control system description to a data file; and initiating transfer of the data file to a second application. In such embodiments, the rigging control system description may be defined according to a different second rigging control protocol. The rigging control system description may specify a rigging control input, such as a lower-order rigging element (e.g., a core skeleton for a virtual character), and at least one rule for operating on the rigging control input to produce a rigging control output, such as a higher-order skeleton or other higher-order rigging element.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

66.

CROSS REALITY SYSTEM WITH LOCALIZATION SERVICE AND SHARED LOCATION-BASED CONTENT

      
Application Number 18496407
Status Pending
Filing Date 2023-10-27
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Caswell, Timothy Dean
  • Piascik, Konrad
  • Zolotarev, Leonid
  • Rushton, Mark Ashley

Abstract

A cross reality system enables any of multiple devices to efficiently render shared location-based content. The cross reality system may include a cloud-based service that responds to requests from devices to localize with respect to a stored map. The service may return to the device information that localizes the device with respect to the stored map. In conjunction with localization information, the service may provide information about locations in the physical world proximate the device for which virtual content has been provided. Based on information received from the service, the device may render, or stop rendering, virtual content to each of multiple users based on the user's location and specified locations for the virtual content.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 15/20 - Perspective computation

67.

EYEPIECE FOR HEAD-MOUNTED DISPLAY AND METHOD FOR MAKING THE SAME

      
Application Number 18238635
Status Pending
Filing Date 2023-08-28
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Chang, Chieh
  • Peroz, Christophe
  • Ong, Ryan Jason
  • Li, Ling
  • Bhagat, Sharad D.
  • Bhargava, Samarth

Abstract

A method includes providing a wafer including a first surface grating extending over a first area of a surface of the wafer and a second surface grating extending over a second area of the surface of the wafer; de-functionalizing a portion of the surface grating in at least one of the first surface grating area and the second surface grating area; and singulating an eyepiece from the wafer, the eyepiece including a portion of the first surface grating area and a portion of the second surface grating area. The first surface grating in the eyepiece corresponds to an input coupling grating for a head-mounted display and the second surface grating corresponds to a pupil expander grating for the head-mounted display.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 6/02 - Optical fibres with cladding

68.

CONCURRENT CAMERA CALIBRATION AND BUNDLE ADJUSTMENT

      
Application Number 18245816
Status Pending
Filing Date 2021-08-17
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Souiai, Mohamed
  • Gupta, Ankur
  • Napolskikh, Igor

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for camera calibration during bundle adjustment. One of the methods includes maintaining a three-dimensional model of an environment and a plurality of image data clusters that each include data generated from images captured by two or more cameras included in a device. The method includes jointly determining, for a three-dimensional point represented by an image data cluster, (i) the newly estimated coordinates for the three-dimensional point for an update to the three-dimensional model or a trajectory of the device, and (ii) the newly estimated calibration data that represents the spatial relationship between the two or more cameras.

IPC Classes  ?

  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

69.

IMPRINT LITHOGRAPHY USING MULTI-LAYER COATING ARCHITECTURE

      
Application Number 18555502
Status Pending
Filing Date 2022-04-21
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Menezes, Marlon Edward
  • Singh, Vikramjit
  • Xu, Frank Y.

Abstract

Structures for forming an optical feature and methods for forming the optical feature are disclosed. In some embodiments, the structure comprises a patterned layer comprising a pattern corresponding to the optical feature; a base layer; and an intermediate layer bonded to the patterned layer and the base layer.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms

70.

SYSTEM AND METHOD FOR PRESENTING IMAGE CONTENT ON MULTIPLE DEPTH PLANES BY PROVIDING MULTIPLE INTRA-PUPIL PARALLAX VIEWS

      
Application Number 18490169
Status Pending
Filing Date 2023-10-19
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Konrad, Robert
  • Wetzstein, Gordon
  • Schowengerdt, Brian T.
  • Vaughn, Michal Beau Dennison

Abstract

An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity. Advantageously, the wavefront divergence, and the accommodation cue provided to the eye of the user, may be varied by appropriate selection of parallax disparity, which may be set by selecting the amount of spatial separation between the locations of light output.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/339 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spatial multiplexing
  • H04N 13/398 - Synchronisation thereof; Control thereof

71.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18490518
Status Pending
Filing Date 2023-10-19
First Publication Date 2024-02-08
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Lundmark, David Charles
  • Broadmore, Gregory Michael

Abstract

An apparatus configured to be head-worn by a user, includes: a transparent screen configured to allow the user to see therethrough; a sensor system configured to sense a characteristic of a physical object in an environment in which the user is located; and a processing unit coupled to the sensor system, the processing unit configured to: cause the screen to display a user-controllable object, and cause the screen to display an image of a feature that results from a virtual interaction between the user-controllable object and the physical object, so that the feature will appear to be a part of the physical object in the environment or appear to be emanating from the physical object.

IPC Classes  ?

  • G09G 5/377 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position
  • G02B 27/01 - Head-up displays

72.

COMPENSATION FOR DEFORMATION IN HEAD MOUNTED DISPLAY SYSTEMS

      
Application Number 18454912
Status Pending
Filing Date 2023-08-24
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Miller, Samuel A.
  • Grossmann, Etienne Gregoire
  • Clark, Brian Christopher
  • Johnson, Michael Robert
  • Zhao, Wenyi
  • Shah, Nukul Sanjay
  • Huang, Po-Kang

Abstract

The systems and methods described can include approaches to calibrate head-mounted displays for improved viewing experiences. Some methods include receiving data of a first target image associated with an undeformed state of a first eyepiece of a head-mounted display device; receiving data of a first captured image associated with a deformed state of the first eyepiece of the head-mounted display device; determining a first transformation that maps the first captured image to the first target image; and applying the first transformation to a subsequent image for viewing on the first eyepiece of the head-mounted display device.

IPC Classes  ?

  • G06T 5/00 - Image enhancement or restoration
  • G02B 27/01 - Head-up displays
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

73.

EFFICIENT RENDERING OF VIRTUAL SOUNDFIELDS

      
Application Number 18486938
Status Pending
Filing Date 2023-10-13
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Schmidt, Brian Lloyd
  • Dicker, Samuel Charles

Abstract

An audio system and method of spatially rendering audio signals that uses modified virtual speaker panning is disclosed. The audio system may include a fixed number F of virtual speakers, and the modified virtual speaker panning may dynamically select and use a subset P of the fixed virtual speakers. The subset P of virtual speakers may be selected using a low energy speaker detection and culling method, a source geometry-based culling method, or both. One or more processing blocks in the decoder/virtualizer may be bypassed based on the energy level of the associated audio signal or the location of the sound source relative to the user/listener, respectively. In some embodiments, a virtual speaker that is designated as an active virtual speaker at a first time, may also be designated as an active virtual speaker at a second time to ensure the processing completes.
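
The low-energy culling step can be sketched as a per-frame selection over the F virtual speakers. The following is a hedged Python illustration, not the patented decoder: the threshold value and the one-frame hold for previously active speakers are assumed choices.

```python
import numpy as np

def select_active_speakers(gains, energies, threshold_db=-60.0,
                           prev_active=None):
    # From F virtual speakers, keep the subset P whose panned signal
    # power exceeds a threshold relative to the loudest speaker.
    # Speakers active on the previous frame stay active one more
    # frame so their processing can complete.
    power = gains ** 2 * energies
    ref = power.max() if power.max() > 0 else 1.0
    level_db = 10.0 * np.log10(np.maximum(power / ref, 1e-12))
    active = level_db > threshold_db
    if prev_active is not None:
        active = active | prev_active
    return active
```

Processing blocks (e.g., per-speaker virtualization filters) would then be run, or bypassed, according to the returned mask.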

IPC Classes  ?

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G10L 19/008 - Multichannel audio signal coding or decoding using interchannel correlation to reduce redundancy, e.g. joint-stereo, intensity-coding or matrixing
  • G10L 25/21 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the type of extracted parameters the extracted parameters being power information
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

74.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18487794
Status Pending
Filing Date 2023-10-16
First Publication Date 2024-02-08
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez

Abstract

A virtual, augmented, or mixed reality display system includes a display configured to display virtual, augmented, or mixed reality image data, the display including one or more optical components which introduce optical distortions or aberrations to the image data. The system also includes a display controller configured to provide the image data to the display. The display controller includes memory for storing optical distortion correction information, and one or more processing elements to at least partially correct the image data for the optical distortions or aberrations using the optical distortion correction information.

IPC Classes  ?

  • G06T 5/00 - Image enhancement or restoration
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06F 3/14 - Digital output to display device
  • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode

75.

EYE IMAGING WITH AN OFF-AXIS IMAGER

      
Application Number 18354575
Status Pending
Filing Date 2023-07-18
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Kaehler, Adrian

Abstract

Examples of an imaging system for use with a head mounted display (HMD) are disclosed. The imaging system can include a forward-facing imaging camera and a surface of a display of the HMD can include an off-axis diffractive optical element (DOE) or hot mirror configured to reflect light to the imaging camera. The DOE or hot mirror can be segmented. The imaging system can be used for eye tracking, biometric identification, multiscopic reconstruction of the three-dimensional shape of the eye, etc.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for determining or recording eye movement
  • A61B 3/14 - Arrangements specially adapted for eye photography
  • A61B 3/12 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
  • A61B 5/00 - Measuring for diagnostic purposes ; Identification of persons
  • A61B 5/16 - Devices for psychotechnics; Testing reaction times

76.

DISPLAY SYSTEM AND METHOD FOR PROVIDING VARIABLE ACCOMMODATION CUES USING MULTIPLE INTRA-PUPIL PARALLAX VIEWS FORMED BY LIGHT EMITTER ARRAYS

      
Application Number 18482893
Status Pending
Filing Date 2023-10-08
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity may be selected using an array of shutters that selectively regulate the entry of image light into an eye. Each opened shutter in the array provides a different intra-pupil image, and the locations of the open shutters provide the desired amount of parallax disparity between the images. In some other embodiments, the images may be formed by an emissive micro-display. Each pixel formed by the micro-display may be formed by one of a group of light emitters, which are at different locations such that the emitted light takes different paths to the eye, the different paths providing different amounts of parallax disparity.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 27/30 - Collimators
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

77.

METHOD AND SYSTEM FOR PATTERNING A LIQUID CRYSTAL LAYER

      
Application Number 18482896
Status Pending
Filing Date 2023-10-08
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Komanduri, Ravi Kumar
  • Oh, Chulwoo

Abstract

An optical master is created by using a nanoimprint alignment layer to pattern a liquid crystal layer. The nanoimprint alignment layer and the liquid crystal layer constitute the optical master. The optical master is positioned above a photo-alignment layer. The optical master is illuminated and light propagating through the nanoimprinted alignment layer and the liquid crystal layer is diffracted and subsequently strikes the photo-alignment layer. The incident diffracted light causes the pattern in the liquid crystal layer to be transferred to the photo-alignment layer. A second liquid crystal layer is deposited onto the patterned photo-alignment layer, which subsequently is used to align the molecules of the second liquid crystal layer. The second liquid crystal layer in the patterned photo-alignment layer may be utilized as a replica optical master or as a diffractive optical element for directing light in optical devices such as augmented reality display devices.

IPC Classes  ?

  • G03H 1/02 - HOLOGRAPHIC PROCESSES OR APPARATUS - Details peculiar thereto - Details

78.

TILTING ARRAY BASED DISPLAY

      
Application Number 18483981
Status Pending
Filing Date 2023-10-10
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Sissom, Bradley Jay
  • Curtis, Kevin Richard
  • Cheng, Hui-Chuan
  • Schuck, Iii, Miller Harry
  • Bhargava, Samarth

Abstract

This disclosure describes a wearable display system configured to project light to the eye(s) of a user to display virtual (e.g., augmented reality) image content in a vision field of the user. The system can include light source(s) that output light, spatial light modulator(s) that modulate the light to provide the virtual image content, and an eyepiece configured to convey the modulated light toward the eye(s) of the user. The eyepiece can include waveguide(s) and a plurality of in-coupling optical elements arranged on or in the waveguide(s) to in-couple the modulated light received from the spatial light modulator(s) into the waveguide(s) to be guided toward the user's eye(s). The spatial light modulator(s) may be movable, and/or may include movable components, to direct different portions of the modulated light toward different ones of the in-coupling optical elements at different times.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 25/00 - Eyepieces; Magnifying glasses

79.

SYSTEMS, METHODS, AND DEVICES FOR ADHESION OF INTERIOR WAVEGUIDE PILLARS

      
Application Number 18257516
Status Pending
Filing Date 2021-12-21
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Li, Ling
  • Peroz, Christophe
  • Chang, Chieh
  • Bhagat, Sharad D.
  • Ong, Ryan Jason
  • Karbasi, Ali
  • Rugg, Stephen Richard
  • Melli, Mauro
  • Messer, Kevin
  • Hill, Brian George
  • West, Melanie Maputol

Abstract

In some embodiments, a near-eye display system comprises a stack of waveguides having pillars in a central, active portion of the waveguides. The active portion may include light outcoupling optical elements configured to outcouple image light from the waveguides towards the eye of a viewer. The pillars extend between and separate neighboring ones of the waveguides. The light outcoupling optical elements may include diffractive optical elements that are formed simultaneously with the pillars, for example, by imprinting or casting. The pillars are disposed on one or more major surfaces of each of the waveguides. The pillars may define a distance between two adjacent waveguides of the stack of waveguides. The pillars may be bonded to adjacent waveguides using one or more of the systems, methods, or devices described herein. The bonding provides a high level of thermal stability to the waveguide stack, to resist deformation as temperatures change.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

80.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING VERTICAL ALIGNMENT BETWEEN LEFT AND RIGHT DISPLAYS AND A USER'S EYES

      
Application Number 18486848
Status Pending
Filing Date 2023-10-13
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor Vlaskamp, Bjorn Nicolaas Servatius

Abstract

A wearable device may include a head-mounted display (HMD) for rendering a three-dimensional (3D) virtual object which appears to be located in an ambient environment of a user of the display. The relative positions of the HMD and one or more eyes of the user may not be in desired positions to receive image information outputted by the HMD. For example, the HMD-to-eye vertical alignment may be different between the left and right eyes. The wearable device may determine if the HMD is level on the user's head and may then provide the user with a left-eye alignment marker and a right-eye alignment marker. Based on user feedback, the wearable device may determine if there is any left-right vertical misalignment and may take actions to reduce or minimize the effects of any misalignment.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position

81.

Furniture accessory with cameras

      
Application Number 29717260
Grant Number D1013007
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-01-30
Grant Date 2024-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

82.

VIRTUAL OBJECT MOVEMENT SPEED CURVE FOR VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS

      
Application Number 18340778
Status Pending
Filing Date 2023-06-23
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Xu, Yan
  • Fushiki, Ikko
  • Shanbhag, Suraj Manjunath
  • Das, Shiuli
  • Lee, Jung-Suk

Abstract

Systems and methods for regulating the speed of movement of virtual objects presented by a wearable system are described. The wearable system may present three-dimensional (3D) virtual content that moves, e.g., laterally across the user's field of view and/or in perceived depth from the user. The speed of the movement may follow the profile of an S-curve, with a gradual increase to a maximum speed, and a subsequent gradual decrease in speed until an end point of the movement is reached. The decrease in speed may be more gradual than the increase in speed. This speed curve may be utilized in the movement of virtual objects for eye-tracking calibration. The wearable system may track the position of a virtual object (an eye-tracking target) which moves with a speed following the S-curve. This speed curve allows for rapid movement of the eye-tracking target, while providing a comfortable viewing experience and high accuracy in determining the initial and final positions of the eye as it tracks the target.
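
An S-shaped speed profile whose ramp-down is more gradual than its ramp-up can be sketched with a smoothstep easing function. The parameter names and window lengths below are illustrative assumptions, not values from the patent:

```python
def s_curve_speed(t, v_max=1.0, t_accel=0.2, t_decel=0.4):
    # Speed over normalized time t in [0, 1]: ramp up to v_max over
    # t_accel, hold, then ramp down over t_decel. Making t_decel
    # longer than t_accel makes the slowdown more gradual than the
    # speedup, as the abstract describes.
    if t < t_accel:
        x = t / t_accel
    elif t > 1.0 - t_decel:
        x = (1.0 - t) / t_decel
    else:
        return v_max
    # Smoothstep easing gives the S-shaped transitions.
    return v_max * (3.0 * x * x - 2.0 * x ** 3)
```

Integrating this speed over time yields the target's position along its path at each rendered frame.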

IPC Classes  ?

  • G02C 7/06 - Lenses; Lens systems multifocal
  • G02C 7/02 - Lenses; Lens systems
  • G02B 27/01 - Head-up displays
  • G02B 27/36 - Fiducial marks or measuring scales within the optical system adjustable

83.

CROSS REALITY SYSTEM WITH SIMPLIFIED PROGRAMMING OF VIRTUAL CONTENT

      
Application Number 18353775
Status Pending
Filing Date 2023-07-17
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhang, Haiyan
  • Macdonald, Robert John Cummings

Abstract

A cross reality system that renders virtual content generated by executing native mode applications may be configured to render web-based content using components that render content from native applications. The system may include a Prism manager that provides Prisms in which content from executing native applications is rendered. For rendering web based content, a browser, accessing the web based content, may be associated with a Prism and may render content into its associated Prism, creating the same immersive experience for the user as when content is generated by a native application. The user may access the web application from the same program launcher menu as native applications. The system may have tools that enable a user to access these capabilities, including by creating for a web location an installable entity that, when processed by the system, results in an icon for the web content in a program launcher menu.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 9/54 - Interprogram communication
  • G06F 16/955 - Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
  • G06F 16/954 - Navigation, e.g. using categorised browsing
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons

84.

NEURAL NETWORK FOR EYE IMAGE SEGMENTATION AND IMAGE QUALITY ESTIMATION

      
Application Number 18455093
Status Pending
Filing Date 2023-08-24
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Spizhevoy, Alexey
  • Kaehler, Adrian
  • Badrinarayanan, Vijay

Abstract

Systems and methods for eye image segmentation and image quality estimation are disclosed. In one aspect, after receiving an eye image, a device such as an augmented reality device can process the eye image using a convolutional neural network with a merged architecture to generate both a segmented eye image and a quality estimation of the eye image. The segmented eye image can include a background region, a sclera region, an iris region, or a pupil region. In another aspect, a convolutional neural network with a merged architecture can be trained for eye image segmentation and image quality estimation. In yet another aspect, the device can use the segmented eye image to determine eye contours such as a pupil contour and an iris contour. The device can use the eye contours to create a polar image of the iris region for computing an iris code or biometric authentication.

IPC Classes  ?

  • G06T 7/12 - Edge-based segmentation
  • G06T 7/11 - Region-based segmentation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06V 10/56 - Extraction of image or video features relating to colour
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • G06F 18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06T 7/10 - Segmentation; Edge detection
  • G06T 7/00 - Image analysis

85.

EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18213124
Status Pending
Filing Date 2023-06-22
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Messer, Kevin

Abstract

An eyepiece waveguide for an augmented reality display system. The eyepiece waveguide can include an input coupling grating (ICG) region. The ICG region can couple an input beam into the substrate of the eyepiece waveguide as a guided beam. A first combined pupil expander-extractor (CPE) grating region can be formed on or in a surface of the substrate. The first CPE grating region can receive the guided beam, create a first plurality of diffracted beams at a plurality of distributed locations, and out-couple a first plurality of output beams. The eyepiece waveguide can also include a second CPE grating region formed on or in the opposite surface of the substrate. The second CPE grating region can receive the guided beam, create a second plurality of diffracted beams at a plurality of distributed locations, and out-couple a second plurality of output beams.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising

86.

EFFICIENT LOCALIZATION BASED ON MULTIPLE FEATURE TYPES

      
Application Number 18353851
Status Pending
Filing Date 2023-07-17
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhou, Lipu
  • Swaminathan, Ashwin
  • Steinbruecker, Frank Thomas
  • Koppel, Daniel Esteban

Abstract

A method of efficiently and accurately computing a pose of an image with respect to other image information. The image may be acquired with a camera on a portable device and the other information may be a map, such that the computation of pose localizes the device relative to the map. Such a technique may be applied in a cross reality system to enable devices to efficiently and accurately access previously persisted maps. Localizing with respect to a map may enable multiple cross reality devices to render virtual content at locations specified in relation to those maps, providing an enhanced experience for users of the system. The method may be used in other devices and for other purposes, such as for navigation of autonomous vehicles.
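
As a simplified stand-in for computing a pose against a map, the following Python sketch recovers a 2D rigid transform (rotation plus translation) from matched feature points using the Kabsch method. Real localization would estimate a full 6-DoF pose with robust feature matching; this illustrates only the core alignment step, and the function name is illustrative.

```python
import numpy as np

def estimate_pose_2d(map_pts, obs_pts):
    # Least-squares rigid transform (R, t) with map_pts ~ R @ obs + t,
    # aligning observed feature points to their matched map points.
    mu_m, mu_o = map_pts.mean(axis=0), obs_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (obs_pts - mu_o).T @ (map_pts - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:
        # Guard against a reflection solution.
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_o
    return R, t
```

Given at least two non-degenerate correspondences, the recovered (R, t) is the device pose relative to the map frame.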

IPC Classes  ?

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

87.

THREE DIMENSIONAL VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18481090
Status Pending
Filing Date 2023-10-04
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor Macnamara, John Graham

Abstract

A system may comprise a selectively transparent projection device for projecting an image toward an eye of a viewer from a projection device position in space relative to the eye of the viewer, the projection device being capable of assuming a substantially transparent state when no image is projected; an occlusion mask device coupled to the projection device and configured to selectively block light traveling toward the eye from one or more positions opposite of the projection device from the eye of the viewer in an occluding pattern correlated with the image projected by the projection device; and a zone plate diffraction patterning device interposed between the eye of the viewer and the projection device and configured to cause light from the projection device to pass through a diffraction pattern having a selectable geometry as it travels to the eye.

IPC Classes  ?

  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
  • G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
  • G02B 27/01 - Head-up displays
  • G02B 5/18 - Diffracting gratings
  • G03B 35/18 - Stereoscopic photography by simultaneous viewing

88.

LOW PROFILE INTERCONNECT FOR LIGHT EMITTER

      
Application Number 18476611
Status Pending
Filing Date 2023-09-28
First Publication Date 2024-01-18
Owner Magic Leap, Inc. (USA)
Inventor Curtis, Kevin

Abstract

In some embodiments, an interconnect electrically connects a light emitter to wiring on a substrate. The interconnect may be deposited by 3D printing and lays flat on the light emitter and substrate. In some embodiments, the interconnect has a generally rectangular or oval cross-sectional profile and extends above the light emitter to a height of about 50 μm or less, or about 35 μm or less. This small height allows close spacing between an overlying optical structure and the light emitter, thereby providing high efficiency in the injection of light from the light emitter into the optical structure, such as a light pipe.

IPC Classes  ?

  • H01L 33/62 - Arrangements for conducting electric current to or from the semiconductor body, e.g. leadframe, wire-bond or solder balls
  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

89.

Portable camera device

      
Application Number 29717216
Grant Number D1011401
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-01-16
Grant Date 2024-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

90.

SCANNING MIRROR SYSTEMS AND METHODS OF MANUFACTURE

      
Application Number 18370009
Status Pending
Filing Date 2023-09-19
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Gamet, Julien
  • Gamper, Stephan Arthur

Abstract

A scanning micromirror system includes a base having an axis passing therethrough, a plurality of support flexures coupled to the base, and a platform coupled to the base by the plurality of support flexures. The platform has a first side and a second side opposing the first side and is operable to oscillate about the axis. The scanning micromirror system also includes a stress relief layer positioned on the first side of the platform and a reflector positioned on the first side of the platform. The stress relief layer is positioned between the reflector and the platform.

IPC Classes

  • G02B 26/10 - Scanning systems
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • B81B 3/00 - Devices comprising flexible or deformable elements, e.g. comprising elastic tongues or membranes
  • G02B 27/01 - Head-up displays

91.

METHOD OF REDUCING OPTICAL ARTIFACTS

      
Application Number 18371888
Status Pending
Filing Date 2023-09-22
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Cheng, Hui-Chuan
  • Greco, Paul M.
  • Welch, William Hudson
  • Browy, Eric C.
  • Schuck, III, Miller Harry
  • Sissom, Bradley Jay

Abstract

A method of reducing optical artifacts includes injecting a light beam generated by an illumination source into a polarizing beam splitter (PBS), reflecting a spatially defined portion of the light beam from a display panel, reflecting, at an interface in the PBS, the spatially defined portion of the light beam towards a projector lens, passing at least a portion of the spatially defined portion of the light beam through a circular polarizer disposed between the PBS and the projector lens, reflecting, by one or more elements of the projector lens, a return portion of the spatially defined portion of the light beam, and attenuating, at the circular polarizer, the return portion of the spatially defined portion of the light beam.
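The geometry above — a circular polarizer between the PBS and the projector lens that passes the outgoing beam but attenuates back-reflections from the lens elements — can be illustrated with a toy Jones-calculus sketch. The matrices, the sign conventions, and the fixed-lab-frame treatment of the return pass are textbook conventions chosen for illustration, not details from the application:

```python
import numpy as np

LP_H = np.array([[1, 0], [0, 0]], dtype=complex)  # linear polarizer, horizontal axis

def qwp(theta):
    """Quarter-wave plate, fast axis at angle theta (radians); global phase dropped."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c + 1j * s * s, (1 - 1j) * s * c],
                     [(1 - 1j) * s * c, s * s + 1j * c * c]])

MIRROR = np.diag([1.0, -1.0])  # normal-incidence reflection in a fixed lab frame

incoming = np.array([1.0, 0.0], dtype=complex)   # horizontally polarized light
forward = qwp(np.pi / 4) @ (LP_H @ incoming)     # now circularly polarized
reflected = MIRROR @ forward                     # reflection flips handedness
# On the return pass the wave plate's axis appears mirrored in this frame:
returned = LP_H @ (qwp(-np.pi / 4) @ reflected)

forward_power = float(np.vdot(forward, forward).real)   # outgoing beam passes cleanly
leaked_power = float(np.vdot(returned, returned).real)  # back-reflection is extinguished
```

The handedness flip on reflection is what converts the return beam, after the wave plate, into linear polarization orthogonal to the polarizer axis — which is the attenuation mechanism the abstract relies on.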

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G02B 5/30 - Polarising elements
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

92.

METHODS, DEVICES, AND SYSTEMS FOR ILLUMINATING SPATIAL LIGHT MODULATORS

      
Application Number 18470801
Status Pending
Filing Date 2023-09-20
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Chung, Hyunsun
  • Trisnadi, Jahja I.
  • Carlisle, Clinton
  • Curtis, Kevin Richard
  • Oh, Chulwoo

Abstract

An optical device may include a wedge-shaped light turning element. The optical device can include a first surface that is parallel to a horizontal axis and a second surface opposite to the first surface that is inclined with respect to the horizontal axis by a wedge angle. The optical device may include a light module that includes a plurality of light emitters. The light module can be configured to combine light from the plurality of emitters. The optical device can further include a light input surface that is between the first and the second surfaces and is disposed with respect to the light module to receive light emitted from the plurality of emitters. The optical device may include an end reflector that is disposed on a side opposite the light input surface. The second surface may be inclined such that a height of the light input surface is less than a height of the side opposite the light input surface. The light coupled into the wedge-shaped light turning element may be reflected by the end reflector and/or reflected from the second surface towards the first surface.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/24 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using incandescent filaments
  • G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 5/30 - Polarising elements
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering

93.

MIXED REALITY VIRTUAL REVERBERATION

      
Application Number 18471071
Status Pending
Filing Date 2023-09-20
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Tajik, Anastasia Andreyevna
  • Jot, Jean-Marc

Abstract

A method of presenting an audio signal to a user of a mixed reality environment is disclosed, the method comprising the steps of detecting a first audio signal in the mixed reality environment, where the first audio signal is a real audio signal; identifying a virtual object intersected by the first audio signal in the mixed reality environment; identifying a listener coordinate associated with the user; determining, using the virtual object and the listener coordinate, a transfer function; applying the transfer function to the first audio signal to produce a second audio signal; and presenting, to the user, the second audio signal.
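As a rough illustration of the final steps — determining a transfer function from the scene geometry and applying it to the first audio signal — here is a minimal sketch in which the transfer function reduces to a propagation delay plus inverse-distance attenuation. The function names and the toy acoustic model are assumptions for illustration, not the method claimed in the application:

```python
import numpy as np

def apply_transfer_function(signal, impulse_response):
    """Apply a transfer function, given as an impulse response, by convolution."""
    return np.convolve(signal, impulse_response)[: len(signal)]

def toy_transfer_ir(listener_pos, object_pos, sample_rate=48_000, speed_of_sound=343.0):
    """Build a toy impulse response from the listener coordinate and the
    intersected object's position: one spike carrying the propagation delay
    and inverse-distance attenuation."""
    distance = float(np.linalg.norm(np.asarray(listener_pos) - np.asarray(object_pos)))
    delay = int(round(distance / speed_of_sound * sample_rate))
    ir = np.zeros(delay + 1)
    ir[delay] = 1.0 / max(distance, 1.0)  # clamp the gain near the source
    return ir

sr = 48_000
t = np.arange(sr) / sr
dry = np.sin(2 * np.pi * 1000 * t)  # first audio signal: a 1 kHz tone
wet = apply_transfer_function(dry, toy_transfer_ir((0, 0, 0), (3.43, 0, 0), sr))
```

A real implementation would derive the impulse response from the virtual object's material and occlusion properties rather than distance alone; the convolution step, however, is the standard way a transfer function is applied to produce the second signal.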

IPC Classes

  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • G06F 3/16 - Sound input; Sound output
  • H04N 21/439 - Processing of audio elementary streams

94.

IMMERSIVE AUDIO PLATFORM

      
Application Number 18471216
Status Pending
Filing Date 2023-09-20
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Jot, Jean-Marc
  • Minnick, Michael
  • Pastouchenko, Dmitry
  • Simon, Michael Aaron
  • Scott, III, John Emmitt
  • Bailey, Richard St. Clair
  • Balasubramanyan, Shivakumar
  • Agadi, Harsharaj

Abstract

Disclosed herein are systems and methods for presenting audio content in mixed reality environments. A method may include receiving a first input from an application program; in response to receiving the first input, receiving, via a first service, an encoded audio stream; generating, via the first service, a decoded audio stream based on the encoded audio stream; receiving, via a second service, the decoded audio stream; receiving a second input from one or more sensors of a wearable head device; receiving, via the second service, a third input from the application program, wherein the third input corresponds to a position of one or more virtual speakers; generating, via the second service, a spatialized audio stream based on the decoded audio stream, the second input, and the third input; presenting, via one or more speakers of the wearable head device, the spatialized audio stream.
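The second service's spatialization step — combining the decoded stream, the head-pose input, and the virtual-speaker positions — might be sketched in a heavily simplified 2-D form as an equal-power pan plus distance attenuation. The names, geometry, and panning law here are hypothetical stand-ins, not the platform's actual renderer:

```python
import numpy as np

def spatialize(mono, speaker_xy, head_xy=(0.0, 0.0), head_yaw=0.0):
    """Render a decoded mono stream through one virtual speaker as stereo,
    using an equal-power pan from the speaker's azimuth in the head frame
    and simple inverse-distance attenuation."""
    dx = speaker_xy[0] - head_xy[0]
    dy = speaker_xy[1] - head_xy[1]
    azimuth = np.arctan2(dy, dx) - head_yaw
    pan = np.clip(np.sin(azimuth), -1.0, 1.0)  # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * np.pi / 4.0          # map pan onto [0, pi/2]
    gain = 1.0 / max(np.hypot(dx, dy), 1.0)    # clamp to avoid blow-up up close
    mono = np.asarray(mono, dtype=float)
    return np.stack([mono * np.cos(theta) * gain,   # left channel
                     mono * np.sin(theta) * gain])  # right channel

stereo = spatialize(np.ones(4), speaker_xy=(1.0, 0.0))  # speaker straight ahead
```

A production renderer would use HRTF filtering per virtual speaker rather than amplitude panning, but the data flow — decoded audio in, pose and speaker positions in, spatialized stream out — matches the pipeline the abstract describes.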

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays

95.

STACKED WAVEGUIDES HAVING DIFFERENT DIFFRACTION GRATINGS FOR COMBINED FIELD OF VIEW

      
Application Number 18471740
Status Pending
Filing Date 2023-09-21
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Parthiban, Vikraman

Abstract

In one aspect, an optical device comprises a plurality of waveguides formed over one another and having formed thereon respective diffraction gratings, wherein the respective diffraction gratings are configured to diffract visible light incident thereon into respective waveguides, such that visible light diffracted into the respective waveguides propagates therewithin. The respective diffraction gratings are configured to diffract the visible light into the respective waveguides within respective fields of view (FOVs) with respect to layer normal directions of the respective waveguides. The respective FOVs are such that the plurality of waveguides are configured to diffract the visible light within a combined FOV that is continuous and greater than each of the respective FOVs.

IPC Classes

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
  • G02B 27/01 - Head-up displays
  • H04N 9/31 - Projection devices for colour picture display
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/42 - Diffraction optics
  • G02B 5/18 - Diffracting gratings

96.

AUTOMATIC PLACEMENT OF A VIRTUAL OBJECT IN A THREE-DIMENSIONAL SPACE

      
Application Number 18473017
Status Pending
Filing Date 2023-09-22
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Hoover, Paul Armistead
  • Mann, Jonathan Lawrence

Abstract

Augmented reality systems and methods for automatically repositioning a virtual object with respect to a destination object in a three-dimensional (3D) environment of a user are disclosed. The systems and methods can automatically attach the target virtual object to the destination object and re-orient the target virtual object based on the affordances of the virtual object or the destination object. The systems and methods can also track the movement of a user and detach the virtual object from the destination object when the user's movement passes a threshold condition.
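The detach behaviour described above — tracking the user's movement and releasing the virtual object from the destination object once a threshold condition is passed — can be sketched as follows. The class name, the accumulated-distance metric, and the default threshold are hypothetical choices for illustration:

```python
import numpy as np

class AttachedVirtualObject:
    """Toy model of a virtual object attached to a destination object:
    it detaches once the user's accumulated movement exceeds a threshold."""

    def __init__(self, detach_threshold=1.5):
        self.attached = True
        self.detach_threshold = detach_threshold  # metres of user travel
        self._last_pos = None
        self._travelled = 0.0

    def update_user_position(self, pos):
        """Feed a new tracked position; returns whether the object is still attached."""
        pos = np.asarray(pos, dtype=float)
        if self._last_pos is not None:
            self._travelled += float(np.linalg.norm(pos - self._last_pos))
        self._last_pos = pos
        if self.attached and self._travelled > self.detach_threshold:
            self.attached = False  # movement passed the threshold condition
        return self.attached

obj = AttachedVirtualObject(detach_threshold=1.5)
```

A richer model might use displacement from the attach point or gaze divergence as the threshold condition instead of accumulated travel; the abstract leaves the specific condition open.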

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/156 - Mixing image signals
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

97.

Combined packaging insert and tray

      
Application Number 29716355
Grant Number D1010446
Status In Force
Filing Date 2019-12-09
First Publication Date 2024-01-09
Grant Date 2024-01-09
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Awad, Haney
  • Palmer, James William
  • Gamez Castillejos, Daniel Marcelo
  • Palmer, Christopher G.

98.

TRANSMODAL INPUT FUSION FOR MULTI-USER GROUP INTENT PROCESSING IN VIRTUAL ENVIRONMENTS

      
Application Number 18252574
Status Pending
Filing Date 2021-11-09
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Lacey, Paul
  • Schwab, Brian David
  • Miller, Samuel A.
  • Sands, John Andrew
  • Bryant, Colman Thomas

Abstract

This document describes imaging and visualization systems in which the intent of a group of users in a shared space is determined and acted upon. In one aspect, a method includes identifying, for a group of users in a shared virtual space, a respective objective for each of two or more of the users in the group of users. For each of the two or more users, a determination is made, based on inputs from multiple sensors having different input modalities, of a respective intent of the user. At least a portion of the multiple sensors are sensors of a device of the user that enables the user to participate in the shared virtual space. A determination is made, based on the respective intent, whether the user is performing the respective objective for the user. Output data is generated and provided based on the respective objectives and respective intents.
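One simple way to picture multi-modality intent determination is a weighted fusion of per-modality confidence scores. This is a toy stand-in with hypothetical modality and intent names, not the claimed transmodal fusion method:

```python
def fuse_intents(modality_scores, weights=None):
    """Fuse per-modality intent scores (e.g. gaze, gesture, voice) by
    weighted averaging and return the winning intent with the full fused map."""
    weights = weights or {m: 1.0 for m in modality_scores}
    total = sum(weights.values())
    intents = {i for scores in modality_scores.values() for i in scores}
    fused = {
        intent: sum(weights[m] * modality_scores[m].get(intent, 0.0)
                    for m in modality_scores) / total
        for intent in intents
    }
    return max(fused, key=fused.get), fused

top_intent, fused = fuse_intents({
    "gaze":  {"select": 0.8, "move": 0.2},   # eye tracker favours "select"
    "voice": {"select": 0.4, "move": 0.6},   # speech recognizer leans "move"
})
```

With equal weights the gaze evidence outweighs the voice evidence, so "select" wins; per-user results like this would then feed the objective-completion check the abstract describes.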

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

99.

MASSIVE SIMULTANEOUS REMOTE DIGITAL PRESENCE WORLD

      
Application Number 18306387
Status Pending
Filing Date 2023-04-25
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor Abovitz, Rony

Abstract

Various methods and apparatus are described herein for enabling one or more users to interface with virtual or augmented reality environments. An example system includes a computing network having computer servers interconnected through high bandwidth interfaces to gateways for processing data and/or for enabling communication of data between the servers and one or more local user interface devices. The servers include memory, processing circuitry, and software for designing and/or controlling virtual worlds, as well as for storing and processing user data and data provided by other components of the system. One or more virtual worlds may be presented to a user through a user device for the user to experience and interact with. A large number of users may each use a device to simultaneously interface with one or more digital worlds by using the device to observe and interact with each other and with objects produced within the digital worlds.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

100.

IMAGE DESCRIPTOR NETWORK WITH IMPOSED HIERARCHICAL NORMALIZATION

      
Application Number 18368153
Status Pending
Filing Date 2023-09-14
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor Sato, Koichi

Abstract

Techniques are disclosed for using and training a descriptor network. An image may be received and provided to the descriptor network. The descriptor network may generate an image descriptor based on the image. The image descriptor may include a set of elements distributed between a major vector comprising a first subset of the set of elements and a minor vector comprising a second subset of the set of elements. The second subset of the set of elements may include more elements than the first subset of the set of elements. A hierarchical normalization may be imposed onto the image descriptor by normalizing the major vector to a major normalization amount and normalizing the minor vector to a minor normalization amount. The minor normalization amount may be less than the major normalization amount.
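The hierarchical normalization described in the abstract can be sketched directly; the split point and the two normalization amounts below are illustrative values, not ones taken from the application:

```python
import numpy as np

def hierarchical_normalize(descriptor, major_len, major_amount=1.0, minor_amount=0.5):
    """Split a descriptor into a major vector (the first `major_len` elements)
    and a minor vector (the remaining, larger subset), then L2-normalize each
    part to its own amount, with the minor amount being the smaller."""
    major = np.asarray(descriptor[:major_len], dtype=float)
    minor = np.asarray(descriptor[major_len:], dtype=float)
    assert minor.size > major.size, "minor vector should hold more elements"
    assert minor_amount < major_amount, "minor amount should be the smaller"
    major = major / np.linalg.norm(major) * major_amount
    minor = minor / np.linalg.norm(minor) * minor_amount
    return np.concatenate([major, minor])

desc = hierarchical_normalize(np.ones(10), major_len=4)
```

Because the major vector carries more of the descriptor's norm, distance comparisons between such descriptors are dominated by the major elements, with the minor elements acting as a refinement.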

IPC Classes

  • G06F 16/56 - Information retrieval; Database structures therefor; File system structures therefor of still image data having vectorial format
  • G06N 3/08 - Learning methods
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks