Magic Leap, Inc.

United States of America


1-100 of 3,003 for Magic Leap, Inc.
Aggregations
IP Type
        Patent 2,855
        Trademark 148
Jurisdiction
        United States 1,991
        World 680
        Canada 318
        Europe 14
Date
New (last 4 weeks) 38
2024 March (MTD) 24
2024 February 30
2024 January 27
2023 December 39
IPC Class
G02B 27/01 - Head-up displays 1,280
G06T 19/00 - Manipulating 3D models or images for computer graphics 731
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 704
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 429
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems 253
NICE Class
09 - Scientific and electric apparatus and instruments 133
42 - Scientific, technological and industrial services, research and design 59
41 - Education, entertainment, sporting and cultural services 42
38 - Telecommunications services 36
35 - Advertising and business services 31
Status
Pending 608
Registered / In Force 2,395

1.

SECURE EXCHANGE OF CRYPTOGRAPHICALLY SIGNED RECORDS

      
Application Number 18525698
Status Pending
Filing Date 2023-11-30
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor Kaehler, Adrian

Abstract

Systems and methods for securely exchanging cryptographically signed records are disclosed. In one aspect, after receiving a content request, a sender device can send a record to a receiver device (e.g., an agent device) making the request. The record can be sent via a short range link in a decentralized (e.g., peer-to-peer) manner while the devices may not be in communication with a centralized processing platform. The record can comprise a sender signature created using the sender device's private key. The receiver device can verify the authenticity of the sender signature using the sender device's public key. After adding a cryptography-based receiver signature, the receiver device can redeem the record with the platform. Upon successful verification of the record, the platform can perform as instructed by a content of the record (e.g., modifying or updating a user account).
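The exchange described above can be sketched end-to-end. The sketch below is illustrative only: HMAC with pre-shared keys stands in for the asymmetric signatures (in the abstract, the sender signs with a private key and anyone can verify with the matching public key), and all record fields and function names are invented.

```python
import hashlib
import hmac
import json

def sign(key: bytes, payload: bytes) -> bytes:
    # Stand-in for an asymmetric signature over the record payload.
    return hmac.new(key, payload, hashlib.sha256).digest()

def make_record(sender_key: bytes, content: dict) -> dict:
    # Sender device creates the record and attaches its signature.
    payload = json.dumps(content, sort_keys=True).encode()
    return {"content": content, "sender_sig": sign(sender_key, payload).hex()}

def countersign(receiver_key: bytes, record: dict) -> dict:
    # Receiver device adds its own signature over the content plus the
    # sender signature before redeeming the record with the platform.
    payload = json.dumps(record["content"], sort_keys=True).encode()
    record["receiver_sig"] = sign(
        receiver_key, payload + bytes.fromhex(record["sender_sig"])).hex()
    return record

def platform_redeem(sender_key: bytes, receiver_key: bytes, record: dict) -> bool:
    # Centralized platform checks both signatures before acting on the content.
    payload = json.dumps(record["content"], sort_keys=True).encode()
    sender_ok = hmac.compare_digest(
        sign(sender_key, payload).hex(), record["sender_sig"])
    receiver_ok = hmac.compare_digest(
        sign(receiver_key, payload + bytes.fromhex(record["sender_sig"])).hex(),
        record.get("receiver_sig", ""))
    return sender_ok and receiver_ok
```

A tampered record fails verification at the platform, since the content no longer matches the sender signature.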

IPC Classes

  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • H04L 9/14 - Arrangements for secret or secure communications; Network security protocols using a plurality of keys or algorithms
  • H04L 9/30 - Public key, i.e. encryption algorithm being computationally infeasible to invert and users' encryption keys not requiring secrecy
  • H04L 9/40 - Network security protocols

2.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING REGISTRATION BETWEEN A DISPLAY AND A USER'S EYES

      
Application Number 18521613
Status Pending
Filing Date 2023-11-28
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Nienstedt, Zachary C.
  • Yeoh, Ivan Li Chuen
  • Miller, Samuel A.
  • Xu, Yan
  • Cazamias, Jordan Alexander

Abstract

A wearable device may include a head-mounted display (HMD) for rendering a three-dimensional (3D) virtual object which appears to be located in an ambient environment of a user of the display. The relative positions of the HMD and one or more eyes of the user may not be in desired positions to receive, or register, image information outputted by the HMD. For example, the HMD-to-eye alignment may vary for different users and may change over time (e.g., as a user moves around and/or the HMD slips or is otherwise displaced). The wearable device may determine a relative position or alignment between the HMD and the user's eyes. Based on the relative positions, the wearable device may determine if it is properly fitted to the user, may provide feedback on the quality of the fit to the user, and may take actions to reduce or minimize effects of any misalignment.
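A minimal sketch of the fit-feedback idea, assuming the eye position estimated by the device can be compared against a nominal eye-box in display coordinates; the tolerance, coordinate frame, and names below are invented for illustration.

```python
import math

EYEBOX_CENTER = (0.0, 0.0, 0.0)   # nominal eye position in display coords (mm)
EYEBOX_RADIUS_MM = 4.0            # assumed tolerance before image quality degrades

def misalignment_mm(eye_pos):
    # Distance of the estimated eye position from the eye-box center.
    return math.dist(eye_pos, EYEBOX_CENTER)

def fit_feedback(eye_pos):
    # Report fit quality and, if misaligned, how far outside the eye-box we are.
    d = misalignment_mm(eye_pos)
    if d <= EYEBOX_RADIUS_MM:
        return "good fit"
    return f"adjust headset: eye is {d - EYEBOX_RADIUS_MM:.1f} mm outside the eye-box"
```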

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • A61B 3/11 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for measuring interpupillary distance or diameter of pupils
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for determining or recording eye movement
  • G02B 27/01 - Head-up displays
  • G02B 30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
  • G02B 30/40 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/16 - Sound input; Sound output
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

3.

VIRTUAL LOCATION SELECTION FOR VIRTUAL CONTENT

      
Application Number 18523382
Status Pending
Filing Date 2023-11-29
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Warren, Silas
  • Khan, Omar
  • Miller, Samuel A.
  • Arora, Tushar

Abstract

A method for placing content in an augmented reality system. A notification is received regarding availability of new content to display in the augmented reality system. A confirmation is received that indicates acceptance of the new content. Three dimensional information that describes the physical environment is provided to an external computing device, to enable the external computing device to be used for selecting an assigned location in the physical environment for the new content. Location information is received, from the external computing device, that indicates the assigned location. Based on the location information, a display location is determined on a display system of the augmented reality system at which to display the new content, so that the new content appears to the user as an overlay at the assigned location in the physical environment. The new content is displayed on the display system at the display location.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

4.

TUNABLE CYLINDRICAL LENSES AND HEAD-MOUNTED DISPLAY INCLUDING THE SAME

      
Application Number 18510932
Status Pending
Filing Date 2023-11-16
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Russell, Andrew Ian
  • Haddock, Joshua Naaman

Abstract

Systems include three optical elements arranged along an optical axis, each having a different cylinder axis and a variable cylinder refractive power. Collectively, the three elements form a compound optical element having an overall spherical refractive power (SPH), cylinder refractive power (CYL), and cylinder axis (Axis) that can be varied according to a prescription (Rx).
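The combination of crossed cylinders can be worked out with the standard power-vector decomposition (M, J0, J45), in which cylinder powers at arbitrary axes add linearly. This is textbook ophthalmic optics, not necessarily the patent's control algorithm, and the lens values in the test are illustrative.

```python
import math

def to_power_vector(cyl, axis_deg):
    """Pure cylinder of power `cyl` (dioptres) at `axis_deg` -> (M, J0, J45)."""
    a = math.radians(axis_deg)
    return (cyl / 2.0,
            -(cyl / 2.0) * math.cos(2 * a),
            -(cyl / 2.0) * math.sin(2 * a))

def combine(cylinders):
    """Sum power vectors of (cyl, axis) pairs; return (SPH, CYL, Axis), minus-cyl form."""
    M = J0 = J45 = 0.0
    for c, a in cylinders:
        m, j0, j45 = to_power_vector(c, a)
        M, J0, J45 = M + m, J0 + j0, J45 + j45
    cyl = -2.0 * math.hypot(J0, J45)
    sph = M - cyl / 2.0
    axis = math.degrees(0.5 * math.atan2(J45, J0)) % 180.0
    return sph, cyl, axis
```

Two equal cylinders crossed at 90 degrees combine into a pure sphere, which is why three independently tunable cylinders at different axes can reach an arbitrary (SPH, CYL, Axis) target.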

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 27/02 - Viewing or reading apparatus

5.

MATCHING CONTENT TO A SPATIAL 3D ENVIRONMENT

      
Application Number 18523763
Status Pending
Filing Date 2023-11-29
First Publication Date 2024-03-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Bastov, Denys
  • Ng-Thow-Hing, Victor
  • Reinhardt, Benjamin Zaaron
  • Zolotarev, Leonid
  • Pellet, Yannick
  • Marchenko, Aleksei
  • Meaney, Brian Everett
  • Shelton, Marc Coleman
  • Geiman, Megan Ann
  • Gotcher, John A.
  • Bogue, Matthew Schon
  • Balasubramanyam, Shivakumar
  • Ruediger, Jeffrey Edward
  • Lundmark, David Charles

Abstract

Systems and methods for matching content elements to surfaces in a spatially organized 3D environment. The method includes receiving content, identifying one or more elements in the content, determining one or more surfaces, matching the one or more elements to the one or more surfaces, and displaying the one or more elements as virtual content onto the one or more surfaces.
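A toy version of the matching step, assuming each content element declares a preferred surface orientation. The attributes and the first-fit strategy are invented, since the abstract does not specify the matching criteria.

```python
def match_elements(elements, surfaces):
    """elements: [(name, wanted_orientation)]; surfaces: [(id, orientation)].

    Greedily assign each element to the first unused surface with the
    orientation it asked for; unmatched elements are simply left out.
    """
    placements, used = {}, set()
    for name, wanted in elements:
        for sid, orientation in surfaces:
            if orientation == wanted and sid not in used:
                placements[name] = sid
                used.add(sid)
                break
    return placements
```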

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

6.

Wearable accessory with cameras

      
Application Number 29717289
Grant Number D1018624
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-03-19
Grant Date 2024-03-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta

7.

WEARABLE SYSTEM SPEECH PROCESSING

      
Application Number 18510376
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor Leider, Colby Nelson

Abstract

A method of processing an acoustic signal is disclosed. According to one or more embodiments, a first acoustic signal is received via a first microphone. The first acoustic signal is associated with a first speech of a user of a wearable headgear unit. A first sensor input is received via a sensor, and a control parameter is determined based on the first sensor input. The control parameter is applied to one or more of the first acoustic signal, the wearable headgear unit, and the first microphone. Determining the control parameter comprises determining, based on the first sensor input, a relationship between the first speech and the first acoustic signal.

IPC Classes

  • G10L 21/0208 - Noise filtering
  • G02B 27/01 - Head-up displays
  • G10L 15/18 - Speech classification or search using natural language modelling
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 15/26 - Speech to text systems

8.

METHOD AND SYSTEM FOR INTEGRATION OF REFRACTIVE OPTICS WITH A DIFFRACTIVE EYEPIECE WAVEGUIDE DISPLAY

      
Application Number 18513308
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar
  • Singh, Vikramjit
  • Yang, Shuqiang
  • Xu, Frank Y.

Abstract

A method of fabricating an optical element includes providing a substrate, forming a castable material coupled to the substrate, and casting the castable material using a mold. The method also includes curing the castable material and removing the mold. The optical element comprises a planar region and a clear aperture adjacent the planar region and characterized by an optical power.

IPC Classes

9.

SYSTEMS AND METHODS FOR ARTIFICIAL INTELLIGENCE-BASED VIRTUAL AND AUGMENTED REALITY

      
Application Number 18513312
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Rabinovich, Andrew
  • Monos, John

Abstract

Examples of the disclosure describe systems and methods for generating and displaying a virtual companion. In an example method, a first input from an environment of a user is received at a first time via a first sensor. An occurrence of an event in the environment is determined based on the first input. A second input from the user is received via a second sensor, and an emotional reaction of the user is identified based on the second input. An association is determined between the emotional reaction and the event. A view of the environment is presented at a second time later than the first time via a display. A stimulus is presented at the second time via a virtual companion displayed via the display, wherein the stimulus is determined based on the determined association between the emotional reaction and the event.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 21/53 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity, buffer overflow or preventing unwanted data erasure by executing in a restricted environment, e.g. sandbox or secure virtual machine
  • G06T 15/00 - 3D [Three Dimensional] image rendering

10.

BUNDLE ADJUSTMENT USING EPIPOLAR CONSTRAINTS

      
Application Number 18266756
Status Pending
Filing Date 2021-12-03
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Souiai, Mohamed
  • Gupta, Ankur

Abstract

Methods, systems, and apparatus for performing bundle adjustment using epipolar constraints. A method includes receiving image data from a headset for a particular pose. The image data includes a first image from a first camera of the headset and a second image from a second camera of the headset. The method includes identifying at least one key point in a three-dimensional model of an environment at least partly represented in the first image and the second image and performing bundle adjustment. Bundle adjustment is performed by jointly optimizing a reprojection error for the at least one key point and an epipolar error for the at least one key point. Results of the bundle adjustment are used to perform at least one of (i) updating the three-dimensional model, (ii) determining a position of the headset at the particular pose, or (iii) determining extrinsic parameters of the first camera and second camera.
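The joint objective can be illustrated with scalar stand-ins: a pinhole reprojection residual plus the algebraic epipolar residual |x_r^T E x_l|. The cameras, essential matrix, and weighting below are invented; a real system would minimize this cost over poses and points with a nonlinear solver.

```python
import math

def project(point3d):
    """Ideal pinhole projection (focal length 1, camera at the origin)."""
    x, y, z = point3d
    return (x / z, y / z)

def reprojection_error(point3d, observed_uv):
    # Image-plane distance between the projected key point and its observation.
    u, v = project(point3d)
    return math.hypot(u - observed_uv[0], v - observed_uv[1])

def epipolar_error(uv_left, uv_right, E):
    """|x_r^T E x_l| for homogeneous image points x = (u, v, 1)."""
    xl = (uv_left[0], uv_left[1], 1.0)
    xr = (uv_right[0], uv_right[1], 1.0)
    Exl = [sum(E[i][j] * xl[j] for j in range(3)) for i in range(3)]
    return abs(sum(xr[i] * Exl[i] for i in range(3)))

def joint_cost(point3d, uv_left, uv_right, E, w_epi=1.0):
    # The quantity bundle adjustment would drive toward zero for this key point.
    return reprojection_error(point3d, uv_left) + w_epi * epipolar_error(uv_left, uv_right, E)
```

For a rectified stereo pair (identity rotation, horizontal baseline) the essential matrix reduces to a cross-product matrix, and the epipolar residual is zero exactly when the two observations share the same image row.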

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 27/01 - Head-up displays
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

11.

DETERMINING INPUT FOR SPEECH PROCESSING ENGINE

      
Application Number 18506866
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Sheeder, Anthony Robert
  • Leider, Colby Nelson

Abstract

A method of presenting a signal to a speech processing engine is disclosed. According to an example of the method, an audio signal is received via a microphone. A portion of the audio signal is identified, and a probability is determined that the portion comprises speech directed by a user of the speech processing engine as input to the speech processing engine. In accordance with a determination that the probability exceeds a threshold, the portion of the audio signal is presented as input to the speech processing engine. In accordance with a determination that the probability does not exceed the threshold, the portion of the audio signal is not presented as input to the speech processing engine.
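The gating logic reduces to a threshold test. A sketch with a placeholder classifier follows; the threshold value and names are invented, since the abstract only specifies that segments below the threshold are withheld from the engine.

```python
THRESHOLD = 0.7  # illustrative cutoff for "directed at the speech engine"

def gate_segments(segments, classify):
    """Yield only segments whose user-directed-speech probability passes the gate.

    `classify` maps a segment to a probability in [0, 1]; anything at or
    below THRESHOLD is never presented to the speech processing engine.
    """
    for seg in segments:
        if classify(seg) > THRESHOLD:
            yield seg
```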

IPC Classes

  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G10L 15/14 - Speech classification or search using statistical models, e.g. Hidden Markov Models [HMM]
  • G10L 15/25 - Speech recognition using non-acoustical features using position of the lips, movement of the lips or face analysis
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications

12.

INTERAURAL TIME DIFFERENCE CROSSFADER FOR BINAURAL AUDIO RENDERING

      
Application Number 18510472
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Dicker, Samuel Charles
  • Barbhaiya, Harsh Mayur

Abstract

Examples of the disclosure describe systems and methods for presenting an audio signal to a user of a wearable head device. In an example, a received first input audio signal is processed to generate a left output audio signal and a right output audio signal presented to ears of the user. Processing the first input audio signal comprises applying a delay process to the first input audio signal to generate a left audio signal and a right audio signal; adjusting gains of the left audio signal and the right audio signal; applying head-related transfer functions (HRTFs) to the left and right audio signals to generate the left and right output audio signals. Applying the delay process to the first input audio signal comprises applying an interaural time delay (ITD) to the first input audio signal, the ITD determined based on the source location.
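One common way to derive the ITD from a source azimuth is the Woodworth spherical-head approximation; the patent does not say which model it uses, so the formula and constants below are illustrative only.

```python
import math

HEAD_RADIUS_M = 0.0875   # typical adult head radius; illustrative
SPEED_OF_SOUND = 343.0   # m/s at room temperature

def itd_seconds(azimuth_rad):
    """Woodworth spherical-head approximation of the interaural time difference."""
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

def split_delays(azimuth_rad):
    """Return (left_delay, right_delay) in seconds: the far ear gets the full ITD."""
    itd = itd_seconds(abs(azimuth_rad))
    if azimuth_rad >= 0:      # source to the right -> left ear is the far ear
        return itd, 0.0
    return 0.0, itd
```

For a source at 90 degrees this yields roughly 0.66 ms, in line with the commonly quoted maximum human ITD of about 0.6 to 0.7 ms.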

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 5/033 - Headphones for stereophonic communication
  • H04R 5/04 - Circuit arrangements
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

13.

CROSS REALITY SYSTEM WITH PRIORITIZATION OF GEOLOCATION INFORMATION FOR LOCALIZATION

      
Application Number 18510623
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhao, Xuan
  • Moore, Christian Ivan Robert
  • Lin, Sen
  • Shahrokni, Ali
  • Swaminathan, Ashwin

Abstract

A cross reality system enables any of multiple devices to efficiently access previously stored maps. Both stored maps and tracking maps used by portable devices may have any of multiple types of location metadata associated with them. The location metadata may be used to select a set of candidate maps for operations, such as localization or map merge, that involve finding a match between a location defined by location information from a portable device and any of a number of previously stored maps. The types of location metadata may be prioritized for use in selecting the subset. To aid in selection of candidate maps, a universe of stored maps may be indexed based on geo-location information. A cross reality platform may update that index as it interacts with devices that supply geo-location information in connection with location information and may propagate that geo-location information to devices that do not supply it.
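The prioritized selection might look like the following sketch, where metadata types are tried in a fixed priority order and the first type that yields matches defines the candidate set; the type names and map records are invented.

```python
# Metadata types in descending priority; geo-location is tried first,
# per the abstract's emphasis on geo-location indexing.
PRIORITY = ["geolocation", "wifi_fingerprint", "persistent_anchor"]

def candidate_maps(stored_maps, device_metadata):
    """Pick candidate maps using the highest-priority metadata type that matches."""
    for kind in PRIORITY:
        key = device_metadata.get(kind)
        if key is None:
            continue
        matches = [m for m in stored_maps if m.get(kind) == key]
        if matches:
            return matches
    return list(stored_maps)  # fall back to the full universe of stored maps
```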

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 9/54 - Interprogram communication
  • G06F 16/29 - Geographical information databases
  • G06F 16/907 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

14.

SESSION MANAGER

      
Application Number 18513443
Status Pending
Filing Date 2023-11-17
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Bailey, Richard St. Clair
  • Pothapragada, Siddartha
  • Mori, Koichi
  • Stolzenberg, Karen
  • Niles, Savannah
  • Noriega-Padilla, Domingo
  • Heiner, Cole Parker

Abstract

Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04L 67/131 - Protocols for games, networked simulations or virtual reality

15.

EYEPIECES FOR USE IN WEARABLE DISPLAY SYSTEMS

      
Application Number 18514500
Status Pending
Filing Date 2023-11-20
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Lin, Dianmin
  • St. Hilaire, Pierre

Abstract

An example head-mounted display device includes a light projector and an eyepiece. The eyepiece is arranged to receive light from the light projector and direct the light to a user during use of the wearable display system. The eyepiece includes a waveguide having an edge positioned to receive light from the display light source module and couple the light into the waveguide. The waveguide includes a first surface and a second surface opposite the first surface. The waveguide includes several different regions, each having different grating structures configured to diffract light according to different sets of grating vectors.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

16.

AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH CORRELATED IN-COUPLING AND OUT-COUPLING OPTICAL REGIONS FOR EFFICIENT LIGHT UTILIZATION

      
Application Number 18516321
Status Pending
Filing Date 2023-11-21
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor Schowengerdt, Brian T.

Abstract

Augmented reality and virtual reality display systems and devices are configured for efficient use of projected light. In some aspects, a display system includes a light projection system and a head-mounted display configured to project light into an eye of the user to display virtual image content. The head-mounted display includes at least one waveguide comprising a plurality of in-coupling regions each configured to receive, from the light projection system, light corresponding to a portion of the user's field of view and to in-couple the light into the waveguide; and a plurality of out-coupling regions configured to out-couple the light out of the waveguide to display the virtual content, wherein each of the out-coupling regions is configured to receive light from different ones of the in-coupling regions. In some implementations, each in-coupling region has a one-to-one correspondence with a unique corresponding out-coupling region.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

17.

EYE TRACKING USING ALTERNATE SAMPLING

      
Application Number 18516469
Status Pending
Filing Date 2023-11-21
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor Russell, Andrew Ian

Abstract

An eye tracking system can include a first camera configured to capture a first plurality of visual data of a right eye at a first sampling rate. The system can include a second camera configured to capture a second plurality of visual data of a left eye at a second sampling rate. The second plurality of visual data can be captured during different sampling times than the first plurality of visual data. The system can estimate, based on at least some visual data of the first and second plurality of visual data, visual data of at least one of the right or left eye at a sampling time during which visual data of an eye for which the visual data is being estimated are not being captured. Eye movements of the eye based on at least some of the estimated visual data and at least some visual data of the first or second plurality of visual data can be determined.
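The estimation step can be as simple as interpolating one eye's samples at the other eye's sampling ticks. Linear interpolation is an assumption here; the abstract does not specify the estimator.

```python
def interleave_estimate(samples, t):
    """Estimate one eye's signal at time `t` from its own sparse samples.

    `samples` maps sample times to measured values (e.g. a gaze coordinate)
    for one eye; `t` is a tick at which only the other eye was sampled.
    """
    times = sorted(samples)
    before = max((u for u in times if u <= t), default=None)
    after = min((u for u in times if u >= t), default=None)
    if before is None:          # before the first sample: hold the first value
        return samples[after]
    if after is None:           # after the last sample: hold the last value
        return samples[before]
    if before == after:         # t coincides with a real sample
        return samples[before]
    w = (t - before) / (after - before)
    return samples[before] * (1 - w) + samples[after] * w
```

With the cameras interleaved, each eye gets a real sample on half the ticks and an estimated one on the other half, so the pair of eyes is effectively tracked at twice the per-camera rate.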

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 27/01 - Head-up displays
  • G06T 7/20 - Analysis of motion

18.

POLYCHROMATIC LIGHT OUT-COUPLING APPARATUS, NEAR-EYE DISPLAYS COMPRISING THE SAME, AND METHOD OF OUT-COUPLING POLYCHROMATIC LIGHT

      
Application Number 18517915
Status Pending
Filing Date 2023-11-22
First Publication Date 2024-03-14
Owner Magic Leap, Inc. (USA)
Inventor
  • Kimmel, Jyrki
  • Jarvenpaa, Toni
  • Eskolin, Peter
  • Salmimaa, Marja

Abstract

The present invention provides an apparatus (3) comprising a first out-coupling diffractive optical element (10) and a second out-coupling diffractive optical element (20). Each of the first and second out-coupling diffractive optical elements comprises a first region (12a, 22a) having a first repeated diffraction spacing, d1, and a second region (12b, 22b) adjacent to the first region having a second repeated diffraction spacing, d2, different from the first spacing, d1. The first region (12a) of the first out-coupling diffractive optical element (10) is superposed on and aligned with the second region (22b) of the second out-coupling diffractive optical element (20). The second region (12b) of the first out-coupling diffractive optical element (10) is superposed on and aligned with the first region (22a) of the second out-coupling diffractive optical element (20).

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/18 - Diffracting gratings
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00

19.

EYE CENTER OF ROTATION DETERMINATION WITH ONE OR MORE EYE TRACKING CAMERAS

      
Application Number 18387745
Status Pending
Filing Date 2023-11-07
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Cohen, David
  • Joseph, Elad
  • Ferens, Ron Nisim
  • Preter, Eyal
  • Bar-On, Eitan Shmuel
  • Yahav, Giora

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system possibly comprising a plurality of cameras that image the user's eye and glints thereon, and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye using cornea data derived from the glint images. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 27/01 - Head-up displays
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/292 - Multi-camera tracking
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

20.

SYSTEM FOR PROVIDING ILLUMINATION OF THE EYE

      
Application Number 18497518
Status Pending
Filing Date 2023-10-30
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Meitav, Nizan
  • Yaras, Fahri
  • Jurbergs, David Carl

Abstract

A thin transparent layer can be integrated in a head mounted display device and disposed in front of the eye of a wearer. The thin transparent layer may be configured to output light such that light is directed onto the eye to create reflections therefrom that can be used, for example, for glint based tracking. The thin transparent layer can be configured to reduce obstructions in the field of view of the user.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

21.

PLENOPTIC CAMERA MEASUREMENT AND CALIBRATION OF HEAD-MOUNTED DISPLAYS

      
Application Number 18505437
Status Pending
Filing Date 2023-11-09
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor Schuck, III, Miller Harry

Abstract

A method for measuring performance of a head-mounted display module, the method including arranging the head-mounted display module relative to a plenoptic camera assembly so that an exit pupil of the head-mounted display module coincides with a pupil of the plenoptic camera assembly; emitting light from the head-mounted display module while the head-mounted display module is arranged relative to the plenoptic camera assembly; filtering the light at the exit pupil of the head-mounted display module; acquiring, with the plenoptic camera assembly, one or more light field images projected from the head-mounted display module with the filtered light; and determining information about the performance of the head-mounted display module based on the acquired light field images.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/20 - Filters
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00, for polarising

22.

BIASED TOTAL THICKNESS VARIATIONS IN WAVEGUIDE DISPLAY SUBSTRATES

      
Application Number 18505762
Status Pending
Filing Date 2023-11-09
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Peroz, Christophe
  • Liu, Victor Kai

Abstract

A plurality of waveguide display substrates, each waveguide display substrate having a cylindrical portion having a diameter and a planar surface, a curved portion opposite the planar surface defining a nonlinear change in thickness across the substrate and having a maximum height D with respect to the cylindrical portion, and a wedge portion between the cylindrical portion and the curved portion defining a linear change in thickness across the substrate and having a maximum height W with respect to the cylindrical portion. A target maximum height Dt of the curved portion is 10^-7 to 10^-6 times the diameter, D is between about 70% and about 130% of Dt, and W is less than about 30% of Dt.
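The numeric bounds in the abstract can be captured in a small conformance check; units are arbitrary but must be consistent, and the function name is invented.

```python
def substrate_conforms(diameter, Dt, D, W):
    """Check a substrate against the claimed bounds.

    Dt must be 10^-7 to 10^-6 of the diameter, the curved-portion height D
    must fall within 70% to 130% of Dt, and the wedge height W must stay
    under 30% of Dt.
    """
    return (1e-7 * diameter <= Dt <= 1e-6 * diameter
            and 0.7 * Dt <= D <= 1.3 * Dt
            and W < 0.3 * Dt)
```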

IPC Classes

  • G02B 6/13 - Integrated optical circuits characterised by the manufacturing method
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light

23.

FRAME-BY-FRAME RENDERING FOR AUGMENTED OR VIRTUAL REALITY SYSTEMS

      
Application Number 18506947
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-03-07
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Miller, Samuel A.

Abstract

One embodiment is directed to a user display device comprising a housing frame mountable on the head of the user, a lens mountable on the housing frame and a projection subsystem coupled to the housing frame to determine a location of appearance of a display object in a field of view of the user based at least in part on at least one of a detection of a head movement of the user and a prediction of a head movement of the user, and to project the display object to the user based on the determined location of appearance of the display object.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 1/60 - Memory management
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

24.

DYNAMICALLY ACTUABLE DIFFRACTIVE OPTICAL ELEMENT

      
Application Number 18388104
Status Pending
Filing Date 2023-11-08
First Publication Date 2024-03-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Yeoh, Ivan Li Chuen
  • Edwin, Lionel Ernest

Abstract

A dynamically actuable lens includes a substrate having a surface and a metasurface diffractive optical element (DOE) formed on the surface. The metasurface DOE includes a plurality of raised portions and defines a plurality of recesses between adjacent raised portions. The dynamically actuable lens also includes a movable cover overlying the metasurface DOE and comprising a hydrophilic material, a quantity of a fluid disposed on the movable cover, and a drive mechanism coupled to the movable cover. The drive mechanism is configured to move the movable cover toward the metasurface DOE to displace a portion of the quantity of the fluid into the plurality of recesses, thereby rendering the metasurface DOE in an “off” state, and move the movable cover away from the metasurface DOE, causing the portion of the quantity of the fluid to retract from the plurality of recesses, thereby rendering the metasurface DOE in an “on” state.

IPC Classes

  • G02B 27/42 - Diffraction optics
  • G02B 5/18 - Diffracting gratings
  • G02B 26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • G02B 27/01 - Head-up displays

25.

AUGMENTED REALITY SYSTEM AND METHOD FOR SPECTROSCOPIC ANALYSIS

      
Application Number 18388421
Status Pending
Filing Date 2023-11-09
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Kaehler, Adrian
  • Harrises, Christopher M.
  • Baerenrodt, Eric
  • Baerenrodt, Mark
  • Robaina, Natasja U.
  • Samec, Nicole Elizabeth
  • Powers, Tammy Sherri
  • Yeoh, Ivan Li Chuen
  • Wright, Adam Carl

Abstract

Wearable spectroscopy systems and methods for identifying one or more characteristics of a target object are described. Spectroscopy systems may include a light source configured to emit light in an irradiated field of view and an electromagnetic radiation detector configured to receive reflected light from a target object irradiated by the light source. One or more processors of the systems may identify a characteristic of the target object based on a determined level of light absorption by the target object. Some systems and methods may include one or more corrections for scattered and/or ambient light such as applying an ambient light correction, passing the reflected light through an anti-scatter grid, or using a time-dependent variation in the emitted light.
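The ambient-light correction mentioned in the abstract can be sketched as subtracting a source-off detector reading before computing absorbance. The function name, variable names, and the Beer-Lambert form are illustration-only assumptions; the patent does not specify this exact computation.

```python
import math

def absorbance_with_ambient_correction(detected, ambient, emitted):
    """Estimate target absorbance from detector readings, subtracting an
    ambient-light reading taken with the system's own light source off.
    A sketch of the ambient-correction idea; not the patented pipeline."""
    reflected = detected - ambient          # remove the ambient contribution
    if reflected <= 0 or emitted <= 0:
        raise ValueError("non-physical reading")
    # Beer-Lambert style absorbance: A = -log10(I_reflected / I_emitted)
    return -math.log10(reflected / emitted)
```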

IPC Classes

  • G01J 3/02 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours - Details
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G01J 3/42 - Absorption spectrometry; Double-beam spectrometry; Flicker spectrometry; Reflection spectrometry
  • G01N 21/25 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
  • G01N 21/27 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G09G 5/37 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns

26.

EYEPIECE IMAGING ASSEMBLIES FOR A HEAD MOUNTED DISPLAY

      
Application Number 18259044
Status Pending
Filing Date 2021-12-22
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Jia, Zhiheng
  • Cohen, David
  • Edwin, Lionel Ernest
  • Schabacker, Charles Robert

Abstract

A head mounted display can include a frame, an eyepiece, an image injection device, a sensor array, a reflector, and an off-axis optical element. The frame can be configured to be supported on the head of the user. The eyepiece can be coupled to the frame and configured to be disposed in front of an eye of the user. The eyepiece can include a plurality of layers. The image injection device can be configured to provide image content to the eyepiece for viewing by the user. The sensor array can be integrated in or on the eyepiece. The reflector can be disposed in or on the eyepiece and configured to reflect light received from an object for imaging by the sensor array. The off-axis optical element can be disposed in or on the eyepiece. The off-axis optical element can be configured to receive light reflected from the reflector and direct at least a portion of the light toward the sensor array.

IPC Classes

  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • G02B 5/28 - Interference filters
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays

27.

RENDERING LOCATION SPECIFIC VIRTUAL CONTENT IN ANY LOCATION

      
Application Number 18460873
Status Pending
Filing Date 2023-09-05
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Brodsky, Jonathan
  • Busto, Javier Antonio
  • Smith, Martin Wilkins

Abstract

Augmented reality systems and methods for creating, saving and rendering designs comprising multiple items of virtual content in a three-dimensional (3D) environment of a user. The designs may be saved as a scene, which is built by a user from pre-built sub-components, built components, and/or previously saved scenes. Location information, expressed as a saved scene anchor and position relative to the saved scene anchor for each item of virtual content, may also be saved. Upon opening the scene, the saved scene anchor node may be correlated to a location within the mixed reality environment of the user for whom the scene is opened. The virtual items of the scene may be positioned with the same relationship to that location as they have to the saved scene anchor node. That location may be selected automatically and/or by user input.
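The anchor-relative placement described above amounts to storing each item's offset from the saved scene anchor and re-applying those offsets at whatever anchor the scene is opened against. The data layout below (3-tuples of world coordinates keyed by item name) is an assumption for illustration.

```python
def place_scene_items(saved_items, new_anchor):
    """Re-position saved scene items relative to a new anchor location.
    Each item stores its offset from the saved scene anchor; opening the
    scene at a new anchor preserves the same spatial relationships.
    Minimal sketch; real systems would use full poses, not just offsets."""
    placed = {}
    for name, offset in saved_items.items():
        # World position = new anchor position + saved offset from anchor.
        placed[name] = tuple(a + o for a, o in zip(new_anchor, offset))
    return placed
```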

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

28.

DISPLAY SYSTEM WITH LOW-LATENCY PUPIL TRACKER

      
Application Number 18500868
Status Pending
Filing Date 2023-11-02
First Publication Date 2024-02-29
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system aligns the location of its exit pupil with the location of a viewer's pupil by changing the location of the portion of a light source that outputs light. The light source may include an array of pixels that output light, thereby allowing an image to be displayed on the light source. The display system includes a camera that captures images of the eye and negatives of the images are displayed by the light source. In the negative image, the dark pupil of the eye is a bright spot which, when displayed by the light source, defines the exit pupil of the display system. The location of the pupil of the eye may be tracked by capturing the images of the eye, and the location of the exit pupil of the display system may be adjusted by displaying negatives of the captured images using the light source.
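The negative-image step is the core of the scheme: inverting the eye camera frame turns the dark pupil into the one bright region, and displaying that negative on the pixel-array light source places the exit pupil at the eye's pupil. A minimal sketch, assuming 8-bit grey levels stored as lists of rows; the actual device pipeline is not specified at this level.

```python
def negative_image(frame, max_level=255):
    """Invert an eye-camera frame so the dark pupil becomes the single
    bright region. Displaying this negative on the pixel-array light
    source defines the display's exit pupil at the eye's pupil location.
    `frame` is a list of rows of grey levels (an assumed representation)."""
    return [[max_level - px for px in row] for row in frame]
```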

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays

29.

Set of accessory boxes

      
Application Number 29716337
Grant Number D1015871
Status In Force
Filing Date 2019-12-09
First Publication Date 2024-02-27
Grant Date 2024-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Hoit, Sarah
  • Palmer, James William
  • Gamez Castillejos, Daniel Marcelo
  • Palmer, Christopher G.

30.

Mobile device accessory with cameras

      
Application Number 29717206
Grant Number D1016119
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-02-27
Grant Date 2024-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

31.

CALIBRATION FOR VIRTUAL OR AUGMENTED REALITY SYSTEMS

      
Application Number 18266937
Status Pending
Filing Date 2021-12-21
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Gupta, Ankur
  • Souiai, Mohamed

Abstract

Techniques for addressing deformations in a virtual or augmented reality headset are described. In some implementations, cameras in a headset can obtain image data at different times as the headset moves through a series of poses of the headset. One or more miscalibration conditions for the headset that have occurred as the headset moved through the series of poses can be detected. The series of poses can be divided into groups of poses based on the one or more miscalibration conditions, and bundle adjustment for the groups of poses can be performed using a separate set of camera calibration data. The bundle adjustment for the poses in each group is performed using a same set of calibration data for the group. The camera calibration data for each group is estimated jointly with bundle adjustment estimation for the poses in the group.
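The grouping step can be sketched as splitting the pose sequence at each detected miscalibration event, so that every group can later receive its own calibration data during bundle adjustment. The event representation (indices of poses after which a miscalibration occurred) is an assumption for illustration.

```python
def split_poses_by_miscalibration(poses, event_indices):
    """Divide a pose sequence into groups at detected miscalibration
    events. `event_indices` marks poses after which a miscalibration
    condition was detected; each returned group would then use one shared
    set of camera calibration data. Sketch of the grouping step only."""
    groups, current = [], []
    events = set(event_indices)
    for i, pose in enumerate(poses):
        current.append(pose)
        if i in events:          # close the group at the event boundary
            groups.append(current)
            current = []
    if current:                  # flush the trailing group
        groups.append(current)
    return groups
```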

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

32.

CROSS REALITY SYSTEM WITH ACCURATE SHARED MAPS

      
Application Number 18457314
Status Pending
Filing Date 2023-08-28
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Velasquez, Miguel Andres Granados
  • Gomez Gonzalez, Javier Victorio
  • Prasad, Mukta
  • Guendelman, Eran
  • Shahrokni, Ali
  • Swaminathan, Ashwin

Abstract

A cross reality system enables any of multiple devices to efficiently and accurately access previously persisted maps of very large scale environments and render virtual content specified in relation to those maps. The cross reality system may build a persisted map, which may be in canonical form, by merging tracking maps from the multiple devices. A map merge process determines mergibility of a tracking map with a canonical map and merges a tracking map with a canonical map in accordance with mergibility criteria, such as, when a gravity direction of the tracking map aligns with a gravity direction of the canonical map. Refraining from merging maps if the orientation of the tracking map with respect to gravity is not preserved avoids distortions in persisted maps and results in multiple devices, which may use the maps to determine their locations, to present more realistic and immersive experiences for their users.
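The gravity-alignment criterion above can be sketched as an angle test between the two maps' gravity direction vectors. The 5-degree tolerance and the plain-vector representation are illustration-only assumptions; the abstract states the criterion, not its numeric form.

```python
import math

def mergeable(tracking_gravity, canonical_gravity, max_angle_deg=5.0):
    """Decide whether a tracking map may be merged into a canonical map by
    checking that their gravity directions agree within a tolerance, per
    the abstract's mergibility criterion. Threshold is an assumption."""
    dot = sum(a * b for a, b in zip(tracking_gravity, canonical_gravity))
    norm = math.sqrt(sum(a * a for a in tracking_gravity)) * \
           math.sqrt(sum(b * b for b in canonical_gravity))
    # Clamp for numerical safety before taking the arccosine.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg
```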

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

33.

SINGLE PUPIL RGB LIGHT SOURCE

      
Application Number 18260708
Status Pending
Filing Date 2022-01-07
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Hall, Heidi Leising
  • Trisnadi, Jahja L.

Abstract

Embodiments of this disclosure include systems and methods for displays. In embodiments, a display system includes a light source configured to emit a first light, a lens configured to receive the first light, and an image generator configured to receive the first light and emit a second light. The display system further includes a plurality of waveguides, where at least two of the plurality of waveguides include an in-coupling grating configured to selectively couple the second light. In some embodiments, the light source can comprise a single pupil light source having a reflector and a micro-LED array disposed in the reflector.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

34.

LIDAR SIMULTANEOUS LOCALIZATION AND MAPPING

      
Application Number 18264572
Status Pending
Filing Date 2022-02-11
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhou, Lipu
  • Swaminathan, Ashwin
  • Agarwal, Lomesh

Abstract

Disclosed herein are systems and methods for mapping environment information. In some embodiments, the systems and methods are configured for mapping information in a mixed reality environment. In some embodiments, the system is configured to perform a method including scanning an environment including capturing, with a sensor, a plurality of points of the environment; tracking a plane of the environment; updating observations associated with the environment by inserting a keyframe into the observations; determining whether the plane is coplanar with a second plane of the environment; in accordance with a determination that the plane is coplanar with the second plane, performing planar bundle adjustment on the observations associated with the environment; and in accordance with a determination that the plane is not coplanar with the second plane, performing planar bundle adjustment on a portion of the observations associated with the environment.
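The coplanarity decision that selects between the two bundle-adjustment branches can be sketched as a tolerance test on two plane parameterisations. The (unit normal, offset) form with n·x = d and the tolerance values are assumptions for illustration.

```python
def coplanar(plane_a, plane_b, normal_tol=1e-3, dist_tol=1e-2):
    """Test whether two planes, each given as (unit normal n, offset d)
    with n.x = d, are coplanar. A sketch of the decision step in the
    abstract; parameterisation and tolerances are assumptions."""
    na, da = plane_a
    nb, db = plane_b
    dot = sum(a * b for a, b in zip(na, nb))
    if dot < 0:                      # flip to a common orientation
        nb, db = tuple(-c for c in nb), -db
        dot = -dot
    # Normals must be parallel and offsets must agree within tolerance.
    return (1.0 - dot) < normal_tol and abs(da - db) < dist_tol
```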

IPC Classes

  • G06T 7/579 - Depth or shape recovery from multiple images from motion
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

35.

SYSTEMS AND METHODS FOR SIGN LANGUAGE RECOGNITION

      
Application Number 18357531
Status Pending
Filing Date 2023-07-24
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Browy, Eric
  • Woods, Michael Janusz
  • Rabinovich, Andrew

Abstract

A sensory eyewear system for a mixed reality device can facilitate a user's interactions with other people or with the environment. As one example, the sensory eyewear system can recognize and interpret a sign language, and present the translated information to a user of the mixed reality device. The wearable system can also recognize text in the user's environment, modify the text (e.g., by changing the content or display characteristics of the text), and render the modified text to occlude the original text.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 40/58 - Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

36.

METHODS FOR REFINING RGBD CAMERA POSES

      
Application Number 18384627
Status Pending
Filing Date 2023-10-27
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor Wei, Xiaolin

Abstract

A method for refining poses includes receiving a plurality of poses and computing a relative pose set by determining a first set of relative poses between image frame pairs for a first subset of the image frame pairs having a temporal separation between image frames of the image frame pairs less than a threshold, and determining a second set of relative poses between image frame pairs for a second subset of the image frame pairs having a temporal separation between image frames of the image frame pairs greater than the threshold.
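The split into two relative-pose sets reduces to partitioning frame pairs by their temporal separation against the threshold. The (timestamp_i, timestamp_j) pair representation is an assumption for illustration.

```python
def split_relative_pose_pairs(frame_pairs, threshold):
    """Partition image-frame pairs by temporal separation, mirroring the
    abstract's two relative-pose sets: pairs whose frames are closer in
    time than the threshold, and pairs that are farther apart. Pairs are
    (timestamp_i, timestamp_j) tuples; an assumed representation."""
    near, far = [], []
    for ti, tj in frame_pairs:
        (near if abs(tj - ti) < threshold else far).append((ti, tj))
    return near, far
```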

IPC Classes

  • H04N 23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

37.

TUNABLE ATTENUATION OF LIGHT TRANSMISSION ARTIFACTS IN WEARABLE DISPLAYS

      
Application Number 18497659
Status Pending
Filing Date 2023-10-30
First Publication Date 2024-02-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Manly, David
  • Mathur, Vaibhav
  • Haddock, Joshua Naaman
  • Messer, Kevin
  • Carlisle, Clinton

Abstract

A method for displaying an image using a wearable display system including directing display light from a display towards a user through an eyepiece to project images in the user's field of view, determining a relative location between an ambient light source and the eyepiece, and adjusting an attenuation of ambient light from the ambient light source through the eyepiece depending on the relative location between the ambient light source and the eyepiece.

IPC Classes

  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
  • G02F 1/13363 - Birefringent elements, e.g. for optical compensation
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • G02F 1/1347 - Arrangement of liquid crystal layers or cells in which the final condition of one light beam is achieved by the addition of the effects of two or more layers or cells
  • G02B 27/01 - Head-up displays
  • G02B 25/00 - Eyepieces; Magnifying glasses

38.

Head mounted audio-visual display system connection cable

      
Application Number 29709698
Grant Number D1015305
Status In Force
Filing Date 2019-10-16
First Publication Date 2024-02-20
Grant Date 2024-02-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Lundmark, David Charles
  • Sommers, Jeffrey Scott

39.

PROCESSING SECURE CONTENT ON A VIRTUAL REALITY SYSTEM

      
Application Number 18380099
Status Pending
Filing Date 2023-10-13
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Taylor, Robert Blake
  • Pastouchenko, Dmitry
  • Plourde, Frederic

Abstract

Described herein are techniques and technologies to identify an encrypted content within a field of view of a user of a VR/AR system and process the encrypted content appropriately. The user of the VR/AR technology may have protected content in a field of view of the user. Encrypted content is mapped to one or more protected surfaces on a display device. Contents mapped to a protected surface may be rendered on the display device but prevented from being replicated from the display device.

IPC Classes

  • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
  • G06F 3/14 - Digital output to display device
  • G06F 21/60 - Protecting data
  • G06F 21/84 - Protecting input, output or interconnection devices output devices, e.g. displays or monitors

40.

IMAGING MODIFICATION, DISPLAY AND VISUALIZATION USING AUGMENTED AND VIRTUAL REALITY EYEWEAR

      
Application Number 18475688
Status Pending
Filing Date 2023-09-27
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Harrises, Christopher M.
  • Abovitz, Rony
  • Baerenrodt, Mark
  • Schmidt, Brian Lloyd

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display augmented reality image content to the user. The display system can include one or more user sensors configured to sense the user and can include one or more environmental sensors configured to sense surroundings of the user. The display system can also include processing electronics in communication with the display, the one or more user sensors, and the one or more environmental sensors. The processing electronics can be configured to sense a situation involving user focus, determine user intent for the situation, and alter user perception of a real or virtual object within the vision field of the user based at least in part on the user intent and/or sensed situation involving user focus. The processing electronics can be configured to at least one of enhance or de-emphasize the user perception of the real or virtual object within the vision field of the user.

IPC Classes

  • G02B 27/01 - Head-up displays
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

41.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18493633
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor Browy, Eric C.

Abstract

Disclosed herein are systems and methods for distributed computing and/or networking for mixed reality systems. A method may include capturing an image via a camera of a head-wearable device. Inertial data may be captured via an inertial measurement unit of the head-wearable device. A position of the head-wearable device can be estimated based on the image and the inertial data via one or more processors of the head-wearable device. The image can be transmitted to a remote server. A neural network can be trained based on the image via the remote server. A trained neural network can be transmitted to the head-wearable device.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/14 - Digital output to display device
  • H04B 7/155 - Ground-based stations
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/18 - Eye characteristics, e.g. of the iris

42.

PAIRING WITH COMPANION DEVICE

      
Application Number 18493113
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Nitin
  • Kaehler, Adrian

Abstract

This disclosure describes techniques for device authentication and/or pairing. A display system can comprise a head mountable display, computer memory, and processor(s). In response to receiving a request to authenticate a connection between the display system and a companion device (e.g., controller or other computer device), first data may be determined, the first data based at least partly on audio data spoken by a user. The first data may be sent to an authentication device configured to compare the first data to second data received from the companion device, the second data based at least partly on the audio data. Based at least partly on a correspondence between the first and second data, the authentication device can send a confirmation to the display system to permit communication between the display system and companion device.
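The comparison of "first data" and "second data" derived from the same spoken audio can be sketched as both devices computing a digest of the shared audio and the authentication service checking correspondence. Using SHA-256 as the derivation is an assumption; the patent does not specify it.

```python
import hashlib

def pairing_digest(audio_bytes):
    """Derive comparison data from shared audio, as both the display
    system and the companion device might. SHA-256 is an assumed stand-in
    for whatever derivation the actual system uses."""
    return hashlib.sha256(audio_bytes).hexdigest()

def authenticate(display_data, companion_data):
    """Permit the connection only when both derivations correspond."""
    return display_data == companion_data
```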

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G02B 27/01 - Head-up displays
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • H04W 8/00 - Network data management
  • H04M 1/60 - Substation equipment, e.g. for use by subscribers including speech amplifiers
  • G06F 21/44 - Program or device authentication
  • H04L 9/40 - Network security protocols
  • H04W 12/50 - Secure pairing of devices
  • H04W 12/065 - Continuous authentication
  • H04W 12/069 - Authentication using certificates or pre-shared keys
  • H04W 12/77 - Graphical identity
  • G06V 20/80 - Recognising image objects characterised by unique random patterns

43.

SYSTEMS AND METHODS FOR CROSS-APPLICATION AUTHORING, TRANSFER, AND EVALUATION OF RIGGING CONTROL SYSTEMS FOR VIRTUAL CHARACTERS

      
Application Number 18493439
Status Pending
Filing Date 2023-10-24
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Wedig, Geoffrey
  • Bancroft, James Jonathan

Abstract

Various examples of cross-application systems and methods for authoring, transferring, and evaluating rigging control systems for virtual characters are disclosed. Embodiments of a method include the steps or processes of creating, in a first application which implements a first rigging control protocol, a rigging control system description; writing the rigging control system description to a data file; and initiating transfer of the data file to a second application. In such embodiments, the rigging control system description may be defined according to a different second rigging control protocol. The rigging control system description may specify a rigging control input, such as a lower-order rigging element (e.g., a core skeleton for a virtual character), and at least one rule for operating on the rigging control input to produce a rigging control output, such as a higher-order skeleton or other higher-order rigging element.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

44.

CROSS REALITY SYSTEM WITH LOCALIZATION SERVICE AND SHARED LOCATION-BASED CONTENT

      
Application Number 18496407
Status Pending
Filing Date 2023-10-27
First Publication Date 2024-02-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Caswell, Timothy Dean
  • Piascik, Konrad
  • Zolotarev, Leonid
  • Rushton, Mark Ashley

Abstract

A cross reality system enables any of multiple devices to efficiently render shared location-based content. The cross reality system may include a cloud-based service that responds to requests from devices to localize with respect to a stored map. The service may return to the device information that localizes the device with respect to the stored map. In conjunction with localization information, the service may provide information about locations in the physical world proximate the device for which virtual content has been provided. Based on information received from the service, the device may render, or stop rendering, virtual content to each of multiple users based on the user's location and specified locations for the virtual content.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 15/20 - Perspective computation

45.

EYEPIECE FOR HEAD-MOUNTED DISPLAY AND METHOD FOR MAKING THE SAME

      
Application Number 18238635
Status Pending
Filing Date 2023-08-28
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Chang, Chieh
  • Peroz, Christophe
  • Ong, Ryan Jason
  • Li, Ling
  • Bhagat, Sharad D.
  • Bhargava, Samarth

Abstract

A method includes providing a wafer including a first surface grating extending over a first area of a surface of the wafer and a second surface grating extending over a second area of the surface of the wafer; de-functionalizing a portion of the surface grating in at least one of the first surface grating area and the second surface grating area; and singulating an eyepiece from the wafer, the eyepiece including a portion of the first surface grating area and a portion of the second surface grating area. The first surface grating in the eyepiece corresponds to an input coupling grating for a head-mounted display and the second surface grating corresponds to a pupil expander grating for the head-mounted display.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 6/02 - Optical fibres with cladding

46.

CONCURRENT CAMERA CALIBRATION AND BUNDLE ADJUSTMENT

      
Application Number 18245816
Status Pending
Filing Date 2021-08-17
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Souai, Mohamed
  • Gupta, Ankur
  • Napolskikh, Igor

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for camera calibration during bundle adjustment. One of the methods includes maintaining a three-dimensional model of an environment and a plurality of image data clusters that each include data generated from images captured by two or more cameras included in a device. The method includes jointly determining, for a three-dimensional point represented by an image data cluster (i) the newly estimated coordinates for the three-dimensional point for an update to the three-dimensional model or a trajectory of the device, and (ii) the newly estimated calibration data that represents the spatial relationship between the two or more cameras.

IPC Classes

  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

47.

IMPRINT LITHOGRAPHY USING MULTI-LAYER COATING ARCHITECTURE

      
Application Number 18555502
Status Pending
Filing Date 2022-04-21
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Menezes, Marlon Edward
  • Singh, Vikramjit
  • Xu, Frank Y.

Abstract

Structures for forming an optical feature and methods for forming the optical feature are disclosed. In some embodiments, the structure comprises a patterned layer comprising a pattern corresponding to the optical feature; a base layer; and an intermediate layer bonded to the patterned layer and the base layer.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms

48.

COMPENSATION FOR DEFORMATION IN HEAD MOUNTED DISPLAY SYSTEMS

      
Application Number 18454912
Status Pending
Filing Date 2023-08-24
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Miller, Samuel A.
  • Grossmann, Etienne Gregoire
  • Clark, Brian Christopher
  • Johnson, Michael Robert
  • Zhao, Wenyi
  • Shah, Nukul Sanjay
  • Huang, Po-Kang

Abstract

The systems and methods described can include approaches to calibrate head-mounted displays for improved viewing experiences. Some methods include receiving data of a first target image associated with an undeformed state of a first eyepiece of a head-mounted display device; receiving data of a first captured image associated with a deformed state of the first eyepiece of the head-mounted display device; determining a first transformation that maps the first captured image to the first target image; and applying the first transformation to a subsequent image for viewing on the first eyepiece of the head-mounted display device.
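
The abstract does not specify the family of the mapping from the captured (deformed) image to the target image. As one simple illustrative choice (an assumption, not the filing's method), a planar homography can be fit from point correspondences with the direct linear transform (DLT):

```python
import numpy as np

def fit_homography(src, dst):
    # Estimate the 3x3 homography H with dst ~ H @ src (homogeneous) from
    # four or more point pairs, via the DLT null-space construction.
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    H = vt[-1].reshape(3, 3)          # right singular vector of smallest value
    return H / H[2, 2]                # fix the projective scale

def apply_homography(H, pts):
    # Map Nx2 points through H and de-homogenize.
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

Once fit from captured-versus-target correspondences, the same transform can be pre-applied to subsequent frames, matching the abstract's final step.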

IPC Classes

  • G06T 5/00 - Image enhancement or restoration
  • G02B 27/01 - Head-up displays
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

49.

EFFICIENT RENDERING OF VIRTUAL SOUNDFIELDS

      
Application Number 18486938
Status Pending
Filing Date 2023-10-13
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Schmidt, Brian Lloyd
  • Dicker, Samuel Charles

Abstract

An audio system and method of spatially rendering audio signals that uses modified virtual speaker panning is disclosed. The audio system may include a fixed number F of virtual speakers, and the modified virtual speaker panning may dynamically select and use a subset P of the fixed virtual speakers. The subset P of virtual speakers may be selected using a low energy speaker detection and culling method, a source geometry-based culling method, or both. One or more processing blocks in the decoder/virtualizer may be bypassed based on the energy level of the associated audio signal or the location of the sound source relative to the user/listener, respectively. In some embodiments, a virtual speaker that is designated as an active virtual speaker at a first time, may also be designated as an active virtual speaker at a second time to ensure the processing completes.
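
The low-energy culling and the one-frame keep-alive behavior described above can be sketched as follows. This is a hypothetical illustration, not the patented decoder: the per-speaker gain model and the threshold are assumptions.

```python
import numpy as np

def select_active_speakers(gains, prev_active, energy_floor=1e-4):
    # Cull virtual speakers whose panned energy falls below a floor, but keep
    # speakers active on the previous frame for one extra frame so their
    # processing (e.g. filter or reverb tails) can complete.
    energy = np.asarray(gains, dtype=float) ** 2
    active = {int(i) for i in np.flatnonzero(energy >= energy_floor)}
    return active | set(prev_active)
```

Bypassing the decoder/virtualizer blocks for the culled speakers is where the rendering savings come from; the hysteresis avoids audible clicks from abruptly dropping a speaker mid-tail.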

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G10L 19/008 - Multichannel audio signal coding or decoding using interchannel correlation to reduce redundancy, e.g. joint-stereo, intensity-coding or matrixing
  • G10L 25/21 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the type of extracted parameters the extracted parameters being power information
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

50.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18487794
Status Pending
Filing Date 2023-10-16
First Publication Date 2024-02-08
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez

Abstract

A virtual, augmented, or mixed reality display system includes a display configured to display virtual, augmented, or mixed reality image data, the display including one or more optical components which introduce optical distortions or aberrations to the image data. The system also includes a display controller configured to provide the image data to the display. The display controller includes memory for storing optical distortion correction information, and one or more processing elements to at least partially correct the image data for the optical distortions or aberrations using the optical distortion correction information.
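
One common way a display controller applies stored distortion-correction information is as a precomputed inverse sampling map. The sketch below (an assumption for illustration, not necessarily this system's pipeline) bilinearly resamples a grayscale image through such a map:

```python
import numpy as np

def remap_bilinear(image, map_x, map_y):
    # Corrected output pixel (i, j) samples the distorted input at the stored
    # coordinates (map_y[i, j], map_x[i, j]) with bilinear interpolation.
    h, w = image.shape
    x0 = np.clip(np.floor(map_x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(map_y).astype(int), 0, h - 2)
    fx, fy = map_x - x0, map_y - y0
    top = image[y0, x0] * (1 - fx) + image[y0, x0 + 1] * fx
    bot = image[y0 + 1, x0] * (1 - fx) + image[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```

Storing the map rather than the lens model keeps the per-pixel work constant, which suits a hardware display controller.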

IPC Classes

  • G06T 5/00 - Image enhancement or restoration
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06F 3/14 - Digital output to display device
  • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode

51.

SYSTEM AND METHOD FOR PRESENTING IMAGE CONTENT ON MULTIPLE DEPTH PLANES BY PROVIDING MULTIPLE INTRA-PUPIL PARALLAX VIEWS

      
Application Number 18490169
Status Pending
Filing Date 2023-10-19
First Publication Date 2024-02-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Konrad, Robert
  • Wetzstein, Gordon
  • Schowengerdt, Brian T.
  • Vaughn, Michal Beau Dennison

Abstract

An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity. Advantageously, the wavefront divergence, and the accommodation cue provided to the eye of the user, may be varied by appropriate selection of parallax disparity, which may be set by selecting the amount of spatial separation between the locations of light output.
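
The relation the abstract relies on can be stated paraxially (a textbook approximation, not language from the filing): a point source at distance $d$ produces ray angles that vary linearly with position $x$ across the pupil, so intra-pupil images entering at locations separated by $\Delta x$ should differ in angle accordingly.

```latex
\theta(x) \;=\; \frac{x}{d},
\qquad
\Delta\theta \;\approx\; \frac{\Delta x}{d},
\qquad
V \;=\; \frac{1}{d}\ \text{(vergence, in diopters)}.
```

Larger angular disparity $\Delta\theta$ per unit intra-pupil separation $\Delta x$ thus emulates a nearer, more divergent wavefront, and hence a nearer accommodation cue.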

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/339 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spatial multiplexing
  • H04N 13/398 - Synchronisation thereof; Control thereof

52.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18490518
Status Pending
Filing Date 2023-10-19
First Publication Date 2024-02-08
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Lundmark, David Charles
  • Broadmore, Gregory Michael

Abstract

An apparatus configured to be head-worn by a user, includes: a transparent screen configured to allow the user to see therethrough; a sensor system configured to sense a characteristic of a physical object in an environment in which the user is located; and a processing unit coupled to the sensor system, the processing unit configured to: cause the screen to display a user-controllable object, and cause the screen to display an image of a feature that is resulted from a virtual interaction between the user-controllable object and the physical object, so that the feature will appear to be a part of the physical object in the environment or appear to be emanating from the physical object.

IPC Classes

  • G09G 5/377 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position
  • G02B 27/01 - Head-up displays

53.

EYE IMAGING WITH AN OFF-AXIS IMAGER

      
Application Number 18354575
Status Pending
Filing Date 2023-07-18
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Kaehler, Adrian

Abstract

Examples of an imaging system for use with a head mounted display (HMD) are disclosed. The imaging system can include a forward-facing imaging camera and a surface of a display of the HMD can include an off-axis diffractive optical element (DOE) or hot mirror configured to reflect light to the imaging camera. The DOE or hot mirror can be segmented. The imaging system can be used for eye tracking, biometric identification, multiscopic reconstruction of the three-dimensional shape of the eye, etc.

IPC Classes

  • G02B 27/01 - Head-up displays
  • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for determining or recording eye movement
  • A61B 3/14 - Arrangements specially adapted for eye photography
  • A61B 3/12 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
  • A61B 5/00 - Measuring for diagnostic purposes ; Identification of persons
  • A61B 5/16 - Devices for psychotechnics; Testing reaction times

54.

DISPLAY SYSTEM AND METHOD FOR PROVIDING VARIABLE ACCOMMODATION CUES USING MULTIPLE INTRA-PUPIL PARALLAX VIEWS FORMED BY LIGHT EMITTER ARRAYS

      
Application Number 18482893
Status Pending
Filing Date 2023-10-08
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity may be selected using an array of shutters that selectively regulate the entry of image light into an eye. Each opened shutter in the array provides a different intra-pupil image, and the locations of the open shutters provide the desired amount of parallax disparity between the images. In some other embodiments, the images may be formed by an emissive micro-display. Each pixel formed by the micro-display may be formed by one of a group of light emitters, which are at different locations such that the emitted light takes different paths to the eye, the different paths providing different amounts of parallax disparity.

IPC Classes

  • G02B 27/01 - Head-up displays
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 27/30 - Collimators
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

55.

METHOD AND SYSTEM FOR PATTERNING A LIQUID CRYSTAL LAYER

      
Application Number 18482896
Status Pending
Filing Date 2023-10-08
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Komanduri, Ravi Kumar
  • Oh, Chulwoo

Abstract

An optical master is created by using a nanoimprint alignment layer to pattern a liquid crystal layer. The nanoimprint alignment layer and the liquid crystal layer constitute the optical master. The optical master is positioned above a photo-alignment layer. The optical master is illuminated and light propagating through the nanoimprinted alignment layer and the liquid crystal layer is diffracted and subsequently strikes the photo-alignment layer. The incident diffracted light causes the pattern in the liquid crystal layer to be transferred to the photo-alignment layer. A second liquid crystal layer is deposited onto the patterned photo-alignment layer, which subsequently is used to align the molecules of the second liquid crystal layer. The second liquid crystal layer in the patterned photo-alignment layer may be utilized as a replica optical master or as a diffractive optical element for directing light in optical devices such as augmented reality display devices.

IPC Classes

  • G03H 1/02 - HOLOGRAPHIC PROCESSES OR APPARATUS - Details peculiar thereto - Details

56.

TILTING ARRAY BASED DISPLAY

      
Application Number 18483981
Status Pending
Filing Date 2023-10-10
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Sissom, Bradley Jay
  • Curtis, Kevin Richard
  • Cheng, Hui-Chuan
  • Schuck, Iii, Miller Harry
  • Bhargava, Samarth

Abstract

This disclosure describes a wearable display system configured to project light to the eye(s) of a user to display virtual (e.g., augmented reality) image content in a vision field of the user. The system can include light source(s) that output light, spatial light modulator(s) that modulate the light to provide the virtual image content, and an eyepiece configured to convey the modulated light toward the eye(s) of the user. The eyepiece can include waveguide(s) and a plurality of in-coupling optical elements arranged on or in the waveguide(s) to in-couple the modulated light received from the spatial light modulator(s) into the waveguide(s) to be guided toward the user's eye(s). The spatial light modulator(s) may be movable, and/or may include movable components, to direct different portions of the modulated light toward different ones of the in-coupling optical elements at different times.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 25/00 - Eyepieces; Magnifying glasses

57.

SYSTEMS, METHODS, AND DEVICES FOR ADHESION OF INTERIOR WAVEGUIDE PILLARS

      
Application Number 18257516
Status Pending
Filing Date 2021-12-21
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Li, Ling
  • Peroz, Christophe
  • Chang, Chieh
  • Bhagat, Sharad D.
  • Ong, Ryan Jason
  • Karbasi, Ali
  • Rugg, Stephen Richard
  • Melli, Mauro
  • Messer, Kevin
  • Hill, Brian George
  • West, Melanie Maputol

Abstract

In some embodiments, a near-eye display system comprises a stack of waveguides having pillars in a central, active portion of the waveguides. The active portion may include light outcoupling optical elements configured to outcouple image light from the waveguides towards the eye of a viewer. The pillars extend between and separate neighboring ones of the waveguides. The light outcoupling optical elements may include diffractive optical elements that are formed simultaneously with the pillars, for example, by imprinting or casting. The pillars are disposed on one or more major surfaces of each of the waveguides. The pillars may define a distance between two adjacent waveguides of the stack of waveguides. The pillars may be bonded to adjacent waveguides using one or more of the systems, methods, or devices described herein. The bonding provides a high level of thermal stability to the waveguide stack, to resist deformation as temperatures change.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

58.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING VERTICAL ALIGNMENT BETWEEN LEFT AND RIGHT DISPLAYS AND A USER'S EYES

      
Application Number 18486848
Status Pending
Filing Date 2023-10-13
First Publication Date 2024-02-01
Owner Magic Leap, Inc. (USA)
Inventor Vlaskamp, Bjorn Nicolaas Servatius

Abstract

A wearable device may include a head-mounted display (HMD) for rendering a three-dimensional (3D) virtual object which appears to be located in an ambient environment of a user of the display. The relative positions of the HMD and one or more eyes of the user may not be in desired positions to receive image information outputted by the HMD. For example, the HMD-to-eye vertical alignment may be different between the left and right eyes. The wearable device may determine if the HMD is level on the user's head and may then provide the user with a left-eye alignment marker and a right-eye alignment marker. Based on user feedback, the wearable device may determine if there is any left-right vertical misalignment and may take actions to reduce or minimize the effects of any misalignment.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position

59.

Furniture accessory with cameras

      
Application Number 29717260
Grant Number D1013007
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-01-30
Grant Date 2024-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

60.

VIRTUAL OBJECT MOVEMENT SPEED CURVE FOR VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS

      
Application Number 18340778
Status Pending
Filing Date 2023-06-23
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Xu, Yan
  • Fushiki, Ikko
  • Shanbhag, Suraj Manjunath
  • Das, Shiuli
  • Lee, Jung-Suk

Abstract

Systems and methods for regulating the speed of movement of virtual objects presented by a wearable system are described. The wearable system may present three-dimensional (3D) virtual content that moves, e.g., laterally across the user's field of view and/or in perceived depth from the user. The speed of the movement may follow the profile of an S-curve, with a gradual increase to a maximum speed, and a subsequent gradual decrease in speed until an end point of the movement is reached. The decrease in speed may be more gradual than the increase in speed. This speed curve may be utilized in the movement of virtual objects for eye-tracking calibration. The wearable system may track the position of a virtual object (an eye-tracking target) which moves with a speed following the S-curve. This speed curve allows for rapid movement of the eye-tracking target, while providing a comfortable viewing experience and high accuracy in determining the initial and final positions of the eye as it tracks the target.
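
The profile described (a smooth rise to a maximum speed, then a more gradual fall to zero) can be sketched with piecewise smoothstep ramps. The `rise` and `fall` fractions below are illustrative assumptions, not values from the filing:

```python
import numpy as np

def smoothstep(x):
    # Classic cubic ease: 0 at x<=0, 1 at x>=1, zero slope at both ends.
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3 - 2 * x)

def s_curve_speed(t, rise=0.25, fall=0.45):
    # Normalized speed over t in [0, 1]: smooth ramp up over the first `rise`
    # of the motion, cruise at 1, then a longer (hence more gradual) smooth
    # ramp down over the final `fall`.
    up = smoothstep(t / rise)
    down = 1.0 - smoothstep((t - (1.0 - fall)) / fall)
    return up * down
```

Making `fall > rise` encodes the abstract's asymmetry: the target decelerates more gently than it accelerates, which eases accurate fixation at the end point.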

IPC Classes

  • G02C 7/06 - Lenses; Lens systems multifocal
  • G02C 7/02 - Lenses; Lens systems
  • G02B 27/01 - Head-up displays
  • G02B 27/36 - Fiducial marks or measuring scales within the optical system adjustable

61.

CROSS REALITY SYSTEM WITH SIMPLIFIED PROGRAMMING OF VIRTUAL CONTENT

      
Application Number 18353775
Status Pending
Filing Date 2023-07-17
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhang, Haiyan
  • Macdonald, Robert John Cummings

Abstract

A cross reality system that renders virtual content generated by executing native mode applications may be configured to render web-based content using components that render content from native applications. The system may include a Prism manager that provides Prisms in which content from executing native applications is rendered. For rendering web based content, a browser, accessing the web based content, may be associated with a Prism and may render content into its associated Prism, creating the same immersive experience for the user as when content is generated by a native application. The user may access the web application from the same program launcher menu as native applications. The system may have tools that enable a user to access these capabilities, including by creating for a web location an installable entity that, when processed by the system, results in an icon for the web content in a program launcher menu.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 9/54 - Interprogram communication
  • G06F 16/955 - Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
  • G06F 16/954 - Navigation, e.g. using categorised browsing
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons

62.

NEURAL NETWORK FOR EYE IMAGE SEGMENTATION AND IMAGE QUALITY ESTIMATION

      
Application Number 18455093
Status Pending
Filing Date 2023-08-24
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Spizhevoy, Alexey
  • Kaehler, Adrian
  • Badrinarayanan, Vijay

Abstract

Systems and methods for eye image segmentation and image quality estimation are disclosed. In one aspect, after receiving an eye image, a device such as an augmented reality device can process the eye image using a convolutional neural network with a merged architecture to generate both a segmented eye image and a quality estimation of the eye image. The segmented eye image can include a background region, a sclera region, an iris region, or a pupil region. In another aspect, a convolutional neural network with a merged architecture can be trained for eye image segmentation and image quality estimation. In yet another aspect, the device can use the segmented eye image to determine eye contours such as a pupil contour and an iris contour. The device can use the eye contours to create a polar image of the iris region for computing an iris code or biometric authentication.
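
The final step the abstract mentions, mapping the iris region between the pupil and iris contours to a polar image, resembles Daugman-style rubber-sheet normalization. The sketch below assumes circular, concentric contours and nearest-neighbour sampling for brevity; real contours from the segmentation network would generally not be circles.

```python
import numpy as np

def unwrap_iris(image, center, r_pupil, r_iris, n_radii=16, n_angles=64):
    # Sample the annulus between the pupil and iris contours onto a
    # rectangular (radius x angle) polar image.
    cy, cx = center
    radii = np.linspace(r_pupil, r_iris, n_radii)
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    ys = cy + radii[:, None] * np.sin(angles)[None, :]
    xs = cx + radii[:, None] * np.cos(angles)[None, :]
    h, w = image.shape
    yi = np.clip(np.rint(ys).astype(int), 0, h - 1)
    xi = np.clip(np.rint(xs).astype(int), 0, w - 1)
    return image[yi, xi]
```

The fixed-size polar image is what makes a position-independent iris code possible: the same iris texture lands in the same cells regardless of pupil dilation or eye position.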

IPC Classes

  • G06T 7/12 - Edge-based segmentation
  • G06T 7/11 - Region-based segmentation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06V 10/56 - Extraction of image or video features relating to colour
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • G06F 18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06T 7/10 - Segmentation; Edge detection
  • G06T 7/00 - Image analysis

63.

EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18213124
Status Pending
Filing Date 2023-06-22
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Messer, Kevin

Abstract

An eyepiece waveguide for an augmented reality display system. The eyepiece waveguide can include an input coupling grating (ICG) region. The ICG region can couple an input beam into the substrate of the eyepiece waveguide as a guided beam. A first combined pupil expander-extractor (CPE) grating region can be formed on or in a surface of the substrate. The first CPE grating region can receive the guided beam, create a first plurality of diffracted beams at a plurality of distributed locations, and out-couple a first plurality of output beams. The eyepiece waveguide can also include a second CPE grating region formed on or in the opposite surface of the substrate. The second CPE grating region can receive the guided beam, create a second plurality of diffracted beams at a plurality of distributed locations, and out-couple a second plurality of output beams.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising

64.

EFFICIENT LOCALIZATION BASED ON MULTIPLE FEATURE TYPES

      
Application Number 18353851
Status Pending
Filing Date 2023-07-17
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhou, Lipu
  • Swaminathan, Ashwin
  • Steinbruecker, Frank Thomas
  • Koppel, Daniel Esteban

Abstract

A method of efficiently and accurately computing a pose of an image with respect to other image information. The image may be acquired with a camera on a portable device and the other information may be a map, such that the computation of pose localizes the device relative to the map. Such a technique may be applied in a cross reality system to enable devices to efficiently and accurately access previously persisted maps. Localizing with respect to a map may enable multiple cross reality devices to render virtual content at locations specified in relation to those maps, providing an enhanced experience for users of the system. The method may be used in other devices and for other purposes, such as for navigation of autonomous vehicles.
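
When device features carry depth and are matched to 3D map points, the pose computation reduces to rigid alignment of two point sets. The sketch below uses the classic Kabsch algorithm on 3D-3D correspondences as a stand-in; the filing's actual multi-feature-type formulation is not reproduced here.

```python
import numpy as np

def estimate_pose(map_pts, device_pts):
    # Recover the rigid transform (R, t) with device_pts ~ R @ map_pts + t
    # from matched 3D points (Kabsch algorithm, least-squares optimal).
    mu_m = map_pts.mean(axis=0)
    mu_d = device_pts.mean(axis=0)
    H = (map_pts - mu_m).T @ (device_pts - mu_d)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_m
    return R, t
```

In a localization pipeline this closed-form solve is typically wrapped in RANSAC over the feature matches, since some correspondences will be wrong.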

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

65.

THREE DIMENSIONAL VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18481090
Status Pending
Filing Date 2023-10-04
First Publication Date 2024-01-25
Owner Magic Leap, Inc. (USA)
Inventor Macnamara, John Graham

Abstract

A system may comprise a selectively transparent projection device for projecting an image toward an eye of a viewer from a projection device position in space relative to the eye of the viewer, the projection device being capable of assuming a substantially transparent state when no image is projected; an occlusion mask device coupled to the projection device and configured to selectively block light traveling toward the eye from one or more positions opposite of the projection device from the eye of the viewer in an occluding pattern correlated with the image projected by the projection device; and a zone plate diffraction patterning device interposed between the eye of the viewer and the projection device and configured to cause light from the projection device to pass through a diffraction pattern having a selectable geometry as it travels to the eye.

IPC Classes

  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
  • G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
  • G02B 27/01 - Head-up displays
  • G02B 5/18 - Diffracting gratings
  • G03B 35/18 - Stereoscopic photography by simultaneous viewing

66.

LOW PROFILE INTERCONNECT FOR LIGHT EMITTER

      
Application Number 18476611
Status Pending
Filing Date 2023-09-28
First Publication Date 2024-01-18
Owner Magic Leap, Inc. (USA)
Inventor Curtis, Kevin

Abstract

In some embodiments, an interconnect electrically connects a light emitter to wiring on a substrate. The interconnect may be deposited by 3D printing and lies flat on the light emitter and substrate. In some embodiments, the interconnect has a generally rectangular or oval cross-sectional profile and extends above the light emitter to a height of about 50 μm or less, or about 35 μm or less. This small height allows close spacing between an overlying optical structure and the light emitter, thereby providing high efficiency in the injection of light from the light emitter into the optical structure, such as a light pipe.

IPC Classes

  • H01L 33/62 - Arrangements for conducting electric current to or from the semiconductor body, e.g. leadframe, wire-bond or solder balls
  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

67.

ILLUMINATION LAYOUT FOR COMPACT PROJECTION SYSTEM

      
Application Number US2023026455
Publication Number 2024/015217
Status In Force
Filing Date 2023-06-28
Publication Date 2024-01-18
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Uhlendorf, Kristina
  • Curtis, Kevin Richard
  • Tekolste, Robert D.
  • Singh, Vikramjit

Abstract

An apparatus including a set of three illumination sources disposed in a first plane. Each of the set of three illumination sources is disposed at a position in the first plane offset from others of the set of three illumination sources by 120 degrees measured in polar coordinates. The apparatus also includes a set of three waveguide layers disposed adjacent the set of three illumination sources. Each of the set of three waveguide layers includes an incoupling diffractive element disposed at a lateral position offset by 180 degrees from a corresponding illumination source of the set of three illumination sources.
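The angular layout this abstract describes can be sketched numerically. In the sketch below the radius, angular origin, and function names are illustrative assumptions, not taken from the patent:

```python
import math

def source_positions(radius):
    """Three illumination sources in the first plane, each offset from
    the others by 120 degrees measured in polar coordinates."""
    return [(radius * math.cos(math.radians(a)),
             radius * math.sin(math.radians(a)))
            for a in (0.0, 120.0, 240.0)]

def incoupler_position(source_xy):
    """Incoupling diffractive element at a lateral position offset by
    180 degrees from its corresponding source (same radial distance)."""
    x, y = source_xy
    return (-x, -y)
```

For example, the incoupler for a source at (r, 0) sits at (-r, 0), diametrically opposite across the center of the first plane.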

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/18 - Diffracting gratings
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing

68.

Portable camera device

      
Application Number 29717216
Grant Number D1011401
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-01-16
Grant Date 2024-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

69.

SCANNING MIRROR SYSTEMS AND METHODS OF MANUFACTURE

      
Application Number 18370009
Status Pending
Filing Date 2023-09-19
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Gamet, Julien
  • Gamper, Stephan Arthur

Abstract

A scanning micromirror system includes a base having an axis passing therethrough, a plurality of support flexures coupled to the base, and a platform coupled to the base by the plurality of support flexures. The platform has a first side and a second side opposing the first side and is operable to oscillate about the axis. The scanning micromirror system also includes a stress relief layer positioned on the first side of the platform and a reflector positioned on the first side of the platform. The stress relief layer is positioned between the reflector and the platform.

IPC Classes

  • G02B 26/10 - Scanning systems
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • B81B 3/00 - Devices comprising flexible or deformable elements, e.g. comprising elastic tongues or membranes
  • G02B 27/01 - Head-up displays

70.

METHOD OF REDUCING OPTICAL ARTIFACTS

      
Application Number 18371888
Status Pending
Filing Date 2023-09-22
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Cheng, Hui-Chuan
  • Greco, Paul M.
  • Welch, William Hudson
  • Browy, Eric C.
  • Schuck, III, Miller Harry
  • Sissom, Bradley Jay

Abstract

A method of reducing optical artifacts includes injecting a light beam generated by an illumination source into a polarizing beam splitter (PBS), reflecting a spatially defined portion of the light beam from a display panel, reflecting, at an interface in the PBS, the spatially defined portion of the light beam towards a projector lens, passing at least a portion of the spatially defined portion of the light beam through a circular polarizer disposed between the PBS and the projector lens, reflecting, by one or more elements of the projector lens, a return portion of the spatially defined portion of the light beam, and attenuating, at the circular polarizer, the return portion of the spatially defined portion of the light beam.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00 and G02B 30/00, for polarising
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G02B 5/30 - Polarising elements
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

71.

METHOD AND SYSTEM FOR EYEPIECE WAVEGUIDE DISPLAYS UTILIZING MULTI-DIRECTIONAL LAUNCH ARCHITECTURES

      
Application Number US2022043721
Publication Number 2024/010600
Status In Force
Filing Date 2022-09-15
Publication Date 2024-01-11
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Khandekar, Chinmay
  • Tekolste, Robert, D.
  • Singh, Vikramjit
  • Liu, Victor Kai
  • Ong, Ryan
  • Uhlendorf, Kristina

Abstract

An eyepiece waveguide for augmented reality applications includes a substrate and a set of incoupling diffractive optical elements coupled to the substrate. A first subset of the set of incoupling diffractive optical elements is operable to diffract light into the substrate along a first range of propagation angles and a second subset of the set of incoupling diffractive optical elements is operable to diffract light into the substrate along a second range of propagation angles. The eyepiece waveguide also includes a combined pupil expander diffractive optical element coupled to the substrate.

IPC Classes

72.

HIGHLY TRANSMISSIVE EYEPIECE ARCHITECTURE

      
Application Number US2023027172
Publication Number 2024/010956
Status In Force
Filing Date 2023-07-07
Publication Date 2024-01-11
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Khandekar, Chinmay
  • Tekolste, Robert D.
  • Faraji-Dana, Mohammadsadegh
  • Singh, Vikramjit

Abstract

An eyepiece includes a substrate, an input coupling grating on a first side of the substrate, and a morphed grating comprising characteristics of both a primary grating and a secondary grating on at least the first side of the substrate. The primary grating and the secondary grating may differ in pitch, orientation, and dimensions.

IPC Classes

73.

METHODS, DEVICES, AND SYSTEMS FOR ILLUMINATING SPATIAL LIGHT MODULATORS

      
Application Number 18470801
Status Pending
Filing Date 2023-09-20
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Chung, Hyunsun
  • Trisnadi, Jahja I.
  • Carlisle, Clinton
  • Curtis, Kevin Richard
  • Oh, Chulwoo

Abstract

An optical device may include a wedge-shaped light turning element. The optical device can include a first surface that is parallel to a horizontal axis and a second surface opposite to the first surface that is inclined with respect to the horizontal axis by a wedge angle. The optical device may include a light module that includes a plurality of light emitters. The light module can be configured to combine light for the plurality of emitters. The optical device can further include a light input surface that is between the first and the second surfaces and is disposed with respect to the light module to receive light emitted from the plurality of emitters. The optical device may include an end reflector that is disposed on a side opposite the light input surface. The second surface may be inclined such that a height of the light input surface is less than a height of the side opposite the light input surface. The light coupled into the wedge-shaped light turning element may be reflected by the end reflector and/or reflected from the second surface towards the first surface.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/24 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using incandescent filaments
  • G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00 and G02B 30/00
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 5/30 - Polarising elements
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering

74.

MIXED REALITY VIRTUAL REVERBERATION

      
Application Number 18471071
Status Pending
Filing Date 2023-09-20
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Tajik, Anastasia Andreyevna
  • Jot, Jean-Marc

Abstract

A method of presenting an audio signal to a user of a mixed reality environment is disclosed, the method comprising the steps of detecting a first audio signal in the mixed reality environment, where the first audio signal is a real audio signal; identifying a virtual object intersected by the first audio signal in the mixed reality environment; identifying a listener coordinate associated with the user; determining, using the virtual object and the listener coordinate, a transfer function; applying the transfer function to the first audio signal to produce a second audio signal; and presenting, to the user, the second audio signal.
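The pipeline in this abstract (derive a transfer function from the intersected virtual object and the listener coordinate, then apply it to the real signal) can be sketched as a toy model. The delayed-impulse transfer function and both of its parameters are assumptions for illustration, not the patent's method:

```python
import numpy as np

def transfer_function(absorption, distance):
    """Toy transfer function: a delayed, attenuated impulse response
    derived from the intersected object's absorption and the listener's
    distance (both hypothetical parameters)."""
    delay = int(distance)                      # samples of propagation delay
    gain = (1.0 - absorption) / (1.0 + distance)
    h = np.zeros(delay + 1)
    h[delay] = gain
    return h

def present_audio(first_signal, absorption, distance):
    """Apply the transfer function to the detected (real) first audio
    signal to produce the second signal presented to the user."""
    h = transfer_function(absorption, distance)
    return np.convolve(first_signal, h)
```

Applying it to a unit impulse yields a single delayed, scaled copy; a realistic system would instead convolve with a measured or modeled room impulse response.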

IPC Classes

  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • G06F 3/16 - Sound input; Sound output
  • H04N 21/439 - Processing of audio elementary streams

75.

IMMERSIVE AUDIO PLATFORM

      
Application Number 18471216
Status Pending
Filing Date 2023-09-20
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Jot, Jean-Marc
  • Minnick, Michael
  • Pastouchenko, Dmitry
  • Simon, Michael Aaron
  • Scott, III, John Emmitt
  • Bailey, Richard St. Clair
  • Balasubramanyan, Shivakumar
  • Agadi, Harsharaj

Abstract

Disclosed herein are systems and methods for presenting audio content in mixed reality environments. A method may include receiving a first input from an application program; in response to receiving the first input, receiving, via a first service, an encoded audio stream; generating, via the first service, a decoded audio stream based on the encoded audio stream; receiving, via a second service, the decoded audio stream; receiving a second input from one or more sensors of a wearable head device; receiving, via the second service, a third input from the application program, wherein the third input corresponds to a position of one or more virtual speakers; generating, via the second service, a spatialized audio stream based on the decoded audio stream, the second input, and the third input; presenting, via one or more speakers of the wearable head device, the spatialized audio stream.

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays

76.

STACKED WAVEGUIDES HAVING DIFFERENT DIFFRACTION GRATINGS FOR COMBINED FIELD OF VIEW

      
Application Number 18471740
Status Pending
Filing Date 2023-09-21
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Parthiban, Vikraman

Abstract

In one aspect, an optical device comprises a plurality of waveguides formed over one another and having formed thereon respective diffraction gratings, wherein the respective diffraction gratings are configured to diffract visible light incident thereon into respective waveguides, such that visible light diffracted into the respective waveguides propagates therewithin. The respective diffraction gratings are configured to diffract the visible light into the respective waveguides within respective fields of view (FOVs) with respect to layer normal directions of the respective waveguides. The respective FOVs are such that the plurality of waveguides are configured to diffract the visible light within a combined FOV that is continuous and greater than each of the respective FOVs.

IPC Classes

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
  • G02B 27/01 - Head-up displays
  • H04N 9/31 - Projection devices for colour picture display
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00 and G02B 30/00
  • G02B 27/42 - Diffraction optics
  • G02B 5/18 - Diffracting gratings

77.

AUTOMATIC PLACEMENT OF A VIRTUAL OBJECT IN A THREE-DIMENSIONAL SPACE

      
Application Number 18473017
Status Pending
Filing Date 2023-09-22
First Publication Date 2024-01-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Hoover, Paul Armistead
  • Mann, Jonathan Lawrence

Abstract

Augmented reality systems and methods for automatically repositioning a virtual object with respect to a destination object in a three-dimensional (3D) environment of a user are disclosed. The systems and methods can automatically attach the target virtual object to the destination object and re-orient the target virtual object based on the affordances of the virtual object or the destination object. The systems and methods can also track the movement of a user and detach the virtual object from the destination object when the user's movement passes a threshold condition.
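The detachment behavior described above can be illustrated with a minimal sketch. Treating the threshold condition as a simple distance test on the user's movement is an assumption; the patent covers more general conditions:

```python
import math

def should_detach(attach_point, user_position, threshold):
    """Detach the virtual object from its destination object once the
    user's displacement from the attachment point exceeds a threshold
    distance (one simple form of the threshold condition)."""
    return math.dist(attach_point, user_position) > threshold
```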

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/156 - Mixing image signals
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

78.

Combined packaging insert and tray

      
Application Number 29716355
Grant Number D1010446
Status In Force
Filing Date 2019-12-09
First Publication Date 2024-01-09
Grant Date 2024-01-09
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Awad, Haney
  • Palmer, James William
  • Gamez Castillejos, Daniel Marcelo
  • Palmer, Christopher G.

79.

TRANSMODAL INPUT FUSION FOR MULTI-USER GROUP INTENT PROCESSING IN VIRTUAL ENVIRONMENTS

      
Application Number 18252574
Status Pending
Filing Date 2021-11-09
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Lacey, Paul
  • Schwab, Brian David
  • Miller, Samuel A.
  • Sands, John Andrew
  • Bryant, Colman Thomas

Abstract

This document describes imaging and visualization systems in which the intent of a group of users in a shared space is determined and acted upon. In one aspect, a method includes identifying, for a group of users in a shared virtual space, a respective objective for each of two or more of the users in the group of users. For each of the two or more users, a determination is made, based on inputs from multiple sensors having different input modalities, of a respective intent of the user. At least a portion of the multiple sensors are sensors of a device of the user that enables the user to participate in the shared virtual space. A determination is made, based on the respective intent, whether the user is performing the respective objective for the user. Output data is generated and provided based on the respective objectives and respective intents.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

80.

MASSIVE SIMULTANEOUS REMOTE DIGITAL PRESENCE WORLD

      
Application Number 18306387
Status Pending
Filing Date 2023-04-25
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor Abovitz, Rony

Abstract

Various methods and apparatus are described herein for enabling one or more users to interface with virtual or augmented reality environments. An example system includes a computing network having computer servers interconnected through high bandwidth interfaces to gateways for processing data and/or for enabling communication of data between the servers and one or more local user interface devices. The servers include memory, processing circuitry, and software for designing and/or controlling virtual worlds, as well as for storing and processing user data and data provided by other components of the system. One or more virtual worlds may be presented to a user through a user device for the user to experience and interact with. A large number of users may each use a device to simultaneously interface with one or more digital worlds by using the device to observe and interact with each other and with objects produced within the digital worlds.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

81.

IMAGE DESCRIPTOR NETWORK WITH IMPOSED HIERARCHICAL NORMALIZATION

      
Application Number 18368153
Status Pending
Filing Date 2023-09-14
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor Sato, Koichi

Abstract

Techniques are disclosed for using and training a descriptor network. An image may be received and provided to the descriptor network. The descriptor network may generate an image descriptor based on the image. The image descriptor may include a set of elements distributed between a major vector comprising a first subset of the set of elements and a minor vector comprising a second subset of the set of elements. The second subset of the set of elements may include more elements than the first subset of the set of elements. A hierarchical normalization may be imposed onto the image descriptor by normalizing the major vector to a major normalization amount and normalizing the minor vector to a minor normalization amount. The minor normalization amount may be less than the major normalization amount.
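The hierarchical normalization itself is straightforward to sketch. The split point and the 1.0/0.5 normalization amounts below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def hierarchical_normalize(descriptor, n_major, major_amount=1.0, minor_amount=0.5):
    """Impose a hierarchical normalization: split the descriptor into a
    short major vector and a longer minor vector, then L2-normalize each
    to its own target amount (minor_amount < major_amount)."""
    major, minor = descriptor[:n_major], descriptor[n_major:]
    major = major_amount * major / np.linalg.norm(major)
    minor = minor_amount * minor / np.linalg.norm(minor)
    return np.concatenate([major, minor])
```

For example, splitting an 8-element descriptor after 3 elements leaves the minor vector with more elements than the major vector, matching the abstract.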

IPC Classes

  • G06F 16/56 - Information retrieval; Database structures therefor; File system structures therefor of still image data having vectorial format
  • G06N 3/08 - Learning methods
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

82.

WAYPOINT CREATION IN MAP DETECTION

      
Application Number 18467064
Status Pending
Filing Date 2023-09-14
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Busto, Javier Antonio
  • Brodsky, Jonathan

Abstract

An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate a mesh map. The AR device can project graphics at designated locations on a virtual bounding box to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to look toward waypoints to generate the mesh map of the user's environment.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 17/05 - Geographic models
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

83.

TOOL BRIDGE

      
Application Number 18468629
Status Pending
Filing Date 2023-09-15
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Bailey, Richard St. Clair
  • Fong, Chun-Ip
  • Bridgewater, Erle Robert

Abstract

Disclosed herein are systems and methods for sharing and synchronizing virtual content. A method may include receiving, from a host application via a wearable device comprising a transmissive display, a first data package comprising first data; identifying virtual content based on the first data; presenting a view of the virtual content via the transmissive display; receiving, via the wearable device, first user input directed at the virtual content; generating second data based on the first data and the first user input; sending, to the host application via the wearable device, a second data package comprising the second data, wherein the host application is configured to execute via one or more processors of a computer system remote to the wearable device and in communication with the wearable device.

IPC Classes

  • G06F 30/12 - Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

84.

INDIVIDUAL VIEWING IN A SHARED SPACE

      
Application Number 18470286
Status Pending
Filing Date 2023-09-19
First Publication Date 2024-01-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Alexander, IV, Earle M.
  • Arroyo, Pedro Luis
  • Venerin, Jean I.
  • Adams, William

Abstract

A mixed reality virtual environment is sharable among multiple users through the use of multiple view modes that are selectable by a presenter. Multiple users may wish to view a common virtual object, such as one that is used for educational purposes, such as a piece of art in a museum, automobile, biological specimen, chemical compound, etc. The virtual object may be presented in a virtual room to any number of users. A presentation may be controlled by a presenter (e.g., a teacher of a class of students) that leads multiple participants (e.g., students) through information associated with the virtual object. Use of different viewing modes allows individual users to see different virtual content despite being in a shared viewing space or alternatively, to see the same virtual content in different locations within a shared space.

IPC Classes

  • G09B 5/12 - Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously
  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

85.

Display panel or portion thereof with a transitional mixed reality graphical user interface

      
Application Number 29640776
Grant Number D1009925
Status In Force
Filing Date 2018-03-16
First Publication Date 2024-01-02
Grant Date 2024-01-02
Owner Magic Leap, Inc. (USA)
Inventor
  • Pazmino, Lorena
  • Heiner, Cole Parker
  • Tran, Gregory Minh
  • Machicado, Sarah

86.

EYEPIECE FOR VIRTUAL, AUGMENTED, OR MIXED REALITY SYSTEMS

      
Application Number 18348909
Status Pending
Filing Date 2023-07-07
First Publication Date 2023-12-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Tekolste, Robert Dale
  • Welch, William Hudson
  • Browy, Eric
  • Liu, Victor Kai
  • Bhargava, Samarth

Abstract

An eyepiece for an augmented reality display system. The eyepiece can include a waveguide substrate. The waveguide substrate can include an input coupler grating (ICG), an orthogonal pupil expander (OPE) grating, a spreader grating, and an exit pupil expander (EPE) grating. The ICG can couple at least one input light beam into at least a first guided light beam that propagates inside the waveguide substrate. The OPE grating can divide the first guided light beam into a plurality of parallel, spaced-apart light beams. The spreader grating can receive the light beams from the OPE grating and spread their distribution. The spreader grating can include diffractive features oriented at approximately 90° to diffractive features of the OPE grating. The EPE grating can re-direct the light beams from the first OPE grating and the first spreader grating such that they exit the waveguide substrate.

IPC Classes

  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 6/02 - Optical fibres with cladding
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

87.

METASURFACES WITH ASYMMETRIC GRATINGS FOR REDIRECTING LIGHT AND METHODS FOR FABRICATING

      
Application Number 18463671
Status Pending
Filing Date 2023-09-08
First Publication Date 2023-12-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Lin, Dianmin
  • Melli, Mauro
  • St. Hilaire, Pierre
  • Peroz, Christophe
  • Poliakov, Evgeni

Abstract

An optical system comprises an optically transmissive substrate comprising a multilevel metasurface which comprises a grating comprising a plurality of multilevel unit cells. Each unit cell comprises, on a lowermost level, a laterally-elongated first lowermost level nanobeam having a first width and a laterally-elongated second lowermost level nanobeam having a second width larger than the first width. Each unit cell further comprises, on an uppermost level, a laterally-elongated first uppermost level nanobeam above the first lowermost level nanobeam and a laterally-elongated second uppermost level nanobeam above the second lowermost level nanobeam.

IPC Classes

  • G02B 27/01 - Head-up displays
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • G02B 5/30 - Polarising elements
  • H04N 13/349 - Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
  • G02B 30/35 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers using reflective optical elements in the optical path between the images and the observer

88.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS

      
Application Number 18462891
Status Pending
Filing Date 2023-09-07
First Publication Date 2023-12-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez
  • Nourai, Reza

Abstract

A virtual or augmented reality display system that controls power inputs to the display system as a function of image data. Image data itself is made of a plurality of image data frames, each with constituent color components of, and depth planes for displaying on, rendered content. Light sources or spatial light modulators to relay illumination from the light sources may receive signals from a display controller to adjust a power setting to the light source or spatial light modulator, and/or control depth of displayed image content, based on control information embedded in an image data frame.
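A minimal sketch of reading control information embedded in an image data frame, as the abstract describes. The byte layout, field names, and placement at the frame's tail are entirely hypothetical, not the patent's format:

```python
import struct

# Hypothetical layout: the tail of each image data frame carries embedded
# control information (all field names and widths are assumptions).
CONTROL_STRUCT = struct.Struct("<BBH")  # light-source power, SLM power, depth-plane id

def read_control_info(frame_bytes):
    """Extract the embedded control information from the end of a frame,
    for use by a display controller to adjust power settings and depth."""
    power, slm_power, depth_plane = CONTROL_STRUCT.unpack_from(
        frame_bytes, len(frame_bytes) - CONTROL_STRUCT.size)
    return {"light_source_power": power,
            "slm_power": slm_power,
            "depth_plane": depth_plane}
```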

IPC Classes

  • H04N 23/65 - Control of camera operation in relation to power supply
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/10 - Beam splitting or combining systems

89.

SYSTEMS AND METHODS FOR OPERATING A HEAD-MOUNTED DISPLAY SYSTEM BASED ON USER IDENTITY

      
Application Number 18463201
Status Pending
Filing Date 2023-09-07
First Publication Date 2023-12-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Das, Shiuli
  • Shanbhag, Suraj Manjunath
  • Narain, Abhishek
  • Xu, Yan

Abstract

Systems and methods for depth plane selection in display system such as augmented reality display systems, including mixed reality display systems, are disclosed. A display(s) may present virtual image content via image light to an eye(s) of a user. The display(s) may output the image light to the eye(s) of the user, the image light to have different amounts of wavefront divergence corresponding to different depth planes at different distances away from the user. A camera(s) may capture images of the eye(s). An indication may be generated based on obtained images of the eye(s), indicating whether the user is identified. The display(s) may be controlled to output the image light to the eye(s) of the user, the image light to have the different amounts of wavefront divergence based at least in part on the generated indication indicating whether the user is identified.

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00 and G02B 30/00
  • G02B 27/01 - Head-up displays
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06V 40/18 - Eye characteristics, e.g. of the iris

90.

METASURFACES FOR REDIRECTING LIGHT AND METHODS FOR FABRICATING

      
Application Number 18464732
Status Pending
Filing Date 2023-09-11
First Publication Date 2023-12-28
Owner Magic Leap, Inc. (USA)
Inventor
  • Lin, Dianmin
  • Melli, Mauro
  • St. Hilaire, Pierre
  • Peroz, Christophe
  • Poliakov, Evgeni

Abstract

A display system comprises a waveguide having light incoupling or light outcoupling optical elements formed of a metasurface. The metasurface is a multilevel (e.g., bi-level) structure having a first level defined by spaced apart protrusions formed of a first optically transmissive material and a second optically transmissive material between the protrusions. The metasurface also includes a second level formed by the second optically transmissive material. The protrusions on the first level may be patterned by nanoimprinting the first optically transmissive material, and the second optically transmissive material may be deposited over and between the patterned protrusions. The widths of the protrusions and the spacing between the protrusions may be selected to diffract light, and a pitch of the protrusions may be 10-600 nm.

IPC Classes

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
  • G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals with wavelength selective means
  • G02B 27/01 - Head-up displays

91.

Device controller

      
Application Number 29866009
Grant Number D1009001
Status In Force
Filing Date 2022-08-24
First Publication Date 2023-12-26
Grant Date 2023-12-26
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Swinton, Matthew David
  • Urban, Hayes

92.

ARCHITECTURES AND METHODS FOR OUTPUTTING DIFFERENT WAVELENGTH LIGHT OUT OF WAVEGUIDES

      
Application Number 18241772
Status Pending
Filing Date 2023-09-01
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert Dale
  • Klug, Michael Anthony
  • Schowengerdt, Brian T.

Abstract

Architectures are provided for selectively outputting light for forming images, the light having different wavelengths and being outputted with low levels of crosstalk. In some embodiments, light is incoupled into a waveguide and deflected to propagate in different directions, depending on wavelength. The incoupled light is then outcoupled by outcoupling optical elements that outcouple light based on the direction of propagation of the light. In some other embodiments, color filters are disposed between a waveguide and outcoupling elements. The color filters limit the wavelengths of light that interact with and are outcoupled by the outcoupling elements. In yet other embodiments, a different waveguide is provided for each range of wavelengths to be outputted. Incoupling optical elements selectively incouple light of the appropriate range of wavelengths into a corresponding waveguide, from which the light is outcoupled.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/42 - Diffraction optics
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

93.

CONVERTING A 2D POSITIONAL INPUT INTO A 3D POINT IN SPACE

      
Application Number 18459811
Status Pending
Filing Date 2023-09-01
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor Mccall, Marc Alan

Abstract

A user may interact and select positions in three-dimensional space arbitrarily through conversion of a two-dimensional positional input into a three-dimensional point in space. The system may allow a user to use one or more user input devices for pointing, annotating, or drawing on virtual objects, real objects or empty space in reference to the location of the three-dimensional point in space within an augmented reality or mixed reality session.
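The 2D-to-3D conversion can be illustrated with a simple ray-casting sketch: the 2D input tilts a ray cast from the input device, and the ray is extended to a chosen depth. The function name, the local -z forward axis, and the depth parameter are illustrative assumptions, not the patented conversion.

```python
import numpy as np

def unproject_2d_to_3d(uv, origin, rotation, depth):
    """Map a normalized 2D input (u, v in [-1, 1]) to a 3D point.

    A ray is built in the device's local frame, tilted by the 2D
    coordinates, rotated into world space, and extended to the
    requested depth from the device's origin.
    """
    # Direction in device-local space: the 2D input tilts the forward ray.
    local_dir = np.array([uv[0], uv[1], -1.0])
    local_dir /= np.linalg.norm(local_dir)
    world_dir = rotation @ local_dir      # rotate into world space
    return origin + depth * world_dir     # point at the chosen depth

# Example: device at the world origin with identity orientation.
point = unproject_2d_to_3d((0.0, 0.0), np.zeros(3), np.eye(3), 2.0)
# A centered input maps straight ahead to [0, 0, -2].
```

The resulting point can then anchor pointing, annotating, or drawing on virtual objects, real objects, or empty space.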

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

94.

SYSTEMS AND METHODS FOR DISPLAY BINOCULAR DEFORMATION COMPENSATION

      
Application Number US2023068550
Publication Number 2023/245146
Status In Force
Filing Date 2023-06-15
Publication Date 2023-12-21
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Schuelke, Aaron Mark
  • Lopez, Alejandro
  • Nguyen, Bach
  • Mareno, Jason Donald
  • Kulessa, Sebastian
  • Abbott, Michael Derek
  • Schabacker, Charles Robert
  • Fan, Chuanyang

Abstract

An eyewear device to be worn on the head of a user for presenting virtual content comprises a frame structure having a frame front, and an optical assembly having a first rigidity. The optical assembly has a chassis and a plurality of optical components affixed to the chassis. The eyewear device further comprises a plurality of mounts mechanically coupling the chassis of the optical assembly to the frame front, at least one of the plurality of mounts having a second rigidity less than the first rigidity, such that the mount(s) is configured for preventing at least a portion of a first static deformation load applied to the frame front from being mechanically communicated to the optical assembly.

IPC Classes

95.

POWER SUPPLY ASSEMBLY WITH FAN ASSEMBLY FOR ELECTRONIC DEVICE

      
Application Number 18243048
Status Pending
Filing Date 2023-09-06
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Aguirre, John
  • Jin, Youlin
  • Remsburg, Ralph
  • Rohena, Guillermo Padin
  • Rynk, Evan Francis
  • Pedroza, Carlos Julio Suate
  • Quartana, Jr., Gary
  • Fraser, Bradley
  • Awad, Haney
  • Wheeler, William
  • Natsume, Shigeru

Abstract

A fan assembly is disclosed. The fan assembly can include a first support frame. The fan assembly can comprise a shaft assembly having a first end coupled with the first support frame and a second end disposed away from the first end. A second support frame can be coupled with the first support frame and disposed at or over the second end of the shaft assembly. An impeller can have fan blades coupled with a hub, the hub being disposed over the shaft assembly for rotation between the first and second support frames about a longitudinal axis. Transverse loading on the shaft assembly can be controlled by the first and second support frames.

IPC Classes

  • G06F 1/18 - Packaging or power distribution
  • H05K 7/20 - Modifications to facilitate cooling, ventilating, or heating
  • H05K 7/14 - Mounting supporting structure in casing or on frame or rack
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 1/16 - Constructional details or arrangements
  • G06F 1/20 - Cooling means
  • G02B 27/01 - Head-up displays
  • F04D 29/60 - Mounting; Assembling; Disassembling
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • F04D 25/06 - Units comprising pumps and their driving means the pump being electrically driven
  • F04D 29/58 - Cooling; Heating; Diminishing heat transfer
  • F04D 29/42 - Casings; Connections for working fluid for radial or helico-centrifugal pumps

96.

METHODS AND SYSTEMS FOR AUDIO SIGNAL FILTERING

      
Application Number 18455585
Status Pending
Filing Date 2023-08-24
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Jot, Jean-Marc
  • Dicker, Samuel Charles

Abstract

Systems and methods for rendering audio signals are disclosed. In some embodiments, a method may receive an input signal including a first portion and a second portion. A first processing stage comprising a first filter is applied to the first portion to generate a first filtered signal. A second processing stage comprising a second filter is applied to the first portion to generate a second filtered signal. A third processing stage comprising a third filter is applied to the second portion to generate a third filtered signal. A fourth processing stage comprising a fourth filter is applied to the second portion to generate a fourth filtered signal. A first output signal is determined based on a sum of the first filtered signal and the third filtered signal. A second output signal is determined based on a sum of the second filtered signal and the fourth filtered signal. The first output signal is presented to a first ear of a user of a virtual environment, and the second output signal is presented to a second ear of the user. The first portion of the input signal corresponds to a first location in the virtual environment, and the second portion of the input signal corresponds to a second location in the virtual environment.
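The four-stage filter-and-sum structure in the abstract can be sketched directly. This is an illustrative skeleton only: FIR convolution stands in for whatever the filters actually are, and the function name is assumed.

```python
import numpy as np

def render_binaural(portion1, portion2, f1, f2, f3, f4):
    """Apply four filter stages and combine them into two ear signals.

    f1/f2 filter the first portion (e.g. left/right filters chosen for
    its source location); f3/f4 filter the second portion. The left
    output sums the first and third filtered signals, the right output
    the second and fourth, mirroring the abstract's structure.
    """
    s1 = np.convolve(portion1, f1)   # first filtered signal
    s2 = np.convolve(portion1, f2)   # second filtered signal
    s3 = np.convolve(portion2, f3)   # third filtered signal
    s4 = np.convolve(portion2, f4)   # fourth filtered signal
    return s1 + s3, s2 + s4          # (first-ear, second-ear) outputs
```

In a real renderer each filter pair would encode the spatial cues (e.g. head-related responses) for that portion's location in the virtual environment.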

IPC Classes

  • H04S 1/00 - Two-channel systems
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

97.

VOICE ONSET DETECTION

      
Application Number 18459342
Status Pending
Filing Date 2023-08-31
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Lee, Jung-Suk
  • Jot, Jean-Marc

Abstract

In some embodiments, a first audio signal is received via a first microphone, and a first probability of voice activity is determined based on the first audio signal. A second audio signal is received via a second microphone, and a second probability of voice activity is determined based on the first and second audio signals. Whether a first threshold of voice activity is met is determined based on the first and second probabilities of voice activity. In accordance with a determination that a first threshold of voice activity is met, it is determined that a voice onset has occurred, and an alert is transmitted to a processor based on the determination that the voice onset has occurred. In accordance with a determination that a first threshold of voice activity is not met, it is not determined that a voice onset has occurred.
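The probability-fusion-and-threshold step can be sketched as follows. The weighted-average fusion rule, the weight, and the threshold value are illustrative assumptions; the abstract does not specify how the two probabilities are combined.

```python
def detect_voice_onset(p_single, p_pair, threshold=0.7, weight=0.5):
    """Fuse two voice-activity probabilities and test an onset threshold.

    p_single: probability derived from the first microphone alone.
    p_pair:   probability derived from both microphones together
              (e.g. exploiting their spatial correlation).
    Returns True when the fused probability meets the threshold,
    i.e. when a voice-onset alert should be sent to the processor.
    """
    combined = weight * p_single + (1 - weight) * p_pair
    return combined >= threshold
```

Gating the alert this way lets a low-power detector wake a main processor only when both microphone-level evidence sources agree strongly enough.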

IPC Classes

  • G10L 25/78 - Detection of presence or absence of voice signals
  • G10L 25/51 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use for comparison or discrimination
  • H04R 3/00 - Circuits for transducers
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 5/04 - Circuit arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 17/18 - Complex mathematical operations for evaluating statistical data

98.

SPATIAL AUDIO FOR INTERACTIVE AUDIO ENVIRONMENTS

      
Application Number 18461289
Status Pending
Filing Date 2023-09-05
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor
  • Jot, Jean-Marc
  • Dicker, Samuel Charles
  • Schmidt, Brian Lloyd
  • Audfray, Remi Samuel

Abstract

Systems and methods of presenting an output audio signal to a listener located at a first location in a virtual environment are disclosed. According to embodiments of a method, an input audio signal is received. For each sound source of a plurality of sound sources in the virtual environment, a respective first intermediate audio signal corresponding to the input audio signal is determined, based on a location of the respective sound source in the virtual environment, and the respective first intermediate audio signal is associated with a first bus. For each of the sound sources of the plurality of sound sources in the virtual environment, a respective second intermediate audio signal is determined. The respective second intermediate audio signal corresponds to a reflection of the input audio signal in a surface of the virtual environment. The respective second intermediate audio signal is determined based on a location of the respective sound source, and further based on an acoustic property of the virtual environment. The respective second intermediate audio signal is associated with a second bus. The output audio signal is presented to the listener via the first bus and the second bus.
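The two-bus structure (direct signals on a first bus, surface reflections on a second) can be sketched with toy acoustics. The distance attenuation, fixed reflection delay, and reflection gain here are stand-ins for the real room model; all names and values are assumptions.

```python
import numpy as np

def mix_buses(sources, listener_pos, reflection_gain=0.3, reflection_delay=8):
    """Sum per-source direct signals into a first bus and simple
    surface reflections into a second bus, then combine the buses.

    Each source is a (signal, position) pair. The reflection is a
    delayed, attenuated copy of the direct signal, standing in for a
    bounce whose gain/delay would come from the virtual environment's
    acoustic properties.
    """
    length = max(len(sig) for sig, _ in sources) + reflection_delay
    direct_bus = np.zeros(length)
    reflect_bus = np.zeros(length)
    for signal, pos in sources:
        dist = np.linalg.norm(np.asarray(pos) - np.asarray(listener_pos))
        gain = 1.0 / max(dist, 1.0)            # direct-path attenuation
        direct_bus[:len(signal)] += gain * signal
        reflect_bus[reflection_delay:reflection_delay + len(signal)] += (
            reflection_gain * gain * signal)
    return direct_bus + reflect_bus
```

Keeping the two buses separate until the final mix lets the reflection bus be processed (filtered, reverberated) independently of the direct bus, which is the point of the bus architecture.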

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • H04R 5/033 - Headphones for stereophonic communication
  • H04R 5/04 - Circuit arrangements
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic
  • G10K 15/10 - Arrangements for producing a reverberation or echo sound using time-delay networks comprising electromechanical or electro-acoustic devices
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • G02B 27/01 - Head-up displays
  • H04R 3/12 - Circuits for transducers for distributing signals to two or more loudspeakers

99.

EYE TRACKING BASED VIDEO TRANSMISSION AND COMPRESSION

      
Application Number 18037709
Status Pending
Filing Date 2021-09-16
First Publication Date 2023-12-21
Owner Magic Leap, Inc. (USA)
Inventor Babu J D, Praveen

Abstract

A computer-implemented method includes receiving gaze information about an observer of a video stream; determining a video compression spatial map for the video stream based on the received gaze information and performance characteristics of a network connection with the observer; compressing the video stream according to the video compression spatial map; and sending the compressed video stream to the observer.
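A gaze-driven compression spatial map can be sketched as a quality surface that peaks at the gaze point and scales with available bandwidth. The Gaussian falloff, sigma, and clipping floor are illustrative choices, not the claimed method.

```python
import numpy as np

def compression_map(width, height, gaze_xy, bandwidth_factor=1.0, sigma=0.2):
    """Build a per-block quality map for foveated video compression.

    Quality is highest near the gaze point (normalized [0, 1]
    coordinates) and falls off with distance; the whole map is scaled
    down when the network connection's bandwidth_factor is low, so
    the encoder spends fewer bits everywhere.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    # Normalized distance of each block from the gaze point.
    dx = xs / max(width - 1, 1) - gaze_xy[0]
    dy = ys / max(height - 1, 1) - gaze_xy[1]
    dist2 = dx**2 + dy**2
    quality = np.exp(-dist2 / (2 * sigma**2))   # foveal peak = 1.0
    return np.clip(quality * bandwidth_factor, 0.05, 1.0)
```

The encoder would then map each block's quality value to a quantization or resolution setting before sending the compressed stream to the observer.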

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/50 - Depth or shape recovery
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • H04N 19/59 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution

100.

SYSTEMS AND METHODS FOR HUMAN GAIT ANALYSIS, REAL-TIME FEEDBACK AND REHABILITATION USING AN EXTENDED-REALITY DEVICE

      
Application Number US2022072909
Publication Number 2023/244267
Status In Force
Filing Date 2022-06-13
Publication Date 2023-12-21
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Nyman, Jr., Edward
  • Shironoshita, Emilio
  • Leider, Colby
  • Esposito, Jennifer

Abstract

Systems and methods for performing gait analysis, diagnosis, rehabilitation, and real-time feedback for treating gait disorders using extended reality, such as augmented reality. Image data is captured using native camera(s) on an extended-reality headset worn by a subject while walking. The images are analyzed using a location and mapping algorithm to determine head-pose data regarding the position and location of the head of the subject. One or more gait attributes of the subject's gait are determined by analyzing the head-pose data using a gait-metric prediction algorithm. The gait attributes are analyzed to determine gait disorders, rehabilitation treatment, rehabilitation assessments, and/or rehabilitation feedback.
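How head-pose data can yield a gait attribute is easiest to see with cadence: the head bobs roughly once per step, so counting peaks in the vertical head position gives a crude step rate. This peak-counting sketch is a stand-in for the patent's gait-metric prediction algorithm; the function name and thresholds are assumptions.

```python
import numpy as np

def estimate_cadence(head_heights, timestamps):
    """Estimate walking cadence (steps/minute) from head-pose height.

    Counts local maxima of the vertical head position (one bob per
    step while walking) and divides by the elapsed time in minutes.
    """
    h = np.asarray(head_heights)
    # A sample is a peak if it strictly exceeds both neighbors.
    peaks = np.flatnonzero((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:])) + 1
    duration_min = (timestamps[-1] - timestamps[0]) / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0

# Synthetic 10 s walk: head height oscillating at 2 steps per second.
t = np.linspace(0, 10, 1000)
heights = 1.7 + 0.02 * np.sin(2 * np.pi * 2.0 * t)
cadence = estimate_cadence(heights, t)   # ~120 steps/minute
```

Attributes like this, computed per session, could then feed the rehabilitation assessments and feedback the abstract describes.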

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G02B 27/01 - Head-up displays
  • G06F 1/16 - Constructional details or arrangements
  • G06F 1/3234 - Power saving characterised by the action undertaken
  • H04N 5/232 - Devices for controlling television cameras, e.g. remote control