Snap Inc.

United States of America

1-100 of 901 patent results for Snap Inc. (Patents, World - WIPO, excluding subsidiaries)

Date
  • New (last 4 weeks): 21
  • 2024 April (month to date): 14
  • 2024 March: 56
  • 2024 February: 15
  • 2024 January: 25

IPC Class
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer: 199
  • G06T 19/00 - Manipulating 3D models or images for computer graphics: 143
  • G02B 27/01 - Head-up displays: 133
  • H04L 12/58 - Message switching systems: 56
  • G06F 1/16 - Constructional details or arrangements: 49

1.

PHONE CASE FOR TRACKING AND LOCALIZATION

      
Application Number US2023077192
Publication Number 2024/086645
Status In Force
Filing Date 2023-10-18
Publication Date 2024-04-25
Owner SNAP INC. (USA)
Inventor
  • Canberk, Ilteris Kaan
  • Hallberg, Matthew
  • Zhuang, Richard

Abstract

A case for a portable device like a smartphone includes light sources such as LEDs, which, when illuminated, can be detected and tracked by a head-worn augmented or virtual reality device. The light sources may be located at the corners of the case and may emit infrared light. A relative pose between the smartphone and the head-worn device can be determined based on computer vision techniques performed on images captured by the head-worn device that include light from the light sources. Relative movement between the smartphone and the head-worn device can be used to provide user input to the head-worn device, as can touch input on the portable device. In some instances, the case is powered inductively from the portable device.
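
A minimal sketch of the pose computation the abstract describes, using OpenCV's solvePnP to recover the case's pose from four detected corner LEDs. The LED layout, pixel detections, and camera intrinsics are invented for illustration; none of these values come from the patent.

```python
# Relative pose of a phone case from four corner LEDs (illustrative values).
import cv2
import numpy as np

# Known 3D positions of the corner LEDs in the case's own frame (meters).
led_points_case = np.array([
    [-0.035, -0.075, 0.0],   # bottom-left
    [ 0.035, -0.075, 0.0],   # bottom-right
    [ 0.035,  0.075, 0.0],   # top-right
    [-0.035,  0.075, 0.0],   # top-left
], dtype=np.float64)

# 2D pixel coordinates of the LEDs detected in the head-worn camera image.
led_points_image = np.array([
    [412.0, 520.0], [618.0, 515.0], [625.0, 180.0], [405.0, 186.0]
], dtype=np.float64)

# Assumed intrinsics of the head-worn device's infrared camera.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(led_points_case, led_points_image, K, np.zeros(5))
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix: case frame -> camera frame
    print("case position in camera frame (m):", tvec.ravel())
```

Relative movement between case and headset (and hence user input) would fall out of tracking how `rvec`/`tvec` change over time.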

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G02B 27/01 - Head-up displays
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

2.

HEAD PROPERTY DETECTION IN DISPLAY-ENABLED WEARABLE DEVICES

      
Application Number US2023077092
Publication Number 2024/086580
Status In Force
Filing Date 2023-10-17
Publication Date 2024-04-25
Owner SNAP INC. (USA)
Inventor
  • Olgun, Ugur
  • You, Choonshin
  • Zhang, Bo Ya

Abstract

A display-enabled eyewear device has an integrated head sensor that dynamically and continuously measures or detects various cephalic parameters of a wearer's head. The head sensor includes a loop coupler system integrated in a lens-carrying frame to sense proximate ambient RF absorption influenced by head presence, size, and/or distance. Autonomous device management dynamically adjusts or causes adjustment of selected device features based on currently detected values for the cephalic parameters, which can include wear status, head size, and frame-head spacing.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

3.

STYLIZING A WHOLE-BODY OF A PERSON

      
Application Number US2023076997
Publication Number 2024/086534
Status In Force
Filing Date 2023-10-16
Publication Date 2024-04-25
Owner SNAP INC. (USA)
Inventor
  • Rami Koujan, Mohammad
  • Kokkinos, Iason

Abstract

Methods and systems are disclosed for performing real-time stylizing operations. The system receives an image that includes a depiction of a whole body of a real-world person. The system applies a machine learning model to the image to generate a stylized version of the whole body of the real-world person corresponding to a given style, the machine learning model being trained using training data to establish a relationship between a plurality of training images depicting synthetically rendered whole bodies of persons and corresponding ground-truth stylized versions of the whole bodies of the persons of the given style. The system replaces the depiction of the whole body of the real-world person in the image with the generated stylized version of the whole body of the real-world person.
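
A minimal sketch of the compositing step the abstract implies: run the frame through a stylization network, then swap the stylized body back in through a person mask. `stylize_model` and `segment_person` are hypothetical stand-ins for the trained models, not an actual API.

```python
def apply_whole_body_style(image, stylize_model, segment_person):
    """image: HxWx3 float array in [0, 1]; returns the composited frame."""
    mask = segment_person(image)        # HxW float mask, 1 where the body is
    stylized = stylize_model(image)     # HxWx3 stylized rendering
    mask3 = mask[..., None]             # broadcast the mask over color channels
    # Stylized pixels inside the body mask, original pixels everywhere else.
    return mask3 * stylized + (1.0 - mask3) * image
```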

IPC Classes

  • G06T 11/00 - 2D [Two Dimensional] image generation

4.

SIGN LANGUAGE INTERPRETATION WITH COLLABORATIVE AGENTS

      
Application Number US2023077007
Publication Number 2024/086538
Status In Force
Filing Date 2023-10-16
Publication Date 2024-04-25
Owner SNAP INC. (USA)
Inventor
  • Zhou, Kai
  • Pounds, Jennica
  • Robotka, Zsolt
  • Kajtár, Márton Gergely

Abstract

A method for recognizing sign language using collaborative augmented reality devices is described. In one aspect, a method includes accessing a first image generated by a first augmented reality device and a second image generated by a second augmented reality device, the first image and the second image depicting a hand gesture of a user of the first augmented reality device, synchronizing the first augmented reality device with the second augmented reality device, in response to the synchronizing, distributing one or more processes of a sign language recognition system between the first and second augmented reality devices, collecting results from the one or more processes from the first and second augmented reality devices, and displaying, in near real-time in a first display of the first augmented reality device, text indicating a sign language translation of the hand gesture based on the results.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

5.

TRACTABLE BODY-BASED AR SYSTEM INPUT

      
Application Number US2023034559
Publication Number 2024/081152
Status In Force
Filing Date 2023-10-05
Publication Date 2024-04-18
Owner SNAP INC. (USA)
Inventor
  • Alvarez, Attila
  • Kajtár, Márton Gergely
  • Pocsi, Peter
  • Pounds, Jennica
  • Retek, David
  • Robotka, Zsolt

Abstract

A hand-tracking platform generates gesture components for use as user inputs into an application of an Augmented Reality (AR) system. In some examples, the hand-tracking platform generates real-world scene environment frame data based on gestures being made by a user of the AR system using a camera component of the AR system. The hand-tracking platform recognizes a gesture component based on the real-world scene environment frame data and generates gesture component data based on the gesture component. The application utilizes the gesture component data as user input in a user interface of the application.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

6.

ENERGY-EFFICIENT ADAPTIVE 3D SENSING

      
Application Number US2023034564
Publication Number 2024/081154
Status In Force
Filing Date 2023-10-05
Publication Date 2024-04-18
Owner SNAP INC. (USA)
Inventor
  • Wang, Jian
  • Ma, Sizhuo
  • Tilmon, Brevin
  • Wu, Yicheng
  • Krishnan Gorumkonda, Gurunandan
  • Zahreddine, Ramzi
  • Evangelidis, Georgios

Abstract

An energy-efficient adaptive 3D sensing system. The adaptive 3D sensing system includes one or more cameras and one or more projectors. The adaptive 3D sensing system captures images of a real-world scene using the one or more cameras and computes depth estimates and depth estimate confidence values for pixels of the images. The adaptive 3D sensing system computes an attention mask based on the one or more depth estimate confidence values and commands the one or more projectors to send a distributed laser beam into one or more areas of the real-world scene based on the attention mask. The adaptive 3D sensing system captures 3D sensing image data of the one or more areas of the real-world scene and generates 3D sensing data for the real-world scene based on the 3D sensing image data.
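
A rough sketch of the attention-mask idea, assuming per-pixel depth confidences in [0, 1]: low-confidence pixels form the mask, which is then coarsened into cells the projector can steer the laser toward. The threshold, grid size, and names are illustrative assumptions.

```python
import numpy as np

def attention_mask(depth_confidence, threshold=0.7):
    """depth_confidence: HxW array in [0, 1]. True where the depth estimate
    is unreliable and active (laser) 3D sensing is still needed."""
    return depth_confidence < threshold

def laser_targets(mask, grid=32):
    """Coarsen the mask to projector steering cells: a cell is targeted if
    any pixel inside it needs sensing."""
    h, w = mask.shape
    cells = mask[:h - h % grid, :w - w % grid]        # crop to grid multiple
    cells = cells.reshape(h // grid, grid, w // grid, grid)
    return cells.any(axis=(1, 3))                     # (h//grid, w//grid) grid
```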

IPC Classes

  • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04N 13/128 - Adjusting depth or disparity

7.

REMOTE ANNOTATION AND NAVIGATION IN AUGMENTED REALITY

      
Application Number US2023034731
Publication Number 2024/081184
Status In Force
Filing Date 2023-10-09
Publication Date 2024-04-18
Owner SNAP INC. (USA)
Inventor
  • Canberk, Ilteris Kaan
  • Hallberg, Matthew
  • Jung, Bernhard

Abstract

Systems, methods, and computer readable media for remote annotations, drawings, and navigation instructions sent to an augmented reality (AR) wearable device from a computing device are disclosed. The AR wearable device captures images and sends them to the remote computing device to provide a real-time view of what the user of the AR wearable device sees. A user of the remote computing device can add navigation instructions and can select an image to annotate or draw on. The AR wearable device provides 3-dimensional (3D) coordinate information within a 3D world of the AR wearable device for the selected image. The user of the remote computing device then annotates or draws on the selected image. The remote computing device determines 3D coordinates for the annotations and drawings within the 3D world of the AR wearable device. The annotations and drawings are sent to the AR wearable device with associated 3D coordinates.
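
One plausible way to lift a 2D annotation into the AR device's 3D world, assuming the device supplies a depth value and camera pose for the selected image. This is standard pinhole unprojection, not code from the patent.

```python
import numpy as np

def annotation_to_world(u, v, depth, K, cam_to_world):
    """(u, v): annotation pixel; depth: metric depth at that pixel;
    K: 3x3 camera intrinsics; cam_to_world: 4x4 pose of the capturing camera."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])    # direction, camera frame
    p_cam = ray * depth                               # 3D point, camera frame
    p_world = cam_to_world @ np.append(p_cam, 1.0)    # into the AR world frame
    return p_world[:3]
```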

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]

8.

AR SYSTEM BENDING CORRECTION

      
Application Number US2023034375
Publication Number 2024/076571
Status In Force
Filing Date 2023-10-03
Publication Date 2024-04-11
Owner SNAP INC. (USA)
Inventor
  • Kalkgruber, Matthias
  • Pereira Torres, Tiago Miguel
  • Welge, Weston
  • Zahreddine, Ramzi

Abstract

A system for deformation or bending correction in an Augmented Reality (AR) system. Sensors are positioned in a frame of a head-worn AR system to sense forces or pressure acting on the frame by temple pieces attached to the frame. The sensed forces or pressure are used in conjunction with a model of the frame to determine a corrected model of the frame. The corrected model is used to correct video data captured by the AR system and to correct a video virtual overlay that is provided to a user wearing the head-worn AR system.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • G02C 5/22 - Hinges
  • H04N 13/327 - Calibration thereof
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

9.

REAL-TIME MACHINE LEARNING BASED IN-PAINTING

      
Application Number US2023033907
Publication Number 2024/076486
Status In Force
Filing Date 2023-09-27
Publication Date 2024-04-11
Owner SNAP INC. (USA)
Inventor
  • Dudovitch, Gal
  • Harel, Peleg
  • Mishin Shuvi, Ma'Ayan
  • Sasson, Gal
  • Zohar, Matan

Abstract

Aspects of the present disclosure involve a system for performing real-time in-painting using machine learning techniques. The system receives a video that includes a depiction of a real-world object in a real-world environment. The system accesses a segmentation associated with the real-world object and removes a depiction of the real-world object from a region of a first frame of the video. The system processes, by a machine learning model, the first frame and one or more previous frames of the video that precede the first frame to generate a new frame in which portions of the first frame have been blended into the region from which the depiction of the real-world object has been removed.
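
A small sketch of the final blending step, assuming the model has already produced filled pixels for the removed region: the fill is composited into the frame through a feathered mask so the seam stays invisible. Function and parameter names are illustrative.

```python
import cv2
import numpy as np

def blend_inpainted(frame, filled, region_mask, feather_px=7):
    """frame, filled: HxWx3 float arrays; region_mask: HxW, 1 where the
    object was removed. feather_px must be odd for GaussianBlur."""
    soft = cv2.GaussianBlur(region_mask.astype(np.float32),
                            (feather_px, feather_px), 0)
    soft3 = soft[..., None]
    return (1.0 - soft3) * frame + soft3 * filled
```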

IPC Classes

  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G06T 5/60 - using machine learning, e.g. neural networks
  • G06T 5/77 - Retouching; Inpainting; Scratch removal

10.

EXTERNAL SCREEN STREAMING FOR AN EYEWEAR DEVICE

      
Application Number US2023034437
Publication Number 2024/076613
Status In Force
Filing Date 2023-10-04
Publication Date 2024-04-11
Owner SNAP INC. (USA)
Inventor
  • Canberk, Ilteris Kaan
  • Jung, Bernhard
  • Kang, Shin Hwun
  • Skrypnyk, Daria
  • Sun, Tianyi
  • Tran, Lien Le Hong

Abstract

Systems and methods are provided for performing operations on an augmented reality (AR) device using an external screen streaming system. The system establishes, by one or more processors of an AR device, a communication with an external client device. The system causes overlay of, by the AR device, a first AR object on a real-world environment being viewed using the AR device. The system receives, by the AR device, a first image from the external client device. The system, in response to receiving the first image from the external client device, overlays the first image on the first AR object by the AR device.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

11.

GENERATING USER INTERFACES IN AUGMENTED REALITY ENVIRONMENTS

      
Application Number US2023075829
Publication Number 2024/076986
Status In Force
Filing Date 2023-10-03
Publication Date 2024-04-11
Owner SNAP INC. (USA)
Inventor
  • Kang, Shin Hwun
  • Tran, Lien Le Hong

Abstract

An augmented reality (AR) content system is provided. The AR content system may analyze audio input obtained from a user to generate a search request. The AR content system may obtain search results in response to the search request and determine a layout by which to display the search results. The search results may be displayed in a user interface within an AR environment according to the layout. The AR content system may also analyze audio input to detect commands to perform with respect to content displayed in the user interface.

IPC Classes

  • G06F 3/16 - Sound input; Sound output
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

12.

3D GARMENT GENERATION FROM 2D SCRIBBLE IMAGES

      
Application Number US2023029082
Publication Number 2024/072550
Status In Force
Filing Date 2023-07-31
Publication Date 2024-04-04
Owner SNAP INC. (USA)
Inventor
  • Achlioptas, Panagiotis
  • Chai, Menglei
  • Lee, Hsin-Ying
  • Olszewski, Kyle
  • Ren, Jian
  • Tulyakov, Sergey

Abstract

A system and method are described for generating 3D garments from two-dimensional (2D) scribble images drawn by users. The system includes a conditional 2D generator, a conditional 3D generator, and two intermediate media including dimension-coupling color-density pairs and flat point clouds that bridge the gap between dimensions. Given a scribble image, the 2D generator synthesizes dimension-coupling color-density pairs including the RGB projection and density map from the front and rear views of the scribble image. A density-aware sampling algorithm converts the 2D dimension-coupling color-density pairs into a 3D flat point cloud representation, where the depth information is ignored. The 3D generator predicts the depth information from the flat point cloud. Dynamic variations per garment due to deformations resulting from a wearer's pose as well as irregular wrinkles and folds may be bypassed by taking advantage of 2D generative models to bridge the dimension gap in a non-parametric way.
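
The density-aware sampling step might look like the sketch below: pixel locations are drawn in proportion to the predicted density map and paired with their colors, yielding a "flat" (depth-free) point cloud for the 3D generator. Shapes, normalization, and names are assumptions for illustration.

```python
import numpy as np

def density_sample(color, density, n_points=2048, seed=0):
    """color: HxWx3; density: HxW nonnegative weights. Returns (n_points, 5)
    rows of (x, y, r, g, b); depth is left for the 3D generator to predict."""
    rng = np.random.default_rng(seed)
    h, w = density.shape
    p = density.ravel() / density.sum()
    idx = rng.choice(h * w, size=n_points, p=p)   # density-weighted pixels
    ys, xs = np.unravel_index(idx, (h, w))
    # Normalize image coordinates to [-1, 1] so the cloud is scale-free.
    xy = np.stack([2 * xs / (w - 1) - 1, 2 * ys / (h - 1) - 1], axis=1)
    return np.concatenate([xy, color[ys, xs]], axis=1)
```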

IPC Classes

13.

MIXED REALITY MEDIA CONTENT

      
Application Number US2023075529
Publication Number 2024/073675
Status In Force
Filing Date 2023-09-29
Publication Date 2024-04-04
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Gurgul, Piotr
  • Zhang, Dawei

Abstract

A mixed-reality media content system may be configured to perform operations that include: causing display of image data at a client device, the image data comprising a depiction of an object that includes a graphical code at a position upon the object; detecting the graphical code at the position upon the depiction of the object based on the image data; accessing media content within a media repository based on the graphical code scanned by the client device; and causing display of a presentation of the media content at the position of the graphical code upon the depiction of the object at the client device.

IPC Classes

  • G06Q 50/10 - Services
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06K 19/06 - Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

14.

9-DOF OBJECT TRACKING

      
Application Number US2023033853
Publication Number 2024/072885
Status In Force
Filing Date 2023-09-27
Publication Date 2024-04-04
Owner SNAP INC. (USA)
Inventor
  • Berger, Itamar
  • Dudovitch, Gal
  • Harel, Peleg
  • Mishin Shuvi, Ma'Ayan

Abstract

Aspects of the present disclosure involve a system for presenting AR items. The system receives a video that includes a depiction of a real-world object in a real-world environment. The system generates a three-dimensional (3D) bounding box for the real-world object and stabilizes the 3D bounding box based on one or more sensors of the device. The system determines a position, orientation, and dimensions of the real-world object based on the stabilized 3D bounding box and renders a display of an augmented reality (AR) item within the video based on the position, orientation, and dimensions of the real-world object.

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

15.

AR GLASSES AS IOT REMOTE CONTROL

      
Application Number US2023029076
Publication Number 2024/063865
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Gurgul, Piotr

Abstract

AR-enabled wearable electronic devices such as smart glasses are adapted for use as an Internet of Things (IoT) remote control device where the user can control a pointer on a television screen, computer screen, or other IoT-enabled device to select items by looking at them and making selections using gestures. Built-in six-degrees-of-freedom (6DoF) tracking capabilities are used to move the pointer on the screen to facilitate navigation. The display screen is tracked in real-world coordinates to determine the point of intersection of the user's view with the screen using raycasting techniques. Hand and head gesture detection are used to allow the user to execute a variety of control actions by performing different gestures. The techniques are particularly useful for smart displays that offer AR-enhanced content that can be viewed in the displays of the AR-enabled wearable electronic devices.
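
Once the display is tracked in world coordinates, the raycasting step reduces to a ray-plane intersection. A minimal sketch with invented geometry:

```python
import numpy as np

def pointer_on_screen(ray_origin, ray_dir, screen_point, screen_normal):
    """Intersect the wearer's view ray with the display plane. Returns the
    3D hit point, or None if the gaze is parallel to or behind the screen."""
    denom = np.dot(ray_dir, screen_normal)
    if abs(denom) < 1e-6:
        return None
    t = np.dot(screen_point - ray_origin, screen_normal) / denom
    return ray_origin + t * ray_dir if t > 0 else None

# Example: looking straight ahead at a screen 2 m in front of the user.
hit = pointer_on_screen(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                        np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, -1.0]))
print(hit)  # [0. 0. 2.]
```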

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network

16.

MOBILE DEVICE RESOURCE OPTIMIZED KIOSK MODE

      
Application Number US2023029078
Publication Number 2024/063866
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Wawruch, Pawel
  • Razafindrabe, Neken Aritia Symphonie

Abstract

A resource-optimized kiosk mode that improves the mobile experience for creators and users of mobile devices such as an augmented reality (AR)-enabled wearable eyewear device. An eyewear device enters a kiosk mode by receiving a kiosk mode request for an application and, in response to the request, determining which services and application programming interfaces (APIs) are required to execute the selected application. An identification of the determined services and APIs required to execute the selected application is stored, and the eyewear device is rebooted. After reboot, the selected application is started, and only the identified services and APIs are enabled. To determine which services and APIs are required to execute the selected application, metadata may be associated with the selected application specifying the services and/or APIs that the selected application requires to use when in operation.
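
One way such metadata could be expressed is a small manifest that the kiosk-mode boot consults to enable only what the application declares. The keys and service names below are hypothetical, not from the patent:

```python
# Hypothetical per-application manifest consulted after the kiosk reboot.
KIOSK_MANIFEST = {
    "app": "gallery_viewer",
    "required_services": ["display", "touch_input", "storage"],
    "required_apis": ["camera.capture", "render.2d"],
}

def services_to_enable(manifest, all_services):
    """Enable only the declared services; everything else stays off after
    the reboot to conserve the eyewear device's power and memory."""
    wanted = set(manifest["required_services"])
    return [s for s in all_services if s in wanted]

print(services_to_enable(KIOSK_MANIFEST,
                         ["display", "wifi", "touch_input", "gps", "storage"]))
# ['display', 'touch_input', 'storage']
```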

IPC Classes

17.

VISUAL AND AUDIO WAKE COMMANDS

      
Application Number US2023033063
Publication Number 2024/064094
Status In Force
Filing Date 2023-09-18
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor
  • Colascione, Daniel
  • Hanover, Matthew
  • Korolev, Sergei
  • Marr, Michael David
  • Myers, Scott
  • Powderly, James

Abstract

A gesture-based wake process for an AR system is described herein. The AR system places a hand-tracking input pipeline of the AR system in a suspended mode. A camera component of the hand-tracking input pipeline detects a possible visual wake command being made by a user of the AR system. On the basis of detecting the possible visual wake command, the AR system wakes the hand-tracking input pipeline and places the camera component in a fully operational mode. If the AR system, using the hand-tracking input pipeline, verifies the possible visual wake command as an actual wake command, the AR system initiates execution of an AR application.
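
The two-stage wake flow can be read as a small state machine: a low-power detector proposes a wake gesture, and the fully woken pipeline either verifies it and launches the AR application or drops back to suspension. An illustrative reading, not the actual implementation:

```python
from enum import Enum, auto

class WakeState(Enum):
    SUSPENDED = auto()   # hand-tracking pipeline suspended, camera low power
    VERIFYING = auto()   # pipeline awake, camera fully operational
    RUNNING = auto()     # AR application launched

def step(state, low_power_hit=False, verified=False):
    """Advance the wake state machine by one observation."""
    if state is WakeState.SUSPENDED and low_power_hit:
        return WakeState.VERIFYING
    if state is WakeState.VERIFYING:
        return WakeState.RUNNING if verified else WakeState.SUSPENDED
    return state
```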

IPC Classes

18.

AR GRAPHICAL ASSISTANCE WITH TASKS

      
Application Number US2023033058
Publication Number 2024/064092
Status In Force
Filing Date 2023-09-18
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

Systems, methods, and computer readable media for graphical assistance with tasks using augmented reality (AR) wearable devices are disclosed. Embodiments capture an image of a first user view of a real-world scene and access indications of surfaces and locations of the surfaces detected in the image. The AR wearable device displays indications of the surfaces on a display of the AR wearable device where the locations of the indications are based on the locations of the surfaces and a second user view of the real-world scene. The locations of the surfaces are indicated with 3D world coordinates. The user views are determined based on a location of the user. The AR wearable device enables a user to add graphics to the surfaces and select tasks to perform. Tools such as a bubble level or a measuring tool are available for the user to utilize to perform the task.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

19.

STEERABLE CAMERA FOR AR HAND TRACKING

      
Application Number US2023033134
Publication Number 2024/064130
Status In Force
Filing Date 2023-09-19
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor
  • Colascione, Daniel
  • Simons, Patrick Timothy Mcsweeney
  • Welge, Weston
  • Zahreddine, Ramzi

Abstract

A system for hand tracking for an Augmented Reality (AR) system. The AR system uses a camera of the AR system to capture tracking video frame data of a hand of a user of the AR system. The AR system generates a skeletal model based on the tracking video frame data and determines a location of the hand of the user based on the skeletal model. The AR system causes a steerable camera of the AR system to focus on the hand of the user.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G09B 21/00 - Teaching, or communicating with, the blind, deaf or mute
  • H04N 23/60 - Control of cameras or camera modules
  • G06F 1/16 - Constructional details or arrangements

20.

TEXT-GUIDED CAMEO GENERATION

      
Application Number US2023074762
Publication Number 2024/064806
Status In Force
Filing Date 2023-09-21
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor
  • Ghosh, Arnab
  • Ren, Jian
  • Savchenkov, Pavel
  • Tulyakov, Sergey

Abstract

A method of generating an image for use in a conversation taking place in a messaging application is disclosed. Conversation input text is received from a user of a portable device that includes a display. Model input text is generated from the conversation input text, which is processed with a text-to-image model to generate an image based on the model input text. The coordinates of a face in the image are determined, and the face of the user or another person is added to the image at the determined location. The final image is displayed on the portable device, and user input is received to transmit the image to a remote recipient.

IPC Classes

  • G06T 11/60 - Editing figures and text; Combining figures or text
  • H04L 51/04 - Real-time or near real-time messaging, e.g. instant messaging [IM]
  • G06T 13/80 - 2D animation, e.g. using sprites

21.

OPACITY CONTROL OF AUGMENTED REALITY DEVICES

      
Application Number US2023074771
Publication Number 2024/064812
Status In Force
Filing Date 2023-09-21
Publication Date 2024-03-28
Owner SNAP INC. (USA)
Inventor Zare Seisan, Farid

Abstract

An augmented reality (AR) eyewear device has a lens system which includes an optical screening mechanism that enables switching the lens system between a conventional see-through state and an opaque state in which the lens system screens or functionally blocks out the wearer's view of the external environment. Such a screening mechanism allows for expanded use cases of the AR glasses compared to conventional devices, e.g.: as a sleep mask; to view displayed content like movies or sports events against a visually nondistracting background instead of against the external environment; and/or to enable VR functionality.

IPC Classes

22.

MULTIPATH OPTICAL DEVICE

      
Application Number EP2023075362
Publication Number 2024/056832
Status In Force
Filing Date 2023-09-14
Publication Date 2024-03-21
Owner
  • SNAP, INC. (USA)
  • SNAP GROUP LIMITED (United Kingdom)
Inventor
  • Crai, Alexandra
  • Webber, Alexander James Lewarne
  • Valera, Mohmed Salim

Abstract

An optical device for use in an augmented reality or virtual reality display, comprising: a waveguide; an input diffractive optical element, DOE, configured to receive light from a projector and to couple the received light into the waveguide along a plurality of optical paths; an output DOE offset from the input DOE along a first direction and configured to couple the received light out of the waveguide and towards a viewer; a first turning DOE offset from the input DOE along a second direction different from the first direction; wherein the input DOE is configured to couple a first portion of the received light in the second direction towards the first turning DOE and the first turning DOE is configured to diffract the first portion of the received light towards the output DOE, and the input DOE is configured to couple a second portion of the received light in the first direction towards the output DOE.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
  • G02B 5/18 - Diffracting gratings

23.

EYEWEAR WITH STRAIN GAUGE WEAR DETECTION

      
Application Number US2023029066
Publication Number 2024/058870
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Heger, Jason
  • Kalkgruber, Matthias
  • Mendez, Erick Mendez

Abstract

An eyewear device including a strain gauge sensor to determine when the eyewear device is manipulated by a user, such as being put on, taken off, and interacted with. A processor identifies a signature event based on sensor signals received from the strain gauge sensor and a data table of strain gauge sensor measurements corresponding to signature events. The processor controls the eyewear device as a function of the identified signature event, such as powering on a display of the eyewear device as the eyewear device is being put on a user's head, and then turning off the display when the eyewear device is removed from the user's head.
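
Conceptually, the data table of signature events maps changes in strain readings to device actions. The thresholds and the `display` interface below are invented purely to illustrate the lookup-and-control flow:

```python
# Invented signature table: (low, high, event) over a short strain-delta window.
SIGNATURE_TABLE = [
    (-0.40, -0.05, "doffed"),   # temple flex releases as glasses come off
    (-0.05,  0.05, "idle"),
    ( 0.05,  0.40, "donned"),   # temples flex as glasses are put on
]

def classify_strain(delta):
    for lo, hi, event in SIGNATURE_TABLE:
        if lo <= delta < hi:
            return event
    return "unknown"

def handle_event(event, display):
    """display: hypothetical driver object with power_on()/power_off()."""
    if event == "donned":
        display.power_on()
    elif event == "doffed":
        display.power_off()
```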

IPC Classes

  • G02C 11/00 - Non-optical adjuncts; Attachment thereof

24.

WATERPROOF UAV FOR CAPTURING IMAGES

      
Application Number US2023029071
Publication Number 2024/058872
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Zhang, Dawei

Abstract

A waterproof UAV that records camera footage while traveling through air and while submerged in water. The UAV alters speed and direction of propellers dependent on the medium that the UAV is traveling through to provide control of the UAV. The propellers are capable of spinning in both directions to enable the UAV to change its depth and orientation in water. A machine learning (ML) model is used to identify humans and objects underwater. A housing coupled to the UAV makes the UAV positively buoyant to float in water and to control buoyancy while submerged.

IPC Classes

  • B64U 10/14 - Flying platforms with four distinct rotor axes, e.g. quadcopters
  • B64U 30/26 - Ducted or shrouded rotors
  • B64U 20/70 - Constructional aspects of the UAV body
  • B64U 60/10 - Undercarriages specially adapted for use on water
  • B64U 20/87 - Mounting of imaging devices, e.g. mounting of gimbals
  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
  • G06N 20/00 - Machine learning
  • B64U 101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography

25.

DEFORMING REAL-WORLD OBJECT USING IMAGE WARPING

      
Application Number US2023032181
Publication Number 2024/058966
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Guler, Riza Alp
  • Tam, Himmy
  • Wang, Haoyang
  • Kakolyris, Antonios

Abstract

Methods and systems are disclosed for performing real-time deforming operations. The system receives an image that includes a depiction of a real-world object. The system applies a machine learning model to the image to generate a warping field and segmentation mask, the machine learning model trained to establish a relationship between a plurality of training images depicting real-world objects and corresponding ground-truth warping fields and segmentation masks associated with a target shape. The system applies the generated warping field and segmentation mask to the image to warp the real-world object depicted in the image to the target shape.
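
Assuming the warping field is expressed as per-pixel source coordinates (a common convention, though not stated in the abstract), applying it together with the segmentation mask could look like this OpenCV sketch:

```python
import cv2
import numpy as np

def warp_to_target(image, warp_field, mask):
    """image: HxWx3 uint8; warp_field: HxWx2 float32 giving the source (x, y)
    for each output pixel; mask: HxW in [0, 1] selecting the object."""
    warped = cv2.remap(image, warp_field[..., 0], warp_field[..., 1],
                       interpolation=cv2.INTER_LINEAR)
    m = mask[..., None].astype(np.float32)
    # Warped pixels inside the object mask, untouched background elsewhere.
    return (m * warped + (1.0 - m) * image).astype(np.uint8)
```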

IPC Classes

  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06T 11/00 - 2D [Two Dimensional] image generation

26.

THREE-DIMENSIONAL ASSET RECONSTRUCTION

      
Application Number US2023029068
Publication Number 2024/058871
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Vasilkovskii, Mikhail
  • Demyanov, Sergey
  • Shakhrai, Vladislav

Abstract

A three-dimensional (3D) asset reconstruction technique for generating a 3D asset representing an object from images of the object. The images are captured from different viewpoints in a darkroom using one or more light sources having known locations. The system estimates camera poses for each of the captured images and then constructs a 3D surface mesh made up of surfaces using the captured images and their respective estimated camera poses. Texture properties for each of the surfaces of the 3D surface mesh are then refined to generate the 3D asset.

IPC Classes

27.

FINGER GESTURE RECOGNITION VIA ACOUSTIC-OPTIC SENSOR FUSION

      
Application Number US2023032717
Publication Number 2024/059182
Status In Force
Filing Date 2023-09-14
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Krishnan Gorumkonda, Gurunandan
  • Nayar, Shree K.
  • Xu, Chenhan
  • Zhou, Bing

Abstract

A finger gesture recognition system is provided. The finger gesture recognition system includes one or more audio sensors and one or more optic sensors. The finger gesture recognition system captures, using the one or more audio sensors, audio signal data of a finger gesture being made by a user, and captures, using the one or more optic sensors, optic signal data of the finger gesture. The finger gesture recognition system recognizes the finger gesture based on the audio signal data and the optic signal data and communicates finger gesture data of the recognized finger gesture to an Augmented Reality/Combined Reality/Virtual Reality (XR) application.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

28.

EGOCENTRIC HUMAN BODY POSE TRACKING

      
Application Number US2023032755
Publication Number 2024/059206
Status In Force
Filing Date 2023-09-14
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Arakawa, Riku
  • Krishnan Gorumkonda, Gurunandan
  • Nayar, Shree K.
  • Zhou, Bing

Abstract

A pose tracking system is provided. The pose tracking system includes an EMF tracking system having a user-worn head-mounted EMF source and one or more user-worn EMF tracking sensors attached to the wrists of the user. The EMF source is associated with a VIO tracking system such as AR glasses or the like. The pose tracking system determines a pose of the user's head and a ground plane using the VIO tracking system and a pose of the user's hands using the EMF tracking system to determine a full-body pose for the user. Metal interference with the EMF tracking system is minimized using an IMU mounted with the EMF tracking sensors. Long-term drift in the IMU and the VIO tracking system is minimized using the EMF tracking system.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

29.

SELECTING AR BUTTONS ON A HAND

      
Application Number US2023031980
Publication Number 2024/054434
Status In Force
Filing Date 2023-09-05
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor Crispin, Sterling

Abstract

Systems and methods are provided for performing AR button selection operations on an augmented reality (AR) device. The system displays, by an AR device, a plurality of AR objects on a display region that overlaps a first real-world object, each of the plurality of AR objects being associated with an object selection region. The system computes a first spatial relationship factor for a first AR object of the plurality of AR objects based on a position of the first AR object relative to a position of a second real-world object and adjusts the object selection region of the first AR object based on the first spatial relationship factor. The system activates the first AR object in response to determining that the second real-world object overlaps the object selection region of the first AR object.
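
The "spatial relationship factor" could be as simple as a distance-based gain on each button's selection radius, so nearby buttons become easier to hit. One illustrative reading, with invented distance bounds:

```python
import numpy as np

def adjusted_selection_radius(base_radius, button_pos, hand_pos,
                              near=0.05, far=0.30, max_boost=1.5):
    """Positions are 3D arrays in meters; the hand plays the role of the
    'second real-world object'. Closer buttons get up to max_boost larger
    selection regions."""
    d = np.linalg.norm(button_pos - hand_pos)
    t = np.clip((far - d) / (far - near), 0.0, 1.0)   # 1 when near, 0 when far
    return base_radius * (1.0 + (max_boost - 1.0) * t)
```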

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06N 20/00 - Machine learning

30.

3D CURSOR FUNCTIONALITY FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073639
Publication Number 2024/054909
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Moreno, Daniel

Abstract

The subject technology detects a location and a position of a representation of a finger. The subject technology generates a first virtual object based on the location and the position of the representation of the finger. The subject technology detects a first collision event. The subject technology, in response to the first collision event, modifies a set of dimensions of the second virtual object to a second set of dimensions. The subject technology detects a second location and a second position of the representation of the finger. The subject technology detects a second collision event. The subject technology modifies a set of dimensions of the third virtual object to a third set of dimensions. The subject technology renders the third virtual object based on the third set of dimensions within a third scene, the third scene comprising a modified scene from a second scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

31.

SHOOTING INTERACTION USING AUGMENTED REALITY CONTENT IN A MESSAGING SYSTEM

      
Application Number US2023073647
Publication Number 2024/054915
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Moreno, Daniel

Abstract

The subject technology receives a set of frames. The subject technology detects a first gesture corresponding to an open trigger finger gesture. The subject technology receives a second set of frames. The subject technology detects, from the second set of frames, a second gesture corresponding to a closed trigger finger gesture. The subject technology detects a location and a position of a representation of a finger from the closed trigger finger gesture. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders a movement of the first virtual object along a vector away from the location and the position of the representation of the finger within a first scene. The subject technology provides for display the rendered movement of the first virtual object along the vector within the first scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

32.

VIRTUAL OBJECT MANIPULATION WITH GESTURES IN A MESSAGING SYSTEM

      
Application Number US2023073781
Publication Number 2024/054999
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Moreno, Daniel

Abstract

The subject technology detects a first gesture and a second gesture, each gesture corresponding to an open trigger finger gesture. The subject technology detects a third gesture and a fourth gesture, each gesture corresponding to a closed trigger finger gesture. The subject technology selects a first virtual object in a first scene. The subject technology detects a first location and a first position of a first representation of a first finger from the third gesture and a second location and a second position of a second representation of a second finger from the fourth gesture. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology modifies a set of dimensions of the first virtual object to a different set of dimensions.
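
The two-handed resize described here typically reduces to scaling by the ratio of fingertip separations before and after the drag. A minimal sketch with invented names:

```python
import numpy as np

def rescale(dimensions, p1_old, p2_old, p1_new, p2_new, min_scale=0.1):
    """dimensions: (w, h, d) of the selected virtual object; p*: 3D fingertip
    positions before and after the two-handed drag."""
    d_old = np.linalg.norm(p2_old - p1_old)
    d_new = np.linalg.norm(p2_new - p1_new)
    scale = max(d_new / max(d_old, 1e-6), min_scale)  # guard divide-by-zero
    return tuple(s * scale for s in dimensions)
```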

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

33.

REMOTELY CHANGING SETTINGS ON AR WEARABLE DEVICES

      
Application Number US2023031364
Publication Number 2024/054377
Status In Force
Filing Date 2023-08-29
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

Systems, methods, and computer readable media are described for remotely changing settings on augmented reality (AR) wearable devices. Embodiments are disclosed that enable a user to change settings of an AR wearable device on a user interface (UI) provided by a host client device that can communicate wirelessly with the AR wearable device. The host client device and AR wearable device provide remote procedure calls (RPCs) and an application program interface (API) to access settings and determine if settings have been changed. The API enables the host client device to determine the settings on the AR wearable device without any prior knowledge of the settings on the AR wearable device. The RPCs and the API enable the host client device to automatically update the settings on the AR wearable device when the user changes the settings on the host client device.
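
A host-side sync loop built on the schema-discovery idea might look like the sketch below. The RPC names (`get_settings_schema`, `set_setting`) are hypothetical; the abstract only establishes that RPCs and an API exist:

```python
def sync_settings(wearable_rpc, changed_settings: dict):
    """Mirror settings the user changed in the host UI onto the wearable,
    discovering the device's settings schema first (no prior knowledge)."""
    schema = wearable_rpc.call("get_settings_schema")     # hypothetical RPC
    for key, value in changed_settings.items():
        if key not in schema:
            continue                                      # unknown on device
        if schema[key]["type"] == type(value).__name__:   # basic type check
            wearable_rpc.call("set_setting", key=key, value=value)
```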

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

34.

SELECTING A TILT ANGLE OF AN AR DISPLAY

      
Application Number US2023031487
Publication Number 2024/054381
Status In Force
Filing Date 2023-08-30
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Lucas, Benjamin
  • Meisenholder, David

Abstract

Systems, methods, and computer readable media for selecting a tilt angle of an augmented reality (AR) display of an AR wearable device. Some examples of the present disclosure capture simulation data of gaze fixations while users are performing tasks using applications resident on the AR wearable device. The tilt angle of the AR display is selected so that more gaze fixations fall within the field of view (FOV) of the AR display than outside it. In some examples, an AR wearable device is manufactured with a fixed vertical tilt angle for the AR display. In some examples, the AR wearable device can dynamically adjust the vertical tilt angle of the AR display based on the applications that a user of the AR wearable device is likely to use or is using.
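
Selecting the tilt angle then amounts to maximizing the number of recorded gaze fixations that fall inside the display's vertical FOV. A toy version with invented fixation data and FOV size:

```python
def best_tilt(fixation_elevations, fov_deg=20.0, candidates=range(-30, 11)):
    """fixation_elevations: gaze elevation angles in degrees (negative is
    downward) collected while users performed tasks."""
    def hits(tilt):
        lo, hi = tilt - fov_deg / 2, tilt + fov_deg / 2
        return sum(lo <= e <= hi for e in fixation_elevations)
    return max(candidates, key=hits)

# Fixations clustered below the horizon favor a downward display tilt.
print(best_tilt([-12, -10, -8, -15, -2, 0, -9]))  # -10
```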

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 1/16 - Constructional details or arrangements
  • G02B 27/01 - Head-up displays

35.

AUTO TRIMMING FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073609
Publication Number 2024/054888
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology receives frames of a source media content. The subject technology detects, from the frames of the source media content, a first gesture indicating a cut point at a particular frame of the source media content, the cut point associated with a trimming operation to be performed on the source media content. The subject technology selects a starting frame and an ending frame from the frames based at least in part on the cut point at the particular frame. The subject technology performs the trimming operation based on the starting frame and the ending frame, producing a trimmed set of frames. The subject technology generates a second media content using the trimmed set of frames. The subject technology provides for display at least a portion of the trimmed set of frames of the second media content.

IPC Classes

  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G11B 27/34 - Indicating arrangements
  • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
  • H04N 23/62 - Control of parameters via user interfaces

36.

CURSOR FUNCTIONALITY FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073635
Publication Number 2024/054906
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects a location and a position of a representation of a finger in a set of frames captured by a camera of a client device. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders the first virtual object within a first scene. The subject technology detects a first collision event corresponding to a first collider of the first virtual object intersecting with a second collider of a second virtual object. The subject technology modifies a set of dimensions of the second virtual object to a second set of dimensions. The subject technology renders the second virtual object based on the second set of dimensions within a second scene. The subject technology provides for display the rendered second virtual object within the second scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

37.

TRIGGER GESTURE FOR SELECTION OF AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073776
Publication Number 2024/054995
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Moreno, Daniel

Abstract

The subject technology detects a first gesture corresponding to an open trigger finger gesture. The subject technology detects a location and a position of a representation of a finger from the open trigger finger gesture. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology detects a first collision event. The subject technology detects a second gesture corresponding to a closed trigger finger gesture. The subject technology selects the second virtual object. The subject technology renders the first virtual object as attached to the second virtual object in response to the selecting. The subject technology provides for display the rendered first virtual object as attached to the second virtual object within a first scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

38.

SCULPTING AUGMENTED REALITY CONTENT USING GESTURES IN A MESSAGING SYSTEM

      
Application Number US2023073783
Publication Number 2024/055001
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Kaminski, Kurt
  • Moreno, Daniel

Abstract

The subject technology detects from a set of frames, a first gesture, the first gesture corresponding to a pinch gesture. The subject technology detects a first location and a first position of a first representation of a first finger from the first gesture and a second location and a second position of a second representation of a second finger from the first gesture. The subject technology detects a first collision event corresponding to a first collider and a second collider intersecting with a third collider of a first virtual object. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology modifies the first virtual object to include an additional augmented reality content based at least in part on the first change and the second change.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

39.

GESTURES TO ENABLE MENUS USING AUGMENTED REALITY CONTENT IN A MESSAGING SYSTEM

      
Application Number US2023073786
Publication Number 2024/055004
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Moreno, Daniel

Abstract

The subject technology detects a first location and a first position of a first representation of a first finger and a second location and a second position of a second representation of a second finger. The subject technology detects a first particular location and a first particular position of a first particular representation of a first particular finger and a second particular location and a second particular position of a second particular representation of a second particular finger. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology detects a first particular change in the first particular location and the first particular position and a second particular change in the second particular location and the second particular position. The subject technology generates a set of virtual objects.

IPC Classes

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements

40.

REVEALING COLLABORATIVE OBJECT USING COUNTDOWN TIMER

      
Application Number US2023028387
Publication Number 2024/049575
Status In Force
Filing Date 2023-07-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a processor provides users with access to a collaborative object using respective physically remote devices, and associates virtual content received from the users with the collaborative object during a collaboration period. The processor maintains a timer including a countdown indicative of when the collaboration period ends for associating virtual content with the collaborative object. The processor provides the users with access to the collaborative object with associated virtual content at the end of the collaboration period.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]

41.

REAL-WORLD RESPONSIVENESS OF A COLLABORATIVE OBJECT

      
Application Number US2023028460
Publication Number 2024/049576
Status In Force
Filing Date 2023-07-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object with an associated material and added virtual content is provided to users. In one example of the collaborative session, a user selects the associated material of the collaborative object. Physical characteristics are assigned to the collaborative object as a function of the associated material to be perceived by the participants when the collaborative object is manipulated. In one example, the material associated with the collaborative object is metal, wherein the interaction between the users and the collaborative object generates a response of the collaborative object that is indicative of the physical properties of metal, such as its inertia, acoustics, and malleability.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06Q 10/10 - Office automation; Time management
  • G06F 3/16 - Sound input; Sound output
  • G02B 27/01 - Head-up displays

42.

SELECTIVE COLLABORATIVE OBJECT ACCESS BASED ON TIMESTAMP

      
Application Number US2023028468
Publication Number 2024/049577
Status In Force
Filing Date 2023-07-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to added virtual content is selectively made available to participants/users by a collaborative system. The system receives a request from a user to join a session and associates a timestamp with the user corresponding to receipt of the request. Users can edit the collaborative object if the timestamp is within the collaborative duration period and can view the collaborative object if the timestamp is after the collaborative duration period.
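
A minimal sketch of the timestamp gate the abstract describes, assuming the collaborative duration period is represented as a start/end pair; the function name and return values are illustrative only.

```python
from datetime import datetime


def access_mode(join_timestamp: datetime,
                period_start: datetime,
                period_end: datetime) -> str:
    """Map a user's join-request timestamp to an access level:
    edit during the collaborative duration period, view afterwards."""
    if period_start <= join_timestamp <= period_end:
        return "edit"
    if join_timestamp > period_end:
        return "view"
    return "none"  # request received before the session opened
```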

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06Q 10/10 - Office automation; Time management
  • H04L 9/40 - Network security protocols
  • G02B 27/01 - Head-up displays

43.

SCISSOR HAND GESTURE FOR A COLLABORATIVE OBJECT

      
Application Number US2023028536
Publication Number 2024/049578
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant crops media content by use of a hand gesture to produce an image segment that can be associated with the collaborative object. The hand gesture resembles a pair of scissors, and the camera and processor of the client device track a path of the hand gesture to identify an object within a displayed image and create virtual content of the identified object. The virtual content created by the hand gesture is then associated with the collaborative object.
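
The cropping step can be sketched as rasterizing the tracked gesture path into a mask and cutting out the enclosed pixels. This assumes Pillow for image handling and a `path` of (x, y) points from the hand tracker; neither is specified in the abstract.

```python
from PIL import Image, ImageDraw


def crop_along_gesture_path(frame: Image.Image,
                            path: list[tuple[int, int]]) -> Image.Image:
    """Cut out the image segment enclosed by a tracked scissor-gesture path."""
    # Rasterize the closed gesture path into a binary mask.
    mask = Image.new("L", frame.size, 0)
    ImageDraw.Draw(mask).polygon(path, fill=255)
    # Keep only the enclosed pixels; everything else becomes transparent.
    cutout = frame.convert("RGBA")
    cutout.putalpha(mask)
    return cutout.crop(mask.getbbox())
```

The resulting cutout is the virtual content that would then be associated with the collaborative object.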

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

44.

PHYSICAL GESTURE INTERACTION WITH OBJECTS BASED ON INTUITIVE DESIGN

      
Application Number US2023028537
Publication Number 2024/049579
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a user interacts with the collaborative object using hand gestures. The virtual content associated with the collaborative object can be accessed with an opening hand gesture and hidden with a closing hand gesture. The hand gestures are detected by cameras of a client device used by the user. The collaborative object can be moved and manipulated using a pointing gesture, wherein the collaborative object can be confirmed at a new position by tilting the client device of the user.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

45.

AUTHENTICATING A SELECTIVE COLLABORATIVE OBJECT

      
Application Number US2023028542
Publication Number 2024/049580
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device can be used to provide an authenticating signature to the seal.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06Q 10/10 - Office automation; Time management
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays

46.

COLLABORATIVE OBJECT ASSOCIATED WITH A GEOGRAPHICAL LOCATION

      
Application Number US2023028671
Publication Number 2024/049586
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

47.

CHARACTER AND COSTUME ASSIGNMENT FOR CO-LOCATED USERS

      
Application Number US2023028717
Publication Number 2024/049588
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kim, Daekun
  • Zhang, Lei
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Multi-player co-located AR experiences are augmented by assigning characters and costumes to respective participants (a.k.a. "users" of AR-enabled mobile devices) in multi-player AR sessions for storytelling, play acting, and the like. Body tracking technology and augmented reality (AR) software are used to decorate the bodies of the co-located participants with virtual costumes within the context of the multi-player co-located AR experiences. Tracked bodies are distinguished to determine which body belongs to which user and hence which virtual costume belongs to which tracked body so that corresponding costumes may be assigned for display in augmented reality. A host-guest mechanism is used for networked assignment of characters and corresponding costumes in the co-located multi-player AR session. Body tracking technology is used to move the costume with the body as movement of the assigned body is detected.
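
One way to picture the host-side assignment is a mapping from tracked-body IDs to costumes that the host broadcasts to all devices; the data shapes below are assumptions, not the patented protocol.

```python
from dataclasses import dataclass


@dataclass
class Participant:
    user_id: str
    tracked_body_id: int  # body-tracking ID resolved on the host device


def assign_costumes(participants: list[Participant],
                    costumes: list[str]) -> dict[int, str]:
    """Pair each tracked body with a costume so every client renders the
    right overlay; round-robins if participants outnumber costumes."""
    return {p.tracked_body_id: costumes[i % len(costumes)]
            for i, p in enumerate(participants)}
```

Each guest device would anchor the assigned virtual costume to the matching tracked body and move it as the body moves.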

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06Q 10/10 - Office automation; Time management

48.

GENERATING IMMERSIVE AUGMENTED REALITY EXPERIENCES FROM EXISTING IMAGES AND VIDEOS

      
Application Number US2023030926
Publication Number 2024/049687
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A two-dimensional element is identified from one or more two-dimensional images. A volumetric content item is generated based on the two-dimensional element identified from the one or more two-dimensional images. A display device presents the volumetric content item overlaid on a real-world environment that is within a field of view of a user of the display device.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 15/08 - Volume rendering
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

49.

VOICE CONTROLLED UIS FOR AR WEARABLE DEVICES

      
Application Number US2023031018
Publication Number 2024/049696
Status In Force
Filing Date 2023-08-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Gurgul, Piotr

Abstract

Systems, methods, and computer readable media for voice-controlled user interfaces (UIs) for augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable a user to interact with the AR wearable device without using physical user interface devices. An application has a non-voice-controlled UI mode and a voice-controlled UI mode. The user selects the mode of the UI. The application running on the AR wearable device displays UI elements on a display of the AR wearable device. The UI elements have types. Predetermined actions are associated with each of the UI element types. The predetermined actions are displayed with other information and used by the user to invoke the corresponding UI element.

IPC Classes  ?

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 1/16 - Constructional details or arrangements

50.

TIMELAPSE RE-EXPERIENCING SYSTEM

      
Application Number US2023072282
Publication Number 2024/050232
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A system captures, via one or more sensors of a computing device, data of an environment observed by the one or more sensors at a first timeslot, and stores the data in a data store as a first portion of a timelapse memory experience. The system also captures, via the one or more sensors of the computing device, data of the environment observed by the one or more sensors at a second timeslot, and stores the data in the data store as a second portion of the timelapse memory experience. The system additionally associates the timelapse memory experience with a memory experience trigger, wherein the memory experience trigger can initiate a presentation of the timelapse memory experience.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

51.

TOUCH-BASED AUGMENTED REALITY EXPERIENCE

      
Application Number US2023072701
Publication Number 2024/050259
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

The present disclosure relates to methods and systems for providing a touch-based augmented reality (AR) experience. During a capture phase, a first user may grip an object. An intensity of a force applied on the object in the grip and/or a duration of the grip may be recorded. A volumetric representation of the first user holding the object may also be captured. During an experience phase, a second user may touch the object, the object may provide haptic feedback (e.g., a vibration) to the second user at an intensity and a duration corresponding to an intensity of the force applied on the object and a duration of the grip of the object. If a volumetric representation of the first user holding the object is captured, touching the object may also cause a presentation of the first user's volumetric body that holds the object.
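
The experience-phase playback reduces to replaying a recorded intensity and duration through a haptic actuator. A rough sketch, where `actuator` stands in for a device haptics API (its `vibrate`/`stop` calls are assumptions):

```python
import time
from dataclasses import dataclass


@dataclass
class GripRecording:
    """Force and duration captured while the first user gripped the object."""
    intensity: float   # normalized 0.0-1.0 force reading
    duration_s: float


def play_haptic_feedback(recording: GripRecording, actuator) -> None:
    """Replay the recorded grip as vibration for the second user's touch."""
    actuator.vibrate(strength=recording.intensity)
    time.sleep(recording.duration_s)  # hold as long as the grip lasted
    actuator.stop()
```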

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements

52.

ONE-HANDED ZOOM OPERATION FOR AR/VR DEVICES

      
Application Number US2023072707
Publication Number 2024/050260
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Mahalingam, Anoosh Kruba Chandar
  • Pounds, Jennica
  • Rybin, Andrei
  • Santerre, Pierre-Yves

Abstract

An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and Direct Manipulation of Virtual Objects (DMVO) methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user makes gestures that cause the user's viewpoint of the virtual object to either zoom in or zoom out. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.
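
The overlap-gated zoom can be sketched as one per-frame update in which zooming is applied only while the hand overlaps the object's bounds. Inputs such as `pinch_delta` are assumed to come from the gesture recognizer; the clamp range is illustrative.

```python
def update_zoom(hand_pos, object_bounds, pinch_delta, state):
    """One frame of the overlap-gated zoom interaction."""
    x, y = hand_pos
    (x0, y0), (x1, y1) = object_bounds
    overlapping = x0 <= x <= x1 and y0 <= y <= y1

    if overlapping:
        # While the hand overlaps the virtual object, gestures zoom the view.
        state["zoom"] = min(8.0, max(0.25, state["zoom"] * (1.0 + pinch_delta)))
        state["active"] = True
    else:
        # Moving the hand off the object ends the interaction.
        state["active"] = False
    return state
```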

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form

53.

WRIST ROTATION MANIPULATION OF VIRTUAL OBJECTS

      
Application Number US2023072721
Publication Number 2024/050263
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Mahalingam, Anoosh Kruba Chandar
  • Pounds, Jennica
  • Rybin, Andrei
  • Santerre, Pierre-Yves

Abstract

An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and Direct Manipulation of Virtual Objects (DMVO) methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user rotates their wrist and the virtual object is rotated as well. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

54.

EXTENDING USER INTERFACES OF MOBILE APPS TO AR EYEWEAR

      
Application Number US2023028257
Publication Number 2024/049565
Status In Force
Filing Date 2023-07-20
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

An architecture is provided for packaging visual overlay-based user interfaces (UIs) into mobile device applications to work as user interface extensions that allow certain flows and logic to be displayed on an eyewear device when connected to the mobile device application. Extending the UIs of mobile device applications to the display of the eyewear device allows for inexpensive experimentation with augmented reality (AR) UIs for eyewear devices and for reuse of business logic across mobile devices and associated eyewear devices. For example, a mobile device application for maps or navigation may be extended to show directions on an associated eyewear device once the destination is chosen in the navigation application on the mobile device. In this example, the business logic would still live in the navigation application on the mobile device, but the user would see AR directions overlaid on a display of the eyewear device.

IPC Classes  ?

  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G02B 27/01 - Head-up displays
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • G06F 1/16 - Constructional details or arrangements
  • G06F 9/451 - Execution arrangements for user interfaces

55.

SELECTIVE COLLABORATIVE OBJECT ACCESS

      
Application Number US2023028377
Publication Number 2024/049573
Status In Force
Filing Date 2023-07-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G02B 27/01 - Head-up displays

56.

TIMELAPSE OF GENERATING A COLLABORATIVE OBJECT

      
Application Number US2023028664
Publication Number 2024/049585
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., which may be recorded using an application such as the Lens Cloud feature available from Snap Inc. of Santa Monica, California). A timestamp is associated with each received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

57.

CO-LOCATED FULL-BODY GESTURES

      
Application Number US2023028708
Publication Number 2024/049587
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kim, Daekun
  • Zhang, Lei
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

A method for detecting full-body gestures by a mobile device includes a host mobile device detecting the tracked body of a co-located participant in a multi-party session. When the participant's tracked body provides a full-body gesture, the host's mobile device recognizes that there is a tracked body providing a full-body gesture. The host mobile device iterates through the list of participants in the multi-party session and finds the closest participant mobile device with respect to the screen-space position of the head of the gesturing participant. The host mobile device then obtains the user ID of the closest participant mobile device and broadcasts the recognized full-body gesture event to all co-located participants in the multi-party session, along with the obtained user ID. Each participant's mobile device may then handle the gesture event as appropriate for the multi-party session. For example, a character or costume may be assigned to a gesturing participant.
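
The closest-participant lookup is essentially a nearest-neighbor search in screen space. A sketch under the assumption that the session roster maps user IDs to reported screen-space positions:

```python
import math


def closest_participant(head_screen_pos: tuple[float, float],
                        participants: dict[str, tuple[float, float]]) -> str:
    """Return the user ID nearest the gesturing body's head position."""
    hx, hy = head_screen_pos
    return min(participants,
               key=lambda uid: math.hypot(participants[uid][0] - hx,
                                          participants[uid][1] - hy))
```

The host would broadcast the recognized full-body gesture event together with this user ID to every co-located device in the session.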

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

58.

AUTHORING TOOLS FOR CREATING INTERACTIVE AR EXPERIENCES

      
Application Number US2023028720
Publication Number 2024/049589
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Kim, Daekun
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Described are authoring tools for creating interactive AR experiences. The story-authoring application enables a user with little or no programming skills to create an interactive story that includes recording voice commands for advancing to the next scene, inserting and manipulating virtual objects in a mixed-reality environment, and recording a variety of interactions with connected IoT devices. The story creation interface is presented on the display as a virtual object in an AR environment.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/14 - Digital output to display device
  • G10L 15/26 - Speech to text systems

59.

VIRTUAL AR INTERFACES FOR CONTROLLING IOT DEVICES USING MOBILE DEVICE ORIENTATION SENSORS

      
Application Number US2023028776
Publication Number 2024/049592
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Cho, Youjean
  • Kim, Daekun
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Described are virtual AR interfaces for generating a virtual rotational interface for the purpose of controlling connected IoT devices using the inertial measurement unit (IMU) of a portable electronic device. The IMU control application enables a user of a portable electronic device to activate a virtual rotational interface overlay on a display and adjust a feature of a connected IoT product by rotating a portable electronic device. The device IMU moves a slider on the virtual rotational interface. The IMU control application sends a control signal to the IoT product which executes an action in accordance with the slider position. The virtual rotational interface is presented on the display as a virtual object in an AR environment. The IMU control application detects the device orientation (in the physical environment) and in response presents a corresponding slider element on the virtual rotational interface (in the AR environment).
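
The orientation-to-slider mapping can be sketched as clamping the IMU roll angle into a fixed range and normalizing it. The ±45° range and the `iot_client.set_level` call are illustrative assumptions, not the patented interface.

```python
import math


def roll_to_slider(roll_radians: float,
                   min_roll: float = -math.pi / 4,
                   max_roll: float = math.pi / 4) -> float:
    """Map device roll onto a 0.0-1.0 slider position."""
    clamped = max(min_roll, min(max_roll, roll_radians))
    return (clamped - min_roll) / (max_roll - min_roll)


def on_imu_update(roll_radians: float, iot_client) -> None:
    # Move the virtual slider, then tell the IoT product to match it,
    # e.g. setting a lamp's brightness level.
    iot_client.set_level(roll_to_slider(roll_radians))
```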

IPC Classes  ?

  • H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

60.

INTERACTION RECORDING TOOLS FOR CREATING INTERACTIVE AR STORIES

      
Application Number US2023028788
Publication Number 2024/049594
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Cho, Youjean
  • Kim, Daekun
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Recording tools for creating interactive AR experiences. An interaction recording application enables a user with little or no programming skills to perform and record user behaviors that are associated with reactions between story elements such as virtual objects and connected IoT devices. The user behaviors include a range of actions, such as speaking a trigger word and apparently touching a virtual object. The corresponding reactions include starting to record a subsequent scene and executing actions between story elements. The trigger recording interface is presented on the display as an overlay relative to the physical environment.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G02B 27/01 - Head-up displays
  • G16Y 10/75 - Information technology; Communication

61.

RECORDING FOLLOWING BEHAVIORS BETWEEN VIRTUAL OBJECTS AND USER AVATARS IN AR EXPERIENCES

      
Application Number US2023028882
Publication Number 2024/049596
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Robinson, Ava
  • Kim, Daekun
  • Cho, Youjean
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Described are recording tools for generating following behaviors and creating interactive AR experiences. The following recording application enables a user with little or no programming skills to virtually connect virtual objects to other elements, including virtual avatars representing fellow users, thereby creating an interactive story in which multiple elements are apparently and persistently connected. The following interface includes methods for selecting objects and instructions for connecting a virtual object to a target object. In one example, the recording application presents on the display a virtual tether between the objects until a connecting action is detected. The following interface is presented on the display as an overlay, in the foreground relative to the physical environment.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays

62.

VIRTUAL INTERFACES FOR CONTROLLING IOT DEVICES

      
Application Number US2023028885
Publication Number 2024/049597
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kim, Daekun
  • Zhang, Lei
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

A virtual interface application presented in augmented reality (AR) is described for controlling Internet of Things (IoT) products. The virtual interface application enables a user of a portable electronic device to activate a virtual control interface overlay on a display, receive a selection from the user using her hands or feet, and send a control signal to a nearby IoT product which executes an action in accordance with the selection. The virtual control interface is presented on the display as a virtual object in an AR environment. The virtual interface application includes a foot tracking tool for detecting an intersection between the foot location (in the physical environment) and the virtual surface position (in the AR environment). When an intersection is detected, the virtual interface application sends a control signal with instructions to the IoT product.
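
The foot-tracking hit test reduces to checking whether the tracked foot position intersects the virtual surface. A sketch assuming both positions are (x, y, z) coordinates in a shared AR frame and an illustrative hit radius:

```python
def foot_intersects_button(foot_pos, button_center, button_radius=0.15):
    """Return True when the foot position falls within the virtual
    control's hit radius (all coordinates in metres)."""
    dx, dy, dz = (f - b for f, b in zip(foot_pos, button_center))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= button_radius
```

On an intersection, the virtual interface application would send the control signal for the selected option to the nearby IoT product.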

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network

63.

MULTISENSORIAL PRESENTATION OF VOLUMETRIC CONTENT

      
Application Number US2023031066
Publication Number 2024/049700
Status In Force
Filing Date 2023-08-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

Input indicative of a selection of volumetric content for presentation is received. The volumetric content comprises a volumetric representation of one or more elements of a real-world three-dimensional space. In response to the input, device state data associated with the volumetric content is accessed. The device state data describes a state of one or more network-connected devices associated with the real-world three-dimensional space. The volumetric content is presented. The presentation of the volumetric content includes presentation of the volumetric representation of the one or more elements overlaid on the real-world three-dimensional space by a display device and configuring the one or more network-connected devices using the device state data.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 15/08 - Volume rendering

64.

CONTEXTUAL MEMORY EXPERIENCE TRIGGERS SYSTEM

      
Application Number US2023072274
Publication Number 2024/050229
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A system monitors an environment via one or more sensors included in a computing device and applies a trigger to detect that a memory experience is stored in a data store based on the monitoring. The system creates an augmented reality memory experience, a virtual reality memory experience, or a combination thereof, based on the trigger if the memory experience is detected. The system additionally projects the augmented reality memory experience, the virtual reality memory experience, or the combination thereof, via the computing device.

IPC Classes  ?

  • G06F 3/06 - Digital input from, or digital output to, record carriers

65.

SOCIAL MEMORY RE-EXPERIENCING SYSTEM

      
Application Number US2023072277
Publication Number 2024/050231
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A system monitors a user environment via one or more sensors included in a computing device and detects, via a trigger, that event data is stored in a data store based on the monitoring. The system further detects one or more participants in the event data and invites the one or more participants to share augmented reality event data and/or virtual reality event data. The system also creates, based on the event data, augmented reality event data and/or virtual reality event data, and presents the augmented reality event data and/or the virtual reality event data to the one or more participants in a synchronous mode and/or an asynchronous mode, via the computing device.

IPC Classes  ?

  • G06Q 50/10 - Services
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

66.

MULTI-PERSPECTIVE AUGMENTED REALITY EXPERIENCE

      
Application Number US2023072557
Publication Number 2024/050245
Status In Force
Filing Date 2023-08-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

The present disclosure relates to methods and systems for providing a multi-perspective augmented reality experience. A volumetric video of a three-dimensional space is captured. The volumetric video of the three-dimensional space includes a volumetric representation of a first user within the three-dimensional space. The volumetric video is displayed by a display device worn by a second user, and the second user sees the volumetric representation of the first user within the three-dimensional space. Input indicative of an interaction (e.g., entering or leaving) of the second user with the volumetric representation of the first user is detected. Based on detecting the input indicative of the interaction, the display device switches to a display of a recorded perspective of the first user. Thus, by interacting with a volumetric representation of the first user in a volumetric video, the second user views the first user's perspective of the three-dimensional space.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/20 - Perspective computation
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 15/08 - Volume rendering
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • G02B 27/01 - Head-up displays

67.

CONTROLLING AND EDITING PRESENTATION OF VOLUMETRIC CONTENT

      
Application Number US2023072568
Publication Number 2024/050246
Status In Force
Filing Date 2023-08-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A display device presents volumetric content comprising a volumetric video. The volumetric video comprises a volumetric representation of one or more elements of a three-dimensional space. Input indicative of a control operation associated with the presentation of the volumetric video is received. The presentation of the volumetric video by the display device is controlled by executing the control operation. While the control operation is being executed, the volumetric representation of the one or more elements of the three-dimensional space is displayed from multiple perspectives based on movement of a user.

IPC Classes  ?

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

68.

MIXING AND MATCHING VOLUMETRIC CONTENTS FOR NEW AUGMENTED REALITY EXPERIENCES

      
Application Number US2023072718
Publication Number 2024/050262
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony
  • Vaish, Rajan

Abstract

A volumetric content presentation system includes a head-worn display device, which includes one or more processors, and a memory storing instructions that, when executed by the one or more processors, configure the display device to access AR content items that correspond to either real-world objects or virtual objects, mix and match these AR content items, and present volumetric content that includes these mixed and matched AR content items overlaid on a real-world environment to create a new AR scene that a user can experience.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

69.

MULTI-DIMENSIONAL EXPERIENCE PRESENTATION USING AUGMENTED REALITY

      
Application Number US2023072726
Publication Number 2024/050264
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

The present disclosure relates to methods and systems for providing a presentation of an experience (e.g., a journey) to a user using augmented reality (AR). During a capture phase, persons in the journey may take videos or pictures using their smartphones, GoPros, and/or smart glasses. A drone may also take videos or pictures during the journey. During an experience phase, an AR topographical rendering of the real-world environment of the journey may be rendered on a tabletop, highlighting/animating a path persons took in the journey. The persons may be rendered as miniature avatars/dolls overlaid on the representation of the real-world environment. When the user clicks on a point in the presentation of the journey, a perspective (e.g., the videos or pictures) at that point is presented.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

70.

3D SPACE CARVING USING HANDS FOR OBJECT CAPTURE

      
Application Number US2023073217
Publication Number 2024/050460
Status In Force
Filing Date 2023-08-31
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Micusik, Branislav
  • Evangelidis, Georgios
  • Wolf, Daniel

Abstract

A method for carving a 3D space using hands tracking is described. In one aspect, a method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
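
The carving step can be pictured as deriving a bounded region of interest from the hand pixels and their depths, so that 3D reconstruction runs only inside it. The array shapes and the padding margin below are assumptions for illustration.

```python
import numpy as np


def carve_region(hand_pixels: np.ndarray, depths: np.ndarray,
                 margin: float = 0.1) -> dict:
    """Derive a 3D region of interest from tracked hand pixels (N, 2)
    and their sensed depths (N,) in metres."""
    u_min, v_min = hand_pixels.min(axis=0)
    u_max, v_max = hand_pixels.max(axis=0)
    return {
        "pixel_bounds": ((u_min, v_min), (u_max, v_max)),
        # Pad the depth interval so the enclosed object fits inside.
        "depth_bounds": (float(depths.min()) - margin,
                         float(depths.max()) + margin),
    }
```

A 3D reconstruction engine would then be applied to this region rather than to the full frame.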

IPC Classes  ?

  • G06T 7/50 - Depth or shape recovery
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G02B 27/01 - Head-up displays

71.

OBJECT COUNTING ON AR WEARABLE DEVICES

      
Application Number US2023030709
Publication Number 2024/044137
Status In Force
Filing Date 2023-08-21
Publication Date 2024-02-29
Owner SNAP INC. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon
  • Zakrzewski, Tomasz

Abstract

Systems, methods, and computer readable media for object counting on augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable display of a count of objects as part of a user view. Upon receipt of a request to count objects, the AR wearable device captures an image of the user view. The AR wearable device transmits the image to a backend for processing to determine the objects in the image. The AR wearable device selects a group of objects of the determined objects to count and overlays boundary boxes over counted objects within the user view. The position of the boundary boxes is adjusted to account for movement of the AR wearable device. A hierarchy of objects is used to group together objects that are related but have different labels or names.
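
The hierarchy-based grouping can be sketched as rolling related labels up to a common name before counting. The hierarchy table and detection format below are invented for illustration; the backend's actual output format is not described.

```python
from collections import Counter

# Illustrative hierarchy: related labels roll up to one group name.
HIERARCHY = {"sedan": "car", "suv": "car", "hatchback": "car"}


def count_objects(detections: list[dict]) -> Counter:
    """Count detections, grouping labels that are related but distinct."""
    return Counter(HIERARCHY.get(d["label"], d["label"]) for d in detections)


detections = [{"label": "sedan", "box": (0, 0, 10, 10)},
              {"label": "suv", "box": (20, 0, 30, 10)}]
print(count_objects(detections))  # Counter({'car': 2})
```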

IPC Classes  ?

  • G06T 7/11 - Region-based segmentation
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06F 3/16 - Sound input; Sound output
  • G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations

72.

GROUPING SIMILAR WORDS IN A LANGUAGE MODEL

      
Application Number US2023072579
Publication Number 2024/044544
Status In Force
Filing Date 2023-08-21
Publication Date 2024-02-29
Owner SNAP INC. (USA)
Inventor
  • Assa, Jackie
  • Bekker, Alan
  • Moshe, Zach

Abstract

Systems and methods are provided for performing automated speech recognition. The systems and methods access a language model (LM) that includes a plurality of n-grams, each of the plurality of n-grams comprising a respective sequence of words and a corresponding LM score, and receive a list of words associated with a group classification, each word in the list of words being associated with a respective weight. The systems and methods compute, based on the LM scores of the plurality of n-grams, a probability that a given word in the list of words associated with the group classification appears in an n-gram in the LM comprising an individual sequence of words, and add one or more new n-grams to the LM comprising one or more words in the list of words in combination with the individual sequence of words and associated with a particular LM score based on the computed probability.
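
One plausible reading of the augmentation step, sketched below: sum the LM probability mass the context already gives to seen group members, then assign unseen members a share of that mass by weight. The log10 score convention and the data shapes are assumptions, not the patented method.

```python
import math


def augment_lm(lm_scores: dict, group_words: dict, context: tuple) -> dict:
    """lm_scores: n-gram tuple -> log10 score; group_words: word -> weight;
    context: the individual sequence of words to extend."""
    # Probability mass the LM assigns to seen group-member continuations.
    mass = sum(10 ** lm_scores[context + (w,)]
               for w in group_words if context + (w,) in lm_scores)
    augmented = dict(lm_scores)
    for word, weight in group_words.items():
        ngram = context + (word,)
        if ngram not in augmented and mass > 0:
            # Share the observed group mass across unseen members by weight.
            augmented[ngram] = math.log10(mass * weight)
    return augmented
```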

IPC Classes  ?

  • G10L 15/197 - Probabilistic grammars, e.g. word n-grams
  • G06F 40/284 - Lexical analysis, e.g. tokenisation or collocates

73.

LAYER FREEZING AND DATA SIEVING FOR SPARSE TRAINING

      
Application Number US2023028253
Publication Number 2024/044004
Status In Force
Filing Date 2023-07-20
Publication Date 2024-02-29
Owner SNAP INC. (USA)
Inventor
  • Ren, Jian
  • Tulyakov, Sergey
  • Li, Yanyu
  • Yuan, Geng

Abstract

Factors affecting sparse training efficiency are considered, e.g., support for sparse computation, layer type and size, and system overhead. The FLOPs reduction from the frozen layers and shrunken dataset leads to higher actual training acceleration than weight sparsity.

IPC Classes  ?

  • G06N 3/09 - Supervised learning
  • G06N 3/0495 - Quantised networks; Sparse networks; Compressed networks

74.

AVATAR CALL ON AN EYEWEAR DEVICE

      
Application Number US2023030711
Publication Number 2024/044138
Status In Force
Filing Date 2023-08-21
Publication Date 2024-02-29
Owner SNAP INC. (USA)
Inventor
  • Tran, Lien Le Hong
  • Saunders, Matthew
  • Skrypnyk, Daria
  • Canberk, Ilteris Kaan

Abstract

Systems and methods are provided for performing voice communication operations. The system establishes, by a first augmented reality (AR) device, a voice communication session between a plurality of users. The system displays, by the first AR device of a first user of the plurality of users, an avatar representing a second user of the plurality of users. The system receives, by the first AR device of the first user, input from the first user that selects a display position for the avatar representing the second user within a real-world environment of the first user. The system animates the avatar representing the second user based on movement information received from a second AR device of the second user.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

75.

EXTERNAL COMPUTER VISION FOR AN EYEWEAR DEVICE

      
Application Number US2023030818
Publication Number 2024/044184
Status In Force
Filing Date 2023-08-22
Publication Date 2024-02-29
Owner SNAP INC. (USA)
Inventor
  • Canberk, Ilteris Kaan
  • Hallberg, Matthew
  • Saunders, Matthew
  • Skrypnyk, Daria
  • Tran, Lien Le Hong

Abstract

Systems and methods are provided for performing operations on an augmented reality (AR) device using an external vision system. The system establishes, by the AR device, a communication with an external client device. The system overlays, by the AR device, a first AR object on a real-world environment being viewed using the AR device. The system receives interaction data from the external client device representing movement of a user determined by the external client device. The system, in response to receiving the interaction data from the external client device, modifies the first AR object by the AR device.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06N 20/00 - Machine learning

76.

HAND-TRACKING STABILIZATION

      
Application Number US2023071923
Publication Number 2024/044473
Status In Force
Filing Date 2023-08-09
Publication Date 2024-02-29
Owner SNAP INC. (USA)
Inventor Lucas, Benjamin

Abstract

An Augmented Reality (AR) system (352) provides stabilization of hand-tracking input data. The AR system provides for display a user interface of an AR application (328). The AR system captures, using one or more cameras (326) of the AR system, video frame tracking data of a gesture being made by a user (332) while the user interacts with the AR user interface. The AR system generates skeletal 3D model data of a hand of the user based on the video frame tracking data that includes one or more skeletal 3D model features corresponding to recognized visual landmarks of portions of the hand of the user. The AR system generates targeting data based on the skeletal 3D model data where the targeting data identifies a virtual 3D object of the AR user interface. The AR system filters the targeting data using a targeting filter component and provides the filtered targeting data to the AR application.
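
The abstract does not specify the filter design; a simple stand-in is exponential smoothing of the targeting position, which damps frame-to-frame jitter in the hand-tracking input. The class below is illustrative only.

```python
class TargetingFilter:
    """Exponential smoothing of 3D targeting positions."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha  # lower alpha = stronger smoothing
        self._state = None

    def filter(self, position: tuple[float, float, float]):
        if self._state is None:
            self._state = position
        else:
            self._state = tuple(self.alpha * new + (1.0 - self.alpha) * old
                                for new, old in zip(position, self._state))
        return self._state


f = TargetingFilter()
print(f.filter((0.10, 0.20, 0.50)))
print(f.filter((0.12, 0.19, 0.52)))  # jitter damped toward previous sample
```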

IPC Classes  ?

  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 20/00 - Scenes; Scene-specific elements
  • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data

77.

TEXT-GUIDED STICKER GENERATION

      
Application Number US2023071003
Publication Number 2024/039957
Status In Force
Filing Date 2023-07-26
Publication Date 2024-02-22
Owner SNAP INC. (USA)
Inventor
  • Ghosh, Arnab
  • Ren, Jian
  • Savchenkov, Pavel
  • Tulyakov, Sergey

Abstract

A method of generating an image for use in a conversation taking place in a messaging application is disclosed. Conversation input text is received from a user of a portable device that includes a display. Model input text is generated from the conversation input text, which is processed with a text-to-image model to generate an image based on the model input text. The generated image is displayed on the portable device, and user input is received to transmit the image to a remote recipient.

IPC Classes  ?

78.

CONTEXTUAL TEST CODE GENERATION

      
Application Number US2023071916
Publication Number 2024/039986
Status In Force
Filing Date 2023-08-09
Publication Date 2024-02-22
Owner SNAP INC. (USA)
Inventor
  • Tran, Adrian
  • Wang, Sichao

Abstract

At least one unit of a software application is identified. The at least one unit includes source code. The source code of the at least one unit is analyzed to determine a style of the source code. Metadata is extracted from the at least one unit based on the source code analysis. One or more features of the extracted metadata are classified. A template file is modified based on the extracted metadata and the classified features to create a modified template file.

IPC Classes  ?

  • G06F 8/75 - Structural analysis for program understanding
  • G06F 11/36 - Preventing errors by testing or debugging of software

79.

DETECTING WEAR STATUS OF WEARABLE DEVICE

      
Application Number US2023030154
Publication Number 2024/039605
Status In Force
Filing Date 2023-08-14
Publication Date 2024-02-22
Owner SNAP INC. (USA)
Inventor
  • Heger, Jason
  • Hu, Dunxu
  • Nachtigall, Eric
  • Nilles, Gerald
  • Olgun, Ugur
  • Vadivelu, Praveen Babu

Abstract

Methods and systems are disclosed for detecting whether a wearable device is being worn by a user. The system transmits a radio signal from a first communication device of a wearable device to a second communication device of the wearable device and measures a signal strength associated with the radio signal received by the second communication device. The system compares the signal strength to a threshold value and generates an indication of a wear status associated with the wearable device based on comparing the signal strength to the threshold value.
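
A minimal sketch of the claimed comparison, assuming that absorption by the wearer's head attenuates the radio link between the two on-device communication devices; the threshold value and the hysteresis refinement are assumptions added for illustration.

    # Wear detection: compare measured RSSI against a threshold, with hysteresis.

    WORN_THRESHOLD_DBM = -55.0    # assumed: head absorption weakens the signal
    HYSTERESIS_DB = 3.0

    def wear_status(rssi_dbm: float, currently_worn: bool) -> bool:
        """A signal attenuated below the threshold suggests the device is worn."""
        threshold = WORN_THRESHOLD_DBM + (HYSTERESIS_DB if currently_worn else 0.0)
        return rssi_dbm < threshold

    worn = False
    for rssi in [-40.0, -58.0, -53.0, -60.0]:
        worn = wear_status(rssi, worn)
        print(rssi, "->", "worn" if worn else "not worn")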

IPC Classes  ?

  • H04B 17/318 - Received signal strength
  • H04R 1/10 - Earpieces; Attachments therefor
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements

80.

CELLULAR ARCHITECTURES FOR AR CAPABLE WEARABLE DEVICES

      
Application Number US2023030235
Publication Number 2024/039650
Status In Force
Filing Date 2023-08-15
Publication Date 2024-02-22
Owner SNAP INC. (USA)
Inventor
  • Olgun, Ugur
  • Heger, Jason
  • Vadivelu, Praveen Babu
  • Wakser, Jordan

Abstract

Examples include a wearable device having a frame, a temple, and onboard electronics components. The frame can optionally be configured to hold one or more optical elements. The temple can optionally be connected to the frame at a joint such that the temple is disposable between a collapsed condition and a wearable condition in which the wearable device is wearable by a user to hold the one or more optical elements within user view. The onboard electronics components can be carried by at least one of the frame and the temple and can include a first antenna configured for cellular communication carried by the frame and a second antenna configured for cellular communication carried by one of the frame or the temple.

IPC Classes  ?

81.

INTERACTING WITH VISUAL CODES WITHIN MESSAGING SYSTEM

      
Application Number US2023071671
Publication Number 2024/039977
Status In Force
Filing Date 2023-08-04
Publication Date 2024-02-22
Owner SNAP INC. (USA)
Inventor Velicodnii, Vadim

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for interacting with visual codes within a messaging system. The program and method provide for displaying, by a messaging application, captured image data comprising a visual code, the visual code including a custom graphic and being decodable to access a first feature of the messaging application; receiving user input selecting the visual code; displaying an updated version of the custom graphic; providing an animation which depicts the updated version of the custom graphic as moving from the visual code to an interface element comprising a group of icons, each icon within the group of icons being user-selectable to access a respective second feature of the messaging application; and updating the group of icons to include an additional icon which is user-selectable to access the first feature of the messaging application.

IPC Classes  ?

  • H04L 51/10 - Multimedia information
  • H04L 51/18 - Commands or executable codes
  • H04L 51/52 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services

82.

EXTERNAL CONTROLLER FOR AN EYEWEAR DEVICE

      
Application Number US2023029814
Publication Number 2024/035763
Status In Force
Filing Date 2023-08-09
Publication Date 2024-02-15
Owner SNAP INC. (USA)
Inventor
  • Canberk, Ilteris Kaan
  • Hallberg, Matthew
  • Miller, William Miles
  • Tran, Lien Le Hong
  • Tucker, Michael Benson

Abstract

Systems and methods are provided for using an external controller with an AR device. The system establishes, by one or more processors of the AR device, a communication with an external client device. The system overlays, by the AR device, a first AR object on a real-world environment being viewed using the AR device. The system receives interaction data from the external client device representing one or more inputs received by the external client device and, in response, modifies the first AR object by the AR device.

IPC Classes  ?

  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • A63F 13/428 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
  • A63F 13/426 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment
  • A63F 13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
  • A63F 13/537 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

83.

LIGHT PROJECTOR

      
Application Number EP2023070755
Publication Number 2024/033091
Status In Force
Filing Date 2023-07-26
Publication Date 2024-02-15
Owner
  • SNAP INC. (USA)
  • SNAP GROUP LIMITED (United Kingdom)
Inventor
  • Pennell, Brennan Lloyd
  • Woods, David Mark

Abstract

A projector (10) for an augmented reality or mixed reality headset is disclosed, comprising: a display (20) defining an optical axis (9), configured to receive light (41) to generate first supplied light (40); an exit pupil (80) configured to couple the first supplied light into a waveguide for an augmented reality or mixed reality headset, the exit pupil comprising a first region and a second region; a first optical arrangement (60) configured to couple the first supplied light from the display towards the exit pupil; and a first light blocker (70). A first partial reflection (50), which is an undesirable reflection of the first supplied light, can be reflected back into the light projector. The light projector is configured to separate the supplied and reflected light spatially and prevent the reflected light from being coupled into the waveguide using the light blocker.

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/18 - Optical systems or apparatus not provided for by any of the groups, for optical projection, e.g. combination of mirror and condenser and objective
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising
  • G02B 27/01 - Head-up displays

84.

AUTOMATIC QUANTIZATION OF A FLOATING POINT MODEL

      
Application Number US2023071664
Publication Number 2024/036082
Status In Force
Filing Date 2023-08-04
Publication Date 2024-02-15
Owner SNAP INC. (USA)
Inventor
  • Makoviichuk, Denys
  • Wang, Jiazhuo
  • Wen, Yang

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for automatic quantization of a floating point model. The program and method provide for providing a floating point model to an automatic quantization library, the floating point model being configured to represent a neural network, and the automatic quantization library being configured to generate a first quantized model based on the floating point model; providing a function to the automatic quantization library, the function being configured to run a forward pass on a given dataset for the floating point model; causing the automatic quantization library to generate the first quantized model based on the floating point model; causing the automatic quantization library to calibrate the first quantized model by running the first quantized model on the function; and converting the calibrated first quantized model to a second quantized model.
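
The calibration step at the heart of such a flow can be illustrated with generic post-training quantization math: observe activation ranges during the user-supplied forward pass, then derive symmetric int8 scales. This is a sketch of the general technique, not Snap's library or its API.

    # Generic post-training quantization: calibrate a scale, then quantize.

    import numpy as np

    def calibrate_scale(activations: np.ndarray) -> float:
        """Symmetric int8 scale from the observed absolute maximum."""
        return float(np.max(np.abs(activations))) / 127.0

    def quantize(x: np.ndarray, scale: float) -> np.ndarray:
        return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

    # "Forward pass on a given dataset": here, fake activations from three batches.
    observed = np.concatenate([np.random.randn(32) for _ in range(3)])
    scale = calibrate_scale(observed)
    q = quantize(observed, scale)
    print(scale, q[:5], (q * scale)[:5])   # dequantized values approximate originals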

IPC Classes  ?

  • G06N 3/045 - Combinations of networks
  • G06N 3/0495 - Quantised networks; Sparse networks; Compressed networks
  • H03M 7/24 - Conversion to or from floating-point codes

85.

VOICE INPUT FOR AR WEARABLE DEVICES

      
Application Number US2023029111
Publication Number 2024/030373
Status In Force
Filing Date 2023-07-31
Publication Date 2024-02-08
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Gurgul, Piotr
  • Zakrzewski, Tomasz

Abstract

Systems, methods, and computer readable media for voice input for augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable a user to interact with the AR wearable device without using physical user interface devices. A keyword is used to indicate that the user is about to speak an action or command. The AR wearable device divides the processing of the audio data into a keyword module that is trained to recognize the keyword and a module to process the audio data after the keyword. In some embodiments, the AR wearable device transmits the audio data after the keyword to a host device to process. The AR wearable device maintains an application registry that associates actions with applications. Applications can be downloaded, and the application registry updated where the applications indicate actions to associate with the application.
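
A toy sketch of the two-stage split: a small always-on keyword module gates a heavier command module that consults the application registry. Plain string matching stands in for the trained keyword model, and the wake word and registry entries are invented for illustration.

    # Keyword module gates command processing; registry maps actions to apps.

    KEYWORD = "hey snap"    # assumed wake word; the publication does not name one

    def keyword_module(transcript: str) -> bool:
        return transcript.lower().startswith(KEYWORD)

    def command_module(transcript: str, registry: dict) -> str:
        action = transcript.lower().removeprefix(KEYWORD).strip()
        app = registry.get(action, "<no app registered>")
        return f"route '{action}' -> {app}"

    registry = {"take a photo": "CameraApp", "start navigation": "MapsApp"}
    for utterance in ["hey snap take a photo", "unrelated speech"]:
        if keyword_module(utterance):
            print(command_module(utterance, registry))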

IPC Classes  ?

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 1/16 - Constructional details or arrangements

86.

MUTUAL AFFINITY WIDGET

      
Application Number US2023071005
Publication Number 2024/030795
Status In Force
Filing Date 2023-07-26
Publication Date 2024-02-08
Owner SNAP INC. (USA)
Inventor
  • Jonik, Daniel
  • Moreno, Daniel
  • Wang, Yu

Abstract

An interaction system that enables users in mutual affinity relationships to send messages. The interaction applications of two or more users receive notifications of a mutual affinity relationship between the first user and the second user. The interaction applications configure respective mutual affinity widgets by associating each mutual affinity widget with the respective other user. Icons of the mutual affinity widgets are provided on the users' respective home screens. Upon detecting a selection of the mutual affinity widget by a first user, a message creation interface is provided to the first user and a message is generated based on an image captured by the first user using the message creation user interface. The message is then sent to a second user, who uses their own mutual affinity widget to access the message.

IPC Classes  ?

  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 9/451 - Execution arrangements for user interfaces

87.

COORDINATING DATA ACCESS AMONG MULTIPLE SERVICES

      
Application Number US2023071619
Publication Number 2024/031025
Status In Force
Filing Date 2023-08-03
Publication Date 2024-02-08
Owner SNAP INC. (USA)
Inventor
  • Badrinarayanan, Saikrishna
  • Chen, Guangyu
  • Chopra, Samarth
  • Deshpande, Apoorvaa
  • Javaheri, Hooman
  • Naveed, Muhammad
  • Papadimitriou, Antonios
  • Shiehian, Sina
  • Yeganeh, Bahador
  • Zhuang, Di

Abstract

Methods and systems are disclosed for managing access to encrypted data and encryption keys. The system stores, by a key management server, a first encryption key associated with a first service and a second encryption key associated with a second service. The system prevents, by the key management server, the second service from accessing the second encryption key while the first service is performing a first function using the first encryption key and determines that a first threshold period of time associated with the first function has elapsed. The system, in response to determining that the first threshold period of time associated with the first function has elapsed, prevents, by the key management server, the first service from accessing the first encryption key while the second service is performing a second function using the second encryption key.
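
The sketch below models the claimed exclusivity as a toy key manager: one service holds its key for a threshold period before the other service may obtain its own. The class shape and timing details are illustrative only.

    # Time-sliced key access: only one service's key is released at a time.

    import time

    class KeyManager:
        def __init__(self, keys: dict, slice_seconds: float):
            self.keys = keys                    # service name -> encryption key
            self.slice = slice_seconds
            self.holder = None
            self.acquired_at = 0.0

        def get_key(self, service: str):
            now = time.monotonic()
            if self.holder and self.holder != service:
                if now - self.acquired_at < self.slice:
                    raise PermissionError(f"{self.holder} still holds its slice")
            self.holder, self.acquired_at = service, now
            return self.keys[service]

    km = KeyManager({"svc_a": b"key-a", "svc_b": b"key-b"}, slice_seconds=0.1)
    print(km.get_key("svc_a"))
    time.sleep(0.11)                # wait out svc_a's threshold period
    print(km.get_key("svc_b"))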

IPC Classes  ?

  • H04L 9/40 - Network security protocols
  • H04W 12/02 - Protecting privacy or anonymity, e.g. protecting personally identifiable information [PII]
  • H04L 9/08 - Key distribution

88.

MEDICAL IMAGE OVERLAYS FOR AUGMENTED REALITY EXPERIENCES

      
Application Number US2023028251
Publication Number 2024/030269
Status In Force
Filing Date 2023-07-20
Publication Date 2024-02-08
Owner SNAP INC. (USA)
Inventor Sourov, Alexander

Abstract

A medical image overlay application for use with augmented reality (AR) eyewear devices. The image overlay application enables a user of an eyewear device to activate an image overlay on a display when the eyewear device detects that the camera field of view includes a medical image location. Medical image locations are defined relative to virtual markers. The image overlay includes one or more medical images, presented according to a configurable transparency value. An image registration tool transforms the location and scale of each medical image to the physical environment, such that the medical image as presented on the display closely matches the location and size of real objects.

IPC Classes  ?

  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms

89.

MAGNIFIED OVERLAYS CORRELATED WITH VIRTUAL MARKERS

      
Application Number US2023028171
Publication Number 2024/025779
Status In Force
Filing Date 2023-07-19
Publication Date 2024-02-01
Owner SNAP INC. (USA)
Inventor
  • Sourov, Alexander
  • Robertson, John James

Abstract

A magnification application for use with augmented reality (AR) eyewear devices. The magnification application enables a user of an eyewear device to activate a magnification overlay on a display whenever a camera on the eyewear device detects that the field of view includes a registered virtual marker. The magnified overlay includes one or more frames of the captured video data, presented according to a predefined and configurable magnification power. A pointer including a vector and a visual tether guides the user toward the virtual marker. When the eyewear device location is near a perimeter associated with the virtual marker, the magnified overlay appears in a predefined and configurable frame on the display.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

90.

VIRTUAL WARDROBE AR EXPERIENCE

      
Application Number US2023028481
Publication Number 2024/025830
Status In Force
Filing Date 2023-07-24
Publication Date 2024-02-01
Owner SNAP INC. (USA)
Inventor
  • Assouline, Avihay
  • Berger, Itamar
  • Dudovitch, Gal
  • Harel, Peleg
  • Mishin Shuvi, Ma'Ayan

Abstract

Aspects of the present disclosure involve a system for providing AR experiences. The system accesses, by a messaging application, an image depicting a real-world fashion item of a user and generates a three-dimensional (3D) virtual fashion item based on the real-world fashion item depicted in the image. The system stores the 3D virtual fashion item in a database that includes a virtual wardrobe comprising a plurality of 3D virtual fashion items associated with the user. The system generates, by the messaging application, an augmented reality (AR) experience that allows the user to interact with the virtual wardrobe.

IPC Classes  ?

  • G06T 17/00 - 3D modelling for computer graphics

91.

VIDEO PROCESSING WITH PREVIEW OF AR EFFECTS

      
Application Number CN2022108422
Publication Number 2024/020908
Status In Force
Filing Date 2022-07-28
Publication Date 2024-02-01
Owner SNAP INC. (USA)
Inventor
  • Zhu, Cai
  • Liu, Chuangwen
  • Wu, Haoyun
  • Yuan, Weihao

Abstract

Image augmentation effects are provided on a device that includes a display and a camera. A simplified augmented reality effect is applied to a stream of images captured by the camera, to generate a preview stream of images. The preview stream of images is displayed on the display. A second stream of images corresponding to the first stream of images is saved to an initial video file. A full augmented reality effect, corresponding to the simplified augmented reality effect, is then applied to the second stream of images to generate a fully-augmented stream of images, which are saved to a further video file. The further video file can then be played back on the display to show the final, fully augmented reality effect as applied to the stream of images.
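
In outline, the two-pass flow looks like the following sketch, where string tags stand in for real image processing: the simplified effect runs per frame for the live preview, while the full effect is applied afterwards to the saved frames.

    # Two-pass rendering: cheap effect live, full effect applied offline.

    def simplified_effect(frame: str) -> str:
        return frame + "+preview_fx"     # cheap approximation, runs per frame

    def full_effect(frame: str) -> str:
        return frame + "+full_fx"        # expensive version, runs after capture

    captured = [f"frame{i}" for i in range(3)]

    preview_stream = [simplified_effect(f) for f in captured]   # shown live
    initial_video = list(captured)                              # saved as captured
    final_video = [full_effect(f) for f in initial_video]       # re-rendered later
    print(final_video)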

IPC Classes  ?

  • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs

92.

SMART DEVICE INCLUDING OLFACTORY SENSING

      
Application Number US2023028087
Publication Number 2024/020064
Status In Force
Filing Date 2023-07-19
Publication Date 2024-01-25
Owner SNAP INC. (USA)
Inventor
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Vaish, Rajan

Abstract

An electronic device with an olfactory detector including an array of olfactory sensors for determining scents. Each sensor in the array is tuned to detect the presence and concentration of specific chemical compounds or molecules. A fan creates an airflow of ambient air across the olfactory sensors. An analog-to-digital (A/D) converter receives and processes the sensor outputs of the olfactory sensors and provides the processed sensor output to a processor. Scent type and intensity can be classified by using the information from the scent sensors as input to a machine learning model generated through supervised training on labeled example measurements from the sensor array. The processor may display information about the determined scents on a display of a smart device, and can also send information indicative of the determined scents to another device.
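
As a stand-in for the supervised model, the sketch below classifies a sensor-array reading with a nearest-centroid rule over labeled calibration measurements; the channels, labels, and intensity proxy are invented for illustration.

    # Nearest-centroid scent classification over sensor-array readings.

    import numpy as np

    # Labeled calibration measurements: rows are readings from the sensor array.
    train = {
        "coffee": np.array([[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]]),
        "citrus": np.array([[0.1, 0.8, 0.3], [0.2, 0.9, 0.2]]),
    }
    centroids = {scent: x.mean(axis=0) for scent, x in train.items()}

    def classify(reading: np.ndarray) -> tuple[str, float]:
        scent = min(centroids, key=lambda s: np.linalg.norm(reading - centroids[s]))
        intensity = float(reading.max())     # crude intensity proxy (assumed)
        return scent, intensity

    print(classify(np.array([0.85, 0.15, 0.2])))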

IPC Classes  ?

  • G01N 33/00 - Investigating or analysing materials by specific methods not covered by groups

93.

SECURE PEER-TO-PEER CONNECTIONS BETWEEN MOBILE DEVICES

      
Application Number US2023070415
Publication Number 2024/020389
Status In Force
Filing Date 2023-07-18
Publication Date 2024-01-25
Owner SNAP INC. (USA)
Inventor
  • Zhuang, Richard
  • Hallberg, Matthew

Abstract

A communication link is established between a first mobile device and a second mobile device using communication setup information in a machine-readable code that is displayed on a display of the second mobile device. The first mobile device captures and decodes an image of the machine-readable code to extract dynamically-generated communication setup information. A communication link is then established between the two devices using the communication setup information. The machine-readable code may also be used as a fiducial marker to establish an initial relative pose between the two devices. Pose updates received from the second mobile device can then be used as user-interface inputs to the first mobile device.
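
A minimal sketch of the setup exchange, with JSON standing in for the machine-readable code's payload and the actual transport left out; the field names and the per-attempt token are assumptions.

    # Device B publishes dynamically generated setup info; device A decodes it.

    import json
    import secrets

    def make_setup_payload(host: str, port: int) -> str:
        """Device B: the session token is regenerated per pairing attempt."""
        return json.dumps({"host": host, "port": port, "token": secrets.token_hex(8)})

    def connect_from_payload(payload: str) -> dict:
        """Device A: decode the scanned payload and use it to open the link."""
        info = json.loads(payload)
        # A real client would now dial info["host"]:info["port"] with the token.
        return info

    payload = make_setup_payload("192.168.1.20", 7000)
    print(connect_from_payload(payload))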

IPC Classes  ?

94.

OLFACTORY STICKERS FOR CHAT AND AR-BASED MESSAGING

      
Application Number US2023028053
Publication Number 2024/020049
Status In Force
Filing Date 2023-07-18
Publication Date 2024-01-25
Owner SNAP INC. (USA)
Inventor
  • Monroy-Hernández, Andrés
  • Kratz, Sven
  • Vaish, Rajan

Abstract

An olfactory sticker used in chats between electronic devices to make messaging more immersive, personalized, and authentic by integrating olfactory information with traditional text-based formats and AR messaging. Olfactory stickers are used to indicate to a recipient that a message includes olfactory information. The olfactory sticker illustrates a graphical representation of a particular scent that can be sent and received via a chat or an AR message. Olfactory stickers provide the recipient control of accessing the transmitted scent at a desired time. This is particularly useful since certain olfactory information, i.e., scents, can be very direct and intrusive. Olfactory stickers are activated (i.e., release their scent) when they are tapped or rubbed by the recipient.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04M 1/21 - Combinations with auxiliary equipment, e.g. with clocks or memoranda pads
  • A61L 9/12 - Apparatus, e.g. holders, therefor
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • B05B 17/00 - Apparatus for spraying or atomising liquids or other fluent materials, not covered by any other group of this subclass

95.

EYEWEAR WITH NON-POLARIZING AMBIENT LIGHT DIMMING

      
Application Number US2023028167
Publication Number 2024/020110
Status In Force
Filing Date 2023-07-19
Publication Date 2024-01-25
Owner SNAP INC. (USA)
Inventor
  • Mathur, Viabhav
  • Schuck, Miller
  • Matranga, Mario

Abstract

An electronic eyewear device including a spatial dimming pixel panel within an optical assembly that delivers improved backlighting conditions for displayed images. The spatial dimming pixel panel is spatially dimmed where an image is positioned on a display, while the unoccupied area of the display is not dimmed, leaving the real-world view through that portion of the display unaltered. The spatial dimming pixel panel is made of multiple liquid crystal cells arranged in a gridded orientation with a dye-doped or guest-host liquid crystal system having a phase change mode with homeotropic alignment. The spatial dimming pixel panel absorbs nonpolarized light when a voltage is applied across the cells and passes nonpolarized light in the absence of a voltage across the cells.

IPC Classes  ?

  • G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source
  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
  • G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent
  • G02B 27/01 - Head-up displays

96.

SINGLE IMAGE THREE-DIMENSIONAL HAIR RECONSTRUCTION

      
Application Number US2023070732
Publication Number 2024/020559
Status In Force
Filing Date 2023-07-21
Publication Date 2024-01-25
Owner SNAP INC. (USA)
Inventor
  • Huang, Zeng
  • Chai, Menglei
  • Tulyakov, Sergey
  • Olszewski, Kyle
  • Lee, Hsin-Ying

Abstract

A system enables 3D hair reconstruction and rendering from a single reference image by performing a multi-stage process that utilizes both a 3D implicit representation and a 2D parametric embedding space.

IPC Classes  ?

97.

BOOSTING WORDS IN AUTOMATED SPEECH RECOGNITION

      
Application Number US2023069953
Publication Number 2024/015782
Status In Force
Filing Date 2023-07-11
Publication Date 2024-01-18
Owner SNAP INC. (USA)
Inventor
  • Assa, Jackie
  • Bekker, Alan
  • Moshe, Zach

Abstract

Systems and methods are provided for performing automated speech recognition. The systems and methods perform operations comprising: accessing a language model that includes a plurality of n-grams, each of the plurality of n-grams comprising a respective sequence of words and corresponding LM score; selecting a target word to boost in the language model; receiving a boosting factor for the target word; identifying a target n-gram in the language model that includes the target word; identifying a subset of n-grams of the plurality of n-grams that include words in a portion of the target n-gram; and adjusting the LM score of the target n-gram based on the LM scores of the subset of n-grams and the boosting factor.
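
The abstract does not give the exact adjustment rule, so the sketch below invents one in its spirit: n-grams containing the target word are raised toward the best score among n-grams sharing their prefix, plus the boosting factor. Scores are treated as log-probabilities, so a positive factor makes the boosted n-grams more likely relative to their neighbors.

    # Boost LM scores of n-grams containing a target word.

    lm = {  # n-gram -> log-probability score
        ("send", "a", "snap"): -2.1,
        ("send", "a", "map"): -3.5,
        ("take", "a", "snap"): -2.8,
    }

    def boost(lm: dict, target: str, factor: float) -> None:
        for ngram, score in list(lm.items()):
            if target not in ngram:
                continue
            # Neighbors sharing the n-gram's prefix inform the adjusted score.
            prefix = ngram[:-1]
            neighbors = [s for g, s in lm.items() if g[:-1] == prefix and g != ngram]
            baseline = max(neighbors) if neighbors else score
            lm[ngram] = max(score, baseline) + factor   # boosted, less negative

    boost(lm, "snap", factor=0.5)
    print(lm)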

IPC Classes  ?

  • G10L 15/197 - Probabilistic grammars, e.g. word n-grams
  • G06F 40/284 - Lexical analysis, e.g. tokenisation or collocates

98.

INCREMENTAL SCANNING FOR CUSTOM LANDMARKERS

      
Application Number US2023070147
Publication Number 2024/015917
Status In Force
Filing Date 2023-07-13
Publication Date 2024-01-18
Owner SNAP INC. (USA)
Inventor
  • Aljubeh, Marwan
  • Jubatyrov, Nursultan
  • Nersesian, Eric
  • Tanathong, Supannee

Abstract

A method for generating an updated localizer map of a reference scene is provided. The method may include: acquiring a preliminary localizer map of the reference scene, the preliminary localizer map including preliminary frames that are each associated with a set of preliminary data points; capturing a plurality of new data points on the reference scene; determining poses of the capture device related to the capture of the plurality of new data points; creating at least one new frame; selecting one or more target frames from the preliminary frames and the at least one new frame; and generating the updated localizer map based on the one or more target frames.
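
The selection criterion for target frames is not spelled out in the abstract; the sketch below invents a simple pose-distance rule to make the update concrete, with poses reduced to 2D positions purely for illustration.

    # Keep all new frames; keep preliminary frames not superseded by a nearby
    # new capture pose.

    import math

    preliminary = {"f1": (0.0, 0.0), "f2": (5.0, 0.0)}   # frame id -> capture pose
    new_frames = {"n1": (0.1, 0.0)}                      # built from new data points

    def select_targets(preliminary: dict, new: dict, min_dist: float = 1.0) -> dict:
        targets = dict(new)                              # new frames always kept
        for fid, pose in preliminary.items():
            if all(math.dist(pose, p) >= min_dist for p in new.values()):
                targets[fid] = pose                      # not superseded; keep it
        return targets

    print(select_targets(preliminary, new_frames))       # f1 replaced by n1, f2 kept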

IPC Classes  ?

  • G06T 7/579 - Depth or shape recovery from multiple images from motion

99.

APPLYING ANIMATED 3D AVATAR IN AR EXPERIENCES

      
Application Number US2023026917
Publication Number 2024/010800
Status In Force
Filing Date 2023-07-05
Publication Date 2024-01-11
Owner SNAP INC. (USA)
Inventor
  • Assouline, Avihay
  • Berger, Itamar
  • Guler, Riza Alp
  • Kakolyris, Antonios
  • Lu, Frank
  • Wang, Haoyang
  • Zohar, Matan

Abstract

Aspects of the present disclosure involve a system for providing virtual experiences. The system accesses, by a messaging application, an image depicting a person. The system generates, by the messaging application, a three- dimensional (3D) avatar based on the person depicted in the image. The system receives input that selects a pose for the 3D avatar and one or more fashion items to be worn by the 3D avatar and places, by the messaging application, the 3D avatar in the selected pose and wearing the one or more fashion items in an augmented reality (AR) experience.

IPC Classes  ?

  • G06T 17/00 - 3D modelling for computer graphics
  • G06T 19/006 - Mixed reality
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

100.

LOW-POWER ARCHITECTURE FOR AUGMENTED REALITY DEVICE

      
Application Number US2023069708
Publication Number 2024/011175
Status In Force
Filing Date 2023-07-06
Publication Date 2024-01-11
Owner SNAP INC. (USA)
Inventor
  • Arya, Ashwani
  • Feinman, Alex
  • Harris, Daniel
  • Bahulkar, Tejas
  • Hu, Dunxu

Abstract

A method for managing power resources in an augmented reality (AR) device is described. In one aspect, the method includes configuring a low-power mode to run on a low-power processor of the AR device using a first set of sensor data, and a high-power mode to run on a high-power processor of the AR device using a second set of sensor data; operating, using the low-power processor, a low-power application in the low-power mode based on the first set of sensor data; detecting a request to operate a high-power application at the AR device; in response to detecting the request, activating a second set of sensors of the AR device corresponding to the high-power mode; and operating, using the high-power processor, the high-power application in the high-power mode based on the second set of sensor data.
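
A toy state machine capturing the described split: applications run on the low-power path with a minimal sensor set until a high-power request arrives, which activates the additional sensors and hands off to the high-power mode. The sensor names and API are invented for illustration.

    # Dual-mode power management: low-power default, high-power on demand.

    LOW_POWER_SENSORS = {"imu"}
    HIGH_POWER_SENSORS = {"imu", "cameras", "depth"}

    class ARDevice:
        def __init__(self):
            self.active_sensors = set(LOW_POWER_SENSORS)
            self.mode = "low"      # which processor/mode currently runs apps

        def request_app(self, app: str, needs_high_power: bool):
            if needs_high_power and self.mode == "low":
                self.active_sensors |= HIGH_POWER_SENSORS   # activate extra sensors
                self.mode = "high"                          # hand off to big core
            print(f"{app}: mode={self.mode}, sensors={sorted(self.active_sensors)}")

    dev = ARDevice()
    dev.request_app("step_counter", needs_high_power=False)
    dev.request_app("world_tracking_ar", needs_high_power=True)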

IPC Classes  ?

  • G06F 1/3293 - Power saving characterised by the action undertaken by switching to a less power-consuming processor, e.g. sub-CPU
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements
  • G02B 27/01 - Head-up displays
  • G06F 1/3231 - Monitoring the presence, absence or movement of users
  • G06F 1/3234 - Power saving characterised by the action undertaken
  • G06F 1/3287 - Power saving characterised by the action undertaken by switching off individual functional units in the computer system
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors