Snap Inc.

United States of America

1-100 of 4,570 results for Snap Inc. and 4 subsidiaries
Aggregations
IP Type
        Patent 4,214
        Trademark 356
Jurisdiction
        United States 3,452
        World 915
        Canada 111
        Europe 92
Owner / Subsidiary
[Owner] Snap Inc. 4,532
Snapchat, Inc. 35
Bitstrips Inc. 1
Flite, Inc. 1
Verbify Inc. 1
Date
New (last 4 weeks) 174
2024 March (MTD) 98
2024 February 73
2024 January 71
2023 December 105
IPC Class
G06T 19/00 - Manipulating 3D models or images for computer graphics 603
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 474
G02B 27/01 - Head-up displays 438
H04L 12/58 - Message switching systems 402
G06F 3/0482 - Interaction with lists of selectable items, e.g. menus 346
NICE Class
09 - Scientific and electric apparatus and instruments 242
42 - Scientific, technological and industrial services, research and design 136
41 - Education, entertainment, sporting and cultural services 131
35 - Advertising and business services 101
38 - Telecommunications services 59
Status
Pending 1,084
Registered / In Force 3,486

1.

REAL-TIME UPPER-BODY GARMENT EXCHANGE

Application Number 18525285
Status Pending
Filing Date 2023-11-30
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Assouline, Avihay
  • Berger, Itamar
  • Malbin, Nir
  • Sasson, Gal

Abstract

Methods and systems are disclosed for performing operations for transferring garments from one real-world object to another in real time. The operations comprise receiving a first video that includes a depiction of a first person wearing a first upper-body garment in a first pose and obtaining a second video that includes a depiction of a second person wearing a second upper-body garment in a second pose. A pose of the second person depicted in the second video is modified to match the first pose of the first person depicted in the first video. The operations comprise generating an upper-body segmentation of the second upper-body garment which the second person is wearing in the second video in the modified pose and replacing the first upper-body garment worn by the first person in the first video with the second upper-body garment based on the upper-body segmentation.
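
The claimed pipeline reduces to four steps: estimate the target person's pose, re-pose the source person to match it, segment the source garment, and composite. A minimal Python sketch of the last two steps (toy placeholder functions, not Snap's models; assume the source frame has already been re-posed):

    import numpy as np

    def segment_upper_body(frame):
        # Placeholder segmentation: treat the top half of the frame as the
        # upper-body garment region (a real system uses a learned model).
        mask = np.zeros(frame.shape[:2], dtype=bool)
        mask[: frame.shape[0] // 2, :] = True
        return mask

    def composite(target, source, mask):
        # Replace masked pixels of the target frame with source garment pixels.
        out = target.copy()
        out[mask] = source[mask]
        return out

    target = np.zeros((256, 256, 3), dtype=np.uint8)      # person to re-dress
    source = np.full((256, 256, 3), 200, dtype=np.uint8)  # re-posed source frame
    result = composite(target, source, segment_upper_body(source))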

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/215 - Motion-based segmentation
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

2.

DEVICE AND METHOD FOR COMPENSATING EFFECTS OF PANTOSCOPIC TILT OR WRAP/SWEEP TILT ON AN IMAGE PRESENTED ON AN AUGMENTED REALITY OR VIRTUAL REALITY DISPLAY

Application Number 18263837
Status Pending
Filing Date 2021-12-07
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Valera, Mohmed Salim
  • Poussin, David Louis Maxime

Abstract

An optical device is disclosed for use in an augmented reality or virtual reality display, comprising a waveguide (12; 22; 32) and an input diffractive optical element (H0; H3; 34) positioned in or on the waveguide, configured to receive light from a projector and couple it into the waveguide so that it is captured within the waveguide under total internal reflection. The input diffractive optical element has an input grating vector (G0; Gig) in the plane of the waveguide. The device includes a first diffractive optical element (H1; H4) and a second diffractive optical element (H2; H5) having first and second grating vectors (G2, G3; GV1, GV2) respectively in the plane of the waveguide, wherein the first diffractive optical element is configured to receive light from the input diffractive optical element and to couple it towards the second diffractive optical element, and wherein the second diffractive optical element is configured to receive light from the first diffractive optical element and to couple it out of the waveguide towards a viewer. The input grating vector, the first grating vector and the second grating vector have different respective magnitudes, and a vector addition of the input grating vector, the first grating vector and the second grating vector sums to zero.
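
The closing limitation is geometric: three in-plane grating vectors with different magnitudes whose vector sum is zero, so light leaves the waveguide travelling as it entered. A quick numerical check (illustrative values, not taken from the specification):

    import numpy as np

    G0 = np.array([1.0, 0.0])    # input grating vector
    G1 = np.array([-0.25, 0.8])  # first grating vector
    G2 = -(G0 + G1)              # second grating vector: forces closure

    assert np.allclose(G0 + G1 + G2, 0.0)                      # sums to zero
    mags = [round(float(np.linalg.norm(g)), 6) for g in (G0, G1, G2)]
    assert len(set(mags)) == 3                                 # magnitudes all differ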

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/42 - Diffraction optics

3.

CONFIGURING A 3D MODEL WITHIN A VIRTUAL CONFERENCING SYSTEM

Application Number 17948480
Status Pending
Filing Date 2022-09-20
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Chou, William
  • Lin, Andrew Cheng-Min

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for configuring a three-dimensional (3D) model within a virtual conferencing system. The program and method provide, in association with designing a room for virtual conferencing, an interface for configuring a 3D model; receiving, via the interface, an indication of user input for setting properties for the 3D model, the properties specifying image data for projecting onto the 3D model; and in association with virtual conferencing, providing display of the room based on the properties for the 3D model, and causing the image data to be projected onto the 3D model within the room.

IPC Classes

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

4.

MEDIA CONTENT PLAYBACK AND COMMENTS MANAGEMENT

Application Number 18521428
Status Pending
Filing Date 2023-11-28
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Boyd, Nathan Kenneth
  • Voss, Jeremy Baker

Abstract

A method and a system include receiving a request from a client device to view a media content item, determining at least one comment associated with a respective user profile from a set of connected profiles, generating a summary comments selectable item based at least in part on the respective user profile, causing a display of playback of the media content item and the summary comments selectable item in response to the request to view the media content item, and, during the playback of the media content item at a particular time, causing a display of the at least one comment.

IPC Classes

  • H04L 51/52 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • H04L 67/306 - User profiles

5.

CIRCUITS AND METHODS FOR WEARABLE DEVICE CHARGING AND WIRED CONTROL

Application Number 18520094
Status Pending
Filing Date 2023-11-27
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Tham, Yu Jiang
  • Larson, Nicolas
  • Brook, Peter
  • Patton, Russell Douglas
  • Alhaideri, Miran
  • Hong, Zhihao

Abstract

Methods and devices for wired charging and communication with a wearable device are described. In one embodiment, a symmetrical contact interface comprises a first contact pad and a second contact pad, and particular wired circuitry is coupled to the first and second contact pads to enable charging as well as receive and transmit communications via the contact pads as part of various device states.

IPC Classes

  • H02J 7/00 - Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
  • G02C 5/14 - Side-members
  • G02C 11/00 - Non-optical adjuncts; Attachment thereof
  • H01L 27/02 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including integrated passive circuit elements with at least one potential-jump barrier or surface barrier
  • H01R 13/62 - Means for facilitating engagement or disengagement of coupling parts or for holding them in engagement
  • H02J 7/04 - Regulation of the charging current or voltage
  • H02J 7/34 - Parallel operation in networks using both storage and other dc sources, e.g. providing buffering
  • H03K 19/0185 - Coupling arrangements; Interface arrangements using field-effect transistors only
  • H04B 3/56 - Circuits for coupling, blocking, or by-passing of signals

6.

SELECTING ITEMS DISPLAYED BY A HEAD-WORN DISPLAY DEVICE

Application Number 18523197
Status Pending
Filing Date 2023-11-29
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Stolzenberg, Karen
  • Meisenholder, David
  • Vignau, Mathieu Emmanuel
  • Park, Sana
  • Sun, Tianyi
  • Fortier, Joseph Timothy
  • Anvaripour, Kaveh
  • Moreno, Daniel
  • Goodrich, Kyle

Abstract

Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.

IPC Classes

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G02B 27/01 - Head-up displays
  • G06F 3/0485 - Scrolling or panning
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

7.

AUGMENTING IMAGE CONTENT WITH SOUND

Application Number 18519735
Status Pending
Filing Date 2023-11-27
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Boyd, Nathan Kenneth
  • Brody, Jonathan Dale
  • Cooper, Andrew Grosvenor
  • Francis, Brandon
  • Heikkinen, Christie Marie
  • Lankage, Ranidu

Abstract

Aspects of the present disclosure involve a system and a method for performing operations comprising: receiving, by a messaging application implemented on a client device, input that selects a sound option to add sound to one or more images; in response to receiving the input, presenting a sound editing user interface element that visually indicates a played portion of the sound and separately visually indicates an un-played portion of the sound; receiving an interaction with the sound editing user interface element to modify a start point of the sound; embedding a graphical element representing the sound in the one or more images; playing, by the messaging application, the sound associated with the graphical element starting from the start point together with displaying the one or more images.

IPC Classes

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

8.

VISUAL AND AUDIO WAKE COMMANDS

Application Number 18367278
Status Pending
Filing Date 2023-09-12
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Colascione, Daniel
  • Hanover, Matthew
  • Korolev, Sergei
  • Marr, Michael David
  • Myers, Scott
  • Powderly, James

Abstract

A gesture-based wake process for an AR system is described herein. The AR system places a hand-tracking input pipeline of the AR system in a suspended mode. A camera component of the hand-tracking input pipeline detects a possible visual wake command being made by a user of the AR system. On the basis of detecting the possible visual wake command, the AR system wakes the hand-tracking input pipeline and places the camera component in a fully operational mode. If the AR system, using the hand-tracking input pipeline, verifies the possible visual wake command as an actual wake command, the AR system initiates execution of an AR application.
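
The described flow is a two-stage detector: a cheap camera-side check while the hand-tracking pipeline is suspended, then full verification once it is awake. A minimal sketch; the detector and verifier callables stand in for the AR system's components (assumptions, not the actual implementation):

    from enum import Enum, auto

    class PipelineMode(Enum):
        SUSPENDED = auto()
        FULL = auto()

    def wake_loop(detect_candidate, verify_command, launch_app):
        mode = PipelineMode.SUSPENDED
        while True:
            if mode is PipelineMode.SUSPENDED:
                if detect_candidate():        # cheap, low-power check
                    mode = PipelineMode.FULL  # wake the hand-tracking pipeline
            elif verify_command():            # full verification while awake
                launch_app()                  # actual wake command: start the AR app
                return
            else:
                mode = PipelineMode.SUSPENDED # false positive: suspend again

    wake_loop(lambda: True, lambda: True, lambda: print("AR application started"))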

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

9.

GEO-FENCE AUTHORIZATION PROVISIONING

Application Number 18521752
Status Pending
Filing Date 2023-11-28
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Allen, Nicholas Richard
  • Chang, Sheldon

Abstract

A system includes a communication module that receives a request to post content to an event gallery associated with an event. The request in turn includes geo-location data for a device sending the content, and identification data identifying the device or a user of the device. The system further has an event gallery module to perform a first authorization operation that includes determining that the geo-location data corresponds to a geo-location fence associated with an event. The event gallery module also performs a second authorization operation that includes using the identification data to verify an attribute of the user. Finally, based on the first and second authorization operations, the event gallery module may selectively authorize the device to post the content to the event gallery.
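
The two authorization operations compose as a geofence containment test plus a user-attribute check. A hedged sketch (the equirectangular distance approximation, field names, and "attendee" attribute are illustrative assumptions):

    import math

    def inside_geofence(lat, lon, fence_lat, fence_lon, radius_m):
        # First authorization: does the device's geo-location fall inside the
        # event's geo-location fence? The flat-earth approximation is adequate
        # at event scale.
        dx = math.radians(lon - fence_lon) * math.cos(math.radians(fence_lat))
        dy = math.radians(lat - fence_lat)
        return 6371000.0 * math.hypot(dx, dy) <= radius_m

    def authorize_post(geo, fence, user, required_attribute="attendee"):
        # Second authorization: verify an attribute of the user, then require
        # both checks to pass before the post is accepted.
        in_fence = inside_geofence(geo["lat"], geo["lon"],
                                   fence["lat"], fence["lon"], fence["radius_m"])
        return in_fence and user.get("attribute") == required_attribute

    fence = {"lat": 40.7580, "lon": -73.9855, "radius_m": 300}
    assert authorize_post({"lat": 40.7585, "lon": -73.9850},
                          fence, {"attribute": "attendee"})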

IPC Classes

  • H04L 9/40 - Network security protocols
  • H04L 51/222 - Monitoring or handling of messages using geographical location information, e.g. messages transmitted or received in proximity of a certain spot or area
  • H04L 51/52 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
  • H04W 4/02 - Services making use of location information
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
  • H04W 4/029 - Location-based management or tracking services
  • H04W 4/18 - Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals
  • H04W 12/06 - Authentication
  • H04W 12/64 - Location-dependent; Proximity-dependent using geofenced areas

10.

USER INTERFACE FOR POSE DRIVEN VIRTUAL EFFECTS

Application Number 18520255
Status Pending
Filing Date 2023-11-27
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Alavi, Amir
  • Rykhliuk, Olha
  • Shi, Xintong
  • Solichin, Jonathan
  • Voronova, Olesia
  • Yagodin, Artem

Abstract

Systems and methods are described herein for capturing a video in real time with an image capture device. The system provides a plurality of visual pose hints, identifies first pose information in the video while capturing the video, applies a first series of virtual effects to the video, identifies second pose information, and applies a second series of virtual effects to the video, the second series of virtual effects based on the first series of virtual effects.

IPC Classes

  • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
  • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders

11.

DEFORMING REAL-WORLD OBJECT USING IMAGE WARPING

Application Number US2023032181
Publication Number 2024/058966
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Guler, Riza Alp
  • Tam, Himmy
  • Wang, Haoyang
  • Kakolyris, Antonios

Abstract

Methods and systems are disclosed for performing real-time deforming operations. The system receives an image that includes a depiction of a real-world object. The system applies a machine learning model to the image to generate a warping field and segmentation mask, the machine learning model trained to establish a relationship between a plurality of training images depicting real-world objects and corresponding ground-truth warping fields and segmentation masks associated with a target shape. The system applies the generated warping field and segmentation mask to the image to warp the real-world object depicted in the image to the target shape.
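
Applying the warping field and segmentation mask amounts to resampling source pixels at the coordinates the field specifies, limited to the masked object. A toy sketch of that final step (nearest-neighbor resampling; the learned model that produces the field and mask is out of scope):

    import numpy as np

    def apply_warp(image, warp_field, mask):
        # warp_field[y, x] holds the (row, col) source coordinate for each
        # output pixel; the mask confines warping to the segmented object.
        rows = np.clip(warp_field[..., 0].round().astype(int), 0, image.shape[0] - 1)
        cols = np.clip(warp_field[..., 1].round().astype(int), 0, image.shape[1] - 1)
        warped = image[rows, cols]
        return np.where(mask[..., None], warped, image)

    # Sanity check: an identity field leaves the image unchanged.
    h, w = 4, 4
    img = np.arange(h * w * 3, dtype=np.uint8).reshape(h, w, 3)
    ident = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    assert (apply_warp(img, ident.astype(float), np.ones((h, w), bool)) == img).all()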

IPC Classes

  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06T 11/00 - 2D [Two Dimensional] image generation

12.

MULTIPATH OPTICAL DEVICE

Application Number EP2023075362
Publication Number 2024/056832
Status In Force
Filing Date 2023-09-14
Publication Date 2024-03-21
Owner
  • SNAP, INC. (USA)
  • SNAP GROUP LIMITED (United Kingdom)
Inventor
  • Crai, Alexandra
  • Webber, Alexander James Lewarne
  • Valera, Mohmed Salim

Abstract

An optical device for use in an augmented reality or virtual reality display, comprising: a waveguide; an input diffractive optical element, DOE, configured to receive light from a projector and to couple the received light into the waveguide along a plurality of optical paths; an output DOE offset from the input DOE along a first direction and configured to couple the received light out of the waveguide and towards a viewer; a first turning DOE offset from the input DOE along a second direction different from the first direction; wherein the input DOE is configured to couple a first portion of the received light in the second direction towards the first turning DOE and the first turning DOE is configured to diffract the first portion of the received light towards the output DOE, and the input DOE is configured to couple a second portion of the received light in the first direction towards the output DOE.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00-G02B 26/00, G02B 30/00
  • G02B 5/18 - Diffracting gratings

13.

EYEWEAR WITH STRAIN GAUGE WEAR DETECTION

Application Number US2023029066
Publication Number 2024/058870
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Heger, Jason
  • Kalkgruber, Matthias
  • Mendez, Erick Mendez

Abstract

An eyewear device including a strain gauge sensor to determine when the eyewear device is manipulated by a user, such as being put on, taken off, and interacted with. A processor identifies a signature event based on sensor signals received from the strain gauge sensor and a data table of strain gauge sensor measurements corresponding to signature events. The processor controls the eyewear device as a function of the identified signature event, such as powering on a display of the eyewear device as the eyewear device is being put on a user's head, and then turning off the display when the eyewear device is removed from the user's head.
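
The signature-event logic is a lookup of strain readings against a stored table. A hedged sketch (the thresholds and event names are invented; the patent specifies only that a data table of measurements corresponds to signature events):

    SIGNATURE_TABLE = {
        "put_on":   (120.0, 400.0),    # strain range as the temples flex outward
        "take_off": (-400.0, -120.0),  # strain range as they relax back
    }

    def classify(strain_sample):
        for event, (lo, hi) in SIGNATURE_TABLE.items():
            if lo <= strain_sample <= hi:
                return event
        return None

    def on_sample(strain_sample, display):
        # Control the eyewear as a function of the identified signature event.
        event = classify(strain_sample)
        if event == "put_on":
            display.power_on()
        elif event == "take_off":
            display.power_off()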

IPC Classes

  • G02C 11/00 - Non-optical adjuncts; Attachment thereof

14.

WATERPROOF UAV FOR CAPTURING IMAGES

Application Number US2023029071
Publication Number 2024/058872
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Zhang, Dawei

Abstract

A waterproof UAV that records camera footage while traveling through air and while submerged in water. The UAV alters speed and direction of propellers dependent on the medium that the UAV is traveling through to provide control of the UAV. The propellers are capable of spinning in both directions to enable the UAV to change its depth and orientation in water. A machine learning (ML) model is used to identify humans and objects underwater. A housing coupled to the UAV makes the UAV positively buoyant to float in water and to control buoyancy while submerged.

IPC Classes

  • B64U 10/14 - Flying platforms with four distinct rotor axes, e.g. quadcopters
  • B64U 30/26 - Ducted or shrouded rotors
  • B64U 20/70 - Constructional aspects of the UAV body
  • B64U 60/10 - Undercarriages specially adapted for use on water
  • B64U 20/87 - Mounting of imaging devices, e.g. mounting of gimbals
  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
  • G06N 20/00 - Machine learning
  • B64U 101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography

15.

PUSH NOTIFICATION MANAGEMENT

Application Number 18525658
Status Pending
Filing Date 2023-11-30
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Castro, Alex Joseph
  • Murray, Michael Brian
  • Wu, William

Abstract

A push notification mechanism at a mobile user device provides for automated limiting of the rate of production of push notification alerts (such as an audible alert or a vibratory alert) and/or push notifications responsive to the occurrence of chat events relevant to a chat application hosted by the user device. Some chat events automatically trigger suppression periods during which push notification alerts are prevented for subsequent chat events that satisfy predefined suppression criteria. Such push notification and/or alert limiting can be performed separately for separate users, chat groups, and/or chat event types.
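
The suppression mechanism behaves like a keyed rate limiter: the first alert for a (user, chat group, event type) key opens a window during which matching chat events stay silent. A minimal sketch (the 30-second window is an invented example, not a value from the filing):

    import time

    SUPPRESS_SECONDS = 30.0
    _last_alert = {}

    def should_alert(user_id, chat_id, event_type, now=None):
        # True if a push notification alert may be produced for this chat
        # event; False if it falls inside an active suppression period.
        now = time.monotonic() if now is None else now
        key = (user_id, chat_id, event_type)
        last = _last_alert.get(key)
        if last is not None and now - last < SUPPRESS_SECONDS:
            return False
        _last_alert[key] = now            # alert, and start a new period
        return True

    assert should_alert("u1", "c1", "message", now=0.0)       # first event alerts
    assert not should_alert("u1", "c1", "message", now=10.0)  # suppressed
    assert should_alert("u1", "c1", "reaction", now=10.0)     # separate event type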

IPC Classes

  • H04L 67/55 - Push-based network services
  • H04L 51/04 - Real-time or near real-time messaging, e.g. instant messaging [IM]
  • H04L 51/224 - Monitoring or handling of messages providing notification on incoming messages, e.g. pushed notifications of received messages
  • H04L 51/52 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services

16.

MEDIA GALLERY SHARING AND MANAGEMENT

Application Number 18520365
Status Pending
Filing Date 2023-11-27
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Kennedy, David James
  • Muñoz Escalante, Diego
  • Spool, Arianne
  • Xia, Yinghua David

Abstract

Various embodiments include systems, methods, and non-transitory computer-readable media for sharing and managing media galleries. Consistent with these embodiments, a method includes receiving a request from a first device to share a media gallery that includes a user avatar; generating metadata associated with the media gallery; generating a message associated with the media gallery, the message at least including a media gallery identifier and an identifier of the user avatar; and transmitting the message to a second device of a recipient user.

IPC Classes

  • H04L 51/10 - Multimedia information
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • H04L 67/146 - Markers for unambiguous identification of a particular session, e.g. session cookie or URL-encoding

17.

GRAPHICAL ASSISTANCE WITH TASKS USING AN AR WEARABLE DEVICE

Application Number 17947889
Status Pending
Filing Date 2022-09-19
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

Systems, methods, and computer readable media for graphical assistance with tasks using an augmented reality (AR) wearable device are disclosed. Embodiments capture an image of a first user view of a real-world scene and access indications of surfaces and locations of the surfaces detected in the image. The AR wearable device displays indications of the surfaces on a display of the AR wearable device where the locations of the indications are based on the locations of the surfaces and a second user view of the real-world scene. The locations of the surfaces are indicated with 3D world coordinates. The user views are determined based on a location of the user. The AR wearable device enables a user to add graphics to the surfaces and select tasks to perform. Tools such as a bubble level or a measuring tool are available for the user to utilize to perform the task.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

18.

VIDEO GENERATION SYSTEM TO RENDER FRAMES ON DEMAND USING A FLEET OF GPUS

Application Number 18520203
Status Pending
Filing Date 2023-11-27
First Publication Date 2024-03-21
Owner Snap Inc. (USA)
Inventor
  • Kotsopoulos, Bradley
  • Semory, Eli
  • Sheth, Rahul Bhupendra

Abstract

A content controller system to render frames on demand comprises a rendering server system that includes a plurality of graphics processing units (GPUs). The GPUs in the rendering server system render a set of media content item segments using a media content identification and a main user identification. Rendering the set of media content item segments includes retrieving metadata from a metadata database associated with the media content identification, rendering the set of media content item segments using the metadata, generating a main user avatar based on the main user identification, and incorporating the main user avatar into the set of media content item segments. The rendering server system then uploads the set of media content item segments to a segment database and updates segment states in a segment state database to indicate that the set of media content item segments are available. Other embodiments are disclosed herein.
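
The bookkeeping around rendering can be pictured as: render each segment, upload it, then flip its state so clients know it is available. A hedged sketch with in-memory dicts standing in for the segment and segment state databases (all names are illustrative):

    segment_db = {}
    segment_states = {}

    def render_segment(content_id, user_id, index):
        # Stand-in for GPU rendering that incorporates the main user's avatar.
        return f"{content_id}:{user_id}:segment-{index}".encode()

    def render_on_demand(content_id, user_id, n_segments):
        for i in range(n_segments):
            key = f"{content_id}/{i}"
            segment_states[key] = "rendering"
            segment_db[key] = render_segment(content_id, user_id, i)
            segment_states[key] = "available"  # clients may now fetch the segment

    render_on_demand("media-123", "user-456", n_segments=3)
    assert segment_states["media-123/0"] == "available"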

IPC Classes

  • H04N 21/262 - Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission or generating play-lists
  • G06F 16/23 - Updating
  • G06F 16/43 - Querying
  • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
  • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs
  • H04N 21/235 - Processing of additional data, e.g. scrambling of additional data or processing content descriptors
  • H04N 21/239 - Interfacing the upstream path of the transmission network, e.g. prioritizing client requests
  • H04N 21/258 - Client or end-user data management, e.g. managing client capabilities, user preferences or demographics or processing of multiple end-users preferences to derive collaborative data
  • H04N 21/84 - Generation or processing of descriptive data, e.g. content descriptors

19.

THREE-DIMENSIONAL ASSET RECONSTRUCTION

Application Number US2023029068
Publication Number 2024/058871
Status In Force
Filing Date 2023-07-31
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Vasilkovskii, Mikhail
  • Demyanov, Sergey
  • Shakhrai, Vladislav

Abstract

A three-dimensional (3D) asset reconstruction technique for generating a 3D asset representing an object from images of the object. The images are captured from different viewpoints in a darkroom using one or more light sources having known locations. The system estimates camera poses for each of the captured images and then constructs a 3D surface mesh made up of surfaces using the captured images and their respective estimated camera poses. Texture properties for each of the surfaces of the 3D surface mesh are then refined to generate the 3D asset.

IPC Classes

20.

EGOCENTRIC HUMAN BODY POSE TRACKING

Application Number US2023032755
Publication Number 2024/059206
Status In Force
Filing Date 2023-09-14
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Arakawa, Riku
  • Krishnan Gorumkonda, Gurunandan
  • Nayar, Shree K.
  • Zhou, Bing

Abstract

A pose tracking system is provided. The pose tracking system includes an EMF tracking system having a user-worn head-mounted EMF source and one or more user-worn EMF tracking sensors attached to the wrists of the user. The EMF source is associated with a VIO tracking system such as AR glasses or the like. The pose tracking system determines a pose of the user's head and a ground plane using the VIO tracking system and a pose of the user's hands using the EMF tracking system to determine a full-body pose for the user. Metal interference with the EMF tracking system is minimized using an IMU mounted with the EMF tracking sensors. Long-term drift in the IMU and the VIO tracking system is minimized using the EMF tracking system.
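
The fusion step is a chain of rigid transforms: the EMF sensors report wrist poses relative to the head-mounted source, so composing them with the VIO head pose yields world-frame hand poses. A minimal sketch (illustrative offsets; drift and metal-interference correction omitted):

    import numpy as np

    def se3(rotation, translation):
        # Build a 4x4 rigid transform from a 3x3 rotation and a translation.
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    def fuse_full_body(world_T_head, head_T_left_wrist, head_T_right_wrist):
        return {
            "head": world_T_head,
            "left_wrist": world_T_head @ head_T_left_wrist,
            "right_wrist": world_T_head @ head_T_right_wrist,
        }

    head = se3(np.eye(3), [0.0, 0.0, 1.7])  # head 1.7 m above the ground plane
    pose = fuse_full_body(head,
                          se3(np.eye(3), [-0.3, -0.4, -0.5]),  # left wrist offset
                          se3(np.eye(3), [0.3, -0.4, -0.5]))   # right wrist offset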

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

21.

FINGER GESTURE RECOGNITION VIA ACOUSTIC-OPTIC SENSOR FUSION

Application Number US2023032717
Publication Number 2024/059182
Status In Force
Filing Date 2023-09-14
Publication Date 2024-03-21
Owner SNAP INC. (USA)
Inventor
  • Krishnan Gorumkonda, Gurunandan
  • Nayar, Shree K.
  • Xu, Chenhan
  • Zhou, Bing

Abstract

A finger gesture recognition system is provided. The finger gesture recognition system includes one or more audio sensors and one or more optic sensors. The finger gesture recognition system captures, using the one or more audio sensors, audio signal data of a finger gesture being made by a user, and captures, using the one or more optic sensors, optic signal data of the finger gesture. The finger gesture recognition system recognizes the finger gesture based on the audio signal data and the optic signal data and communicates finger gesture data of the recognized finger gesture to an Augmented Reality/Combined Reality/Virtual Reality (XR) application.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

22.

Eyewear tether

Application Number 18141661
Grant Number 11934038
Status In Force
Filing Date 2023-05-01
First Publication Date 2024-03-19
Grant Date 2024-03-19
Owner SNAP INC. (USA)
Inventor
  • Ben-Haim, Yoav
  • Sehrawat, Varun
  • Dabov, Teodor
  • Ardisana, John Bernard

Abstract

Eyewear devices including a tether and methods for identifying proper installation of the tether are disclosed. An eyewear device includes transmission lines extending through the temples to electrical and electronic components positioned adjacent to edges of a frame. A tether is attached to the temples to enable power and communication flow between the electrical and electronic components rather than through the frame. Proper installation is identified based on communications passing between the electrical and electronic components via the tether.

IPC Classes

  • G02C 11/00 - Non-optical adjuncts; Attachment thereof
  • G02C 3/00 - Special supporting arrangement for lens assemblies or monocles
  • G02C 5/00 - Constructions of non-optical parts
  • G02C 5/02 - Bridges; Browbars; Intermediate bars
  • G02C 5/14 - Side-members

23.

Controlling brightness based on eye tracking

Application Number 18051330
Grant Number 11935442
Status In Force
Filing Date 2022-10-31
First Publication Date 2024-03-19
Grant Date 2024-03-19
Owner SNAP INC. (USA)
Inventor Patton, Russell Douglas

Abstract

Methods and systems are disclosed for performing operations for controlling brightness in an AR device. The operations comprise displaying an image on an eyewear device worn by a user; detecting a gaze direction of a pupil of the user; identifying a first region of the image that corresponds to the gaze direction of the pupil; and modifying a brightness level or value of pixels in the image based on the gaze direction such that pixels in the first region of the image are set to a first brightness value and pixels in a second region of the image are set to a second brightness value that is lower than the first brightness value.
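
In effect this is foveated dimming: pixels inside the gazed-at region keep full brightness while the rest are scaled down. A small sketch (the radius and dim factor are invented values):

    import numpy as np

    def apply_gaze_brightness(image, gaze_xy, radius=64, dim=0.4):
        # First region: pixels within `radius` of the gaze point keep full
        # brightness; second region: everything else is dimmed.
        h, w = image.shape[:2]
        ys, xs = np.ogrid[:h, :w]
        in_gaze = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= radius ** 2
        out = image.astype(np.float32)
        out[~in_gaze] *= dim
        return out.astype(image.dtype)

    frame = np.full((480, 640, 3), 255, dtype=np.uint8)
    dimmed = apply_gaze_brightness(frame, gaze_xy=(320, 240))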

IPC Classes

  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

24.

SCULPTING AUGMENTED REALITY CONTENT USING GESTURES IN A MESSAGING SYSTEM

Application Number 17930927
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Kaminski, Kurt
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects, from a set of frames, a first gesture, the first gesture corresponding to a pinch gesture. The subject technology detects a first location and a first position of a first representation of a first finger from the first gesture and a second location and a second position of a second representation of a second finger from the first gesture. The subject technology detects a first collision event corresponding to a first collider and a second collider intersecting with a third collider of a first virtual object. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology modifies the first virtual object to include additional augmented reality content based at least in part on the first change and the second change.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/20 - Analysis of motion
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • H04L 51/046 - Interoperability with other network applications or services

25.

CLUSTERING VIDEOS USING A SELF-SUPERVISED DNN

Application Number 17939256
Status Pending
Filing Date 2022-09-07
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Coskun, Huseyin
  • Zareian, Alireza
  • Moore, Joshua
  • Wang, Chen

Abstract

Systems and methods are provided for clustering videos. The system accesses a plurality of content items, the plurality of content items comprising a first set of RGB video frames and a second set of optical flow frames corresponding to the first set of RGB video frames. The system processes the first set of RGB video frames by a first machine learning model to generate a first optimal assignment for the first set of RGB video frames, the first optimal assignment representing initial clustering of the first set of RGB video frames. The system generates an updated first optimal assignment for the first set of RGB video frames based on the first optimal assignment for the first set of RGB video frames and a second optimal assignment of the second set of optical flow frames, the second optimal assignment representing initial clustering of the second set of optical flow frames.

IPC Classes

  • G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

26.

VIRTUAL OBJECT MANIPULATION WITH GESTURES IN A MESSAGING SYSTEM

Application Number 17941435
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects a first gesture and a second gesture, each gesture corresponding to an open trigger finger gesture. The subject technology detects a third gesture and a fourth gesture, each gesture corresponding to a closed trigger finger gesture. The subject technology selects a first virtual object in a first scene. The subject technology detects a first location and a first position of a first representation of a first finger from the third gesture and a second location and a second position of a second representation of a second finger from the fourth gesture. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology modifies a set of dimensions of the first virtual object to a different set of dimensions.

IPC Classes

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

27.

GESTURES TO ENABLE MENUS USING AUGMENTED REALITY CONTENT IN A MESSAGING SYSTEM

Application Number 17941522
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects a first location and a first position of a first representation of a first finger and a second location and a second position of a second representation of a second finger. The subject technology detects a first particular location and a first particular position of a first particular representation of a first particular finger and a second particular location and a second particular position of a second particular representation of a second particular finger. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology detects a first particular change in the first particular location and the first particular position and a second particular change in the second particular location and the second particular position. The subject technology generates a set of virtual objects.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/20 - Analysis of motion
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • H04L 51/046 - Interoperability with other network applications or services

28.

DEFORMING REAL-WORLD OBJECT USING IMAGE WARPING

Application Number 17973295
Status Pending
Filing Date 2022-10-25
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Guler, Riza Alp
  • Tam, Himmy
  • Wang, Haoyang
  • Kakolyris, Antonios

Abstract

Methods and systems are disclosed for performing real-time deforming operations. The system receives an image that includes a depiction of a real-world object. The system applies a machine learning model to the image to generate a warping field and segmentation mask, the machine learning model trained to establish a relationship between a plurality of training images depicting real-world objects and corresponding ground-truth warping fields and segmentation masks associated with a target shape. The system applies the generated warping field and segmentation mask to the image to warp the real-world object depicted in the image to the target shape.

IPC Classes

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 7/10 - Segmentation; Edge detection
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

29.

MUTABLE GEO-FENCING SYSTEM

Application Number 18508771
Status Pending
Filing Date 2023-11-14
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Azmoodeh, Farnaz
  • Sellis, Peter
  • Yang, Jinlin

Abstract

In various embodiments, boundaries of geo-fences can be made mutable based on principles described herein. The term “mutable” refers to the ability of a thing (in this case, the boundary of a geo-fence) to change and adjust. In a typical embodiment, a mutable geo-fence system is configured to generate and monitor a geo-fence that encompasses a region, in order to dynamically vary the boundary of the geo-fence based on a number of boundary variables. The term “geo-fence” as used herein describes a virtual perimeter (e.g., a boundary) for a real-world geographic area. A geo-fence could be a radius around a point (e.g., a store), or a set of predefined boundaries. The term “boundary variables” as used herein refers to a set of variables utilized by the mutable geo-fence system in determining a location of the boundary of the geo-fence.
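
A mutable boundary can be as simple as a radius recomputed from the boundary variables. A toy sketch (the variables, weights, and clamps below are invented for illustration):

    def geofence_radius_m(base_radius_m, crowd_density, time_of_day_factor):
        # Grow the fence when the venue is busy; shrink it off-peak.
        radius = base_radius_m * (1.0 + 0.5 * crowd_density) * time_of_day_factor
        return max(50.0, min(radius, 5000.0))  # clamp to sane bounds

    # A 500 m fence at peak crowd density during the day grows to 750 m:
    assert geofence_radius_m(500.0, crowd_density=1.0, time_of_day_factor=1.0) == 750.0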

IPC Classes

  • G06Q 30/0251 - Targeted advertisements
  • G06Q 30/0272 - Period of advertisement exposure
  • H04M 15/00 - Arrangements for metering, time-control or time-indication
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
  • H04W 4/24 - Accounting or billing

30.

Augmented reality guidance that generates guidance markers

Application Number 18510286
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Kang, Shin Hwun
  • Kucher, Dmytro
  • Hovorov, Dmytro
  • Canberk, Ilteris

Abstract

Augmented reality guidance for guiding a user through an environment using an eyewear device. The eyewear device includes a display system and a position detection system. A user is guided through an environment by monitoring a current position of the eyewear device within the environment, identifying marker positions within a threshold of the current position, the marker positions defined with respect to the environment and associated with guidance markers, registering the marker positions, generating an overlay image including the guidance markers, and presenting the overlay image on a display of the eyewear device.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

31.

SELECTING ADS FOR A VIDEO WITHIN A MESSAGING SYSTEM

Application Number 18514929
Status Pending
Filing Date 2023-11-20
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Blackwood, John Cain
  • Lonkar, Chinmay
  • Lue, David B.
  • Penner, Kevin Lee

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for selecting ads for a video. The program and method provide for receiving a request for an ad to insert into a video playing on a client device, the request including a first content identifier that identifies a first type of content included in the video; determining a set of content identifiers associated with the first content identifier, the set of content identifiers identifying second types of content to filter with respect to providing the ad in response to the request; selecting an ad from among plural ads, by filtering ads tagged with a second content identifier included in the set of content identifiers; and providing the selected ad as a response to the request.

IPC Classes

  • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs
  • G06F 16/245 - Query processing
  • H04N 21/4788 - Supplemental services, e.g. displaying phone caller identification or shopping application communicating with other users, e.g. chatting
  • H04N 21/81 - Monomedia components thereof
  • H04N 21/84 - Generation or processing of descriptive data, e.g. content descriptors

32.

CARRY CASE FOR RECHARGEABLE EYEWEAR DEVICES

Application Number 18515096
Status Pending
Filing Date 2023-11-20
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Kim, Jinwoo
  • Lin, Jun

Abstract

A carry case for an electronics-enabled eyewear device, such as smart glasses, has charging contacts that are movable relative to a storage chamber in which the eyewear device is receivable. The charging contacts are connected to a battery carried by the case for charging the eyewear device via contact coupling of the charging contacts to corresponding contact formations on an exterior of the eyewear device. The charging contacts are in some instances mounted on respective flexible walls defining opposite extremities of the storage chamber. The contact formations on the eyewear device are in some instances provided by hinge assemblies that couple respective temples to a frame of the eyewear device.

IPC Classes

  • A45C 11/04 - Spectacle cases; Pince-nez cases
  • H02J 7/00 - Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
  • H02J 7/34 - Parallel operation in networks using both storage and other dc sources, e.g. providing buffering

33.

DESTINATION SHARING IN LOCATION SHARING SYSTEM

Application Number 18516785
Status Pending
Filing Date 2023-11-21
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Baylin, Benoit
  • Dancie, Nicolas
  • Martin, Antoine
  • Martin, Julien
  • Sinton, Antoine
  • Uzan, Steven

Abstract

Methods, systems, and devices are described for predicting a destination of a user and sharing the presumed destination with other users via a geographically-based graphical user interface. Consistent with some embodiments, an electronic communication containing location information is received from a location sensor coupled to a first client device. A current trajectory of the first user is determined based on the location information. A presumed destination of the first user is determined by correlating the current trajectory of the first user with historical location information of the first user. A map depicting an icon associated with the presumed destination of the first user is displayed on a display screen of a second client device of a second user.

IPC Classes

  • H04W 4/029 - Location-based management or tracking services
  • G01C 21/34 - Route searching; Route guidance
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • H04W 4/024 - Guidance services
  • H04W 8/18 - Processing of user or subscriber data, e.g. subscribed services, user preferences or user profiles; Transfer of user or subscriber data
  • H04W 12/63 - Location-dependent; Proximity-dependent
  • H04W 64/00 - Locating users or terminals for network management purposes, e.g. mobility management

34.

SELECTING AR BUTTONS ON A HAND

Application Number US2023031980
Publication Number 2024/054434
Status In Force
Filing Date 2023-09-05
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor Crispin, Sterling

Abstract

Systems and methods are provided for performing AR button selection operations on an augmented reality (AR) device. The system displays, by an AR device, a plurality of AR objects on a display region that overlaps a first real-world object, each of the plurality of AR objects being associated with an object selection region. The system computes a first spatial relationship factor for a first AR object of the plurality of AR objects based on a position of the first AR object relative to a position of a second real-world object and adjusts the object selection region of the first AR object based on the first spatial relationship factor. The system activates the first AR object in response to determining that the second real-world object overlaps the object selection region of the first AR object.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06N 20/00 - Machine learning

35.

3D CURSOR FUNCTIONALITY FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

Application Number US2023073639
Publication Number 2024/054909
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Moreno, Daniel

Abstract

The subject technology detects a location and a position of a representation of a finger. The subject technology generates a first virtual object based on the location and the position of the representation of the finger. The subject technology detects a first collision event. The subject technology in response to the first collision event, modifies a set of dimensions of the second virtual object to a second set of dimensions. The subject technology detects a second location and a second position of the representation of the finger. The subject technology detects a second collision event. The subject technology modifies a set of dimensions of the third virtual object to a third set of dimensions. The subject technology renders the third virtual object based on the third set of dimensions within a third scene, the third scene comprising a modified scene from a second scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

36.

SHOOTING INTERACTION USING AUGMENTED REALITY CONTENT IN A MESSAGING SYSTEM

Application Number US2023073647
Publication Number 2024/054915
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Moreno, Daniel

Abstract

The subject technology receives a set of frames. The subject technology detects, from the set of frames, a first gesture corresponding to an open trigger finger gesture. The subject technology receives a second set of frames. The subject technology detects, from the second set of frames, a second gesture corresponding to a closed trigger finger gesture. The subject technology detects a location and a position of a representation of a finger from the closed trigger finger gesture. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders a movement of the first virtual object along a vector away from the location and the position of the representation of the finger within a first scene. The subject technology provides for display the rendered movement of the first virtual object along the vector within the first scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

37.

VIRTUAL OBJECT MANIPULATION WITH GESTURES IN A MESSAGING SYSTEM

Application Number US2023073781
Publication Number 2024/054999
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Moreno, Daniel

Abstract

The subject technology detects a first gesture and a second gesture, each gesture corresponding to an open trigger finger gesture. The subject technology detects a third gesture and a fourth gesture, each gesture corresponding to a closed trigger finger gesture. The subject technology selects a first virtual object in a first scene. The subject technology detects a first location and a first position of a first representation of a first finger from the third gesture and a second location and a second position of a second representation of a second finger from the fourth gesture. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology modifies a set of dimensions of the first virtual object to a different set of dimensions.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

38.

ADAPTIVE ILLUMINATOR SEQUENCING

      
Application Number 18507252
Status Pending
Filing Date 2023-11-13
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor Ramanath, Rajeev

Abstract

An eyewear device is disclosed including an illumination device including illumination sources, each illumination source including a first illuminator, a second illuminator, and a third illuminator, and a spatial light modulator coupled to the illumination device to control when each of the first, second, and third illuminators are on during an illumination frame. The spatial light modulator is adapted to turn on the first illuminator while the second and third illuminators are off during a first time period of the illumination frame, turn on the second illuminator while the first and third illuminators are off during a second time period, turn on the third illuminator while the first and second illuminators are off during a third time period, and turn on the first, second, and third illuminators during a fourth time period. An illumination method is also disclosed.
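A sketch of the four-period illumination frame as read from the abstract: each illuminator fires alone in its own period, then all three fire together. The schedule table, driver callback, and dwell time are illustrative assumptions.

```python
import time

FRAME_SCHEDULE = [
    (True,  False, False),  # period 1: first illuminator only
    (False, True,  False),  # period 2: second illuminator only
    (False, False, True),   # period 3: third illuminator only
    (True,  True,  True),   # period 4: all three illuminators on
]

def run_frame(set_illuminators, dwell_s=0.004):
    """Step through one illumination frame, driving the illuminator states."""
    for states in FRAME_SCHEDULE:
        set_illuminators(*states)
        time.sleep(dwell_s)

run_frame(lambda a, b, c: print(f"ill1={a} ill2={b} ill3={c}"))
```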

IPC Classes  ?

  • H04N 9/31 - Projection devices for colour picture display

39.

GENERATING PERSONALIZED VIDEOS WITH CUSTOMIZED TEXT MESSAGES

      
Application Number 18509420
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Mashrabov, Alexander
  • Shaburov, Victor
  • Savinova, Sofia
  • Matov, Dmitriy
  • Osipov, Andrew
  • Semenov, Ivan
  • Golobokov, Roman

Abstract

Described are systems and methods for generating personalized videos with customized text messages. An example method includes receiving an input text, a video template including a sequence of frame images, and at least one parameter for animation of the input text across the sequence of frame images, generating, based on the input text and the at least one parameter for animation, a configuration file including a text style for the input text for a frame in the sequence of frame images, and rendering, based on the configuration file, an output frame of an output video, where the output frame includes the frame in the sequence of frame images and a layer, and where the layer includes the input text stylized based on the text style. The method further includes providing an option enabling a user to change the at least one parameter for animation.

IPC Classes  ?

  • G06T 13/80 - 2D animation, e.g. using sprites
  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • H04L 51/04 - Real-time or near real-time messaging, e.g. instant messaging [IM]
  • H04L 51/10 - Multimedia information

40.

CUSTOMIZING MODIFIABLE VIDEOS OF MULTIMEDIA MESSAGING APPLICATION

      
Application Number 18509589
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Voss, Jeremy
  • Shaburov, Victor
  • Babanin, Ivan
  • Mashrabov, Aleksandr
  • Golobokov, Roman

Abstract

Provided are systems and methods for customizing modifiable videos. An example method includes analyzing recent messages associated with a user in a multimedia messaging application to determine a context of the recent messages, determining, based on the context, a property of a modifiable feature, selecting, based on the context, a list of relevant modifiable videos from a database configured to store modifiable videos associated with a preset modifiable feature, replacing a property of the preset modifiable feature in relevant modifiable videos of the list of relevant modifiable videos with the property of the modifiable feature, and rendering the list of relevant modifiable videos for viewing by the user, where the rendering includes displaying the modifiable feature in the relevant modifiable videos.

IPC Classes  ?

  • H04M 1/72439 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G06T 13/80 - 2D animation, e.g. using sprites
  • H04L 51/02 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
  • H04L 51/216 - Handling conversation history, e.g. grouping of messages in sessions or threads
  • H04M 1/72427 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
  • H04M 1/72436 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for text messaging, e.g. SMS or e-mail
  • H04M 1/72442 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for playing music files

41.

AUTOMATICALLY ESTIMATING PLACE DATA ACCURACY

      
Application Number 18510195
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor Shughrue, Christopher

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and a method for performing operations comprising: receiving a plurality of records associated with a first geographical area; identifying a plurality of corrections to a first attribute in the first geographical area in the plurality of records for a particular time period; based on identifying the plurality of corrections to the first attribute, computing a first metric representing a quantity of the plurality of corrections to the first attribute per effort during the particular time period; accumulating a first value representing a total number of errors across a plurality of time periods up to and including the particular time period based on the identified plurality of corrections; and generating a first model that predicts accuracy of the first attribute in the first geographical area based on the metric and the accumulated first value.
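A toy Python rendering of the two quantities the abstract names, plus a deliberately simple stand-in for the "first model"; the linear coefficients and clamping are assumptions for illustration only.

```python
def corrections_per_effort(num_corrections, effort_hours):
    """First metric: corrections to the attribute per unit of effort."""
    return num_corrections / effort_hours if effort_hours else 0.0

def accumulate_errors(per_period_counts):
    """First value: total errors across all periods up to the current one."""
    return sum(per_period_counts)

def predict_accuracy(metric, total_errors, a=-0.05, b=-0.001, bias=1.0):
    # Hypothetical first model: predicted accuracy falls with the correction
    # rate and with accumulated errors, clamped to [0, 1].
    return max(0.0, min(1.0, bias + a * metric + b * total_errors))

metric = corrections_per_effort(12, 8)   # 12 corrections in 8 hours of effort
errors = accumulate_errors([3, 5, 4])    # errors over three time periods
print(round(predict_accuracy(metric, errors), 3))  # -> 0.913
```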

IPC Classes  ?

42.

CONTEXT-SENSITIVE REMOTE EYEWEAR CONTROLLER

      
Application Number 18510381
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Nielsen, Simon
  • Rodriguez, Jonathan
  • Tham, Yu Jiang

Abstract

Context-sensitive remote controls for use with electronic devices (e.g., eyewear device). The electronic device is configured to perform activities (e.g., email, painting, navigation, gaming). The context-sensitive remote control includes a display having a display area, a display driver coupled to the display, and a transceiver. The remote control additionally includes memory that stores controller layout configurations for display in the display area of the display by the display driver. A processor in the context-sensitive remote control is configured to establish, via the transceiver, communication with an electronic device, detect an activity currently being performed by the electronic device, select one of the controller layout configurations responsive to the detected activity, and present, via the display driver, the selected controller layout configuration in the display area of the display.
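A minimal table-lookup sketch of the layout selection the abstract describes: the detected activity keys into stored controller layout configurations, with a fallback. Activity names and layouts are hypothetical.

```python
LAYOUTS = {
    "gaming":     ["dpad", "a_button", "b_button"],
    "navigation": ["zoom_in", "zoom_out", "recenter"],
    "painting":   ["brush", "color_wheel", "undo"],
}
DEFAULT_LAYOUT = ["home", "back"]

def layout_for(detected_activity):
    """Select a stored controller layout for the detected activity."""
    return LAYOUTS.get(detected_activity, DEFAULT_LAYOUT)

print(layout_for("navigation"))  # -> ['zoom_in', 'zoom_out', 'recenter']
```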

IPC Classes  ?

  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
  • A63F 13/24 - Constructional details thereof, e.g. game controllers with detachable joystick handles
  • A63F 13/327 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections using wireless networks, e.g. Wi-Fi or piconet
  • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
  • A63F 13/92 - Video game devices specially adapted to be hand-held while playing
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

43.

FAST DATA ACCESSING SYSTEM USING OPTICAL BEACONS

      
Application Number 18514725
Status Pending
Filing Date 2023-11-20
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Wang, Jian
  • Bayer, Karl
  • Nayar, Shree K.

Abstract

An apparatus to perform fast data access comprises a receiver, a processor, and a memory. The processor receives using the receiver a light signal from a light source. The light signal can be structured to generate a temporal code. The light source is an optical beacon that includes a Light-Emitting Diode (LED). The processor then decodes the light signal to generate a network address, and causes a display of a client device coupled to the apparatus to display information based on the network address. The network address can be a Uniform Resource Locator (URL) address and the information based on the network address includes a webpage associated with the URL. Other embodiments are described herein.
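A sketch of turning a temporally coded light signal into a network address, per the abstract: threshold the sampled intensity into bits, then map the code word to a URL. The framing, codebook, and threshold are invented for illustration.

```python
CODEBOOK = {  # hypothetical temporal codes -> network addresses
    "1011": "https://example.com/menu",
    "1101": "https://example.com/map",
}

def samples_to_bits(samples, threshold=0.5):
    """Threshold raw light-intensity samples into a bit string."""
    return "".join("1" if s > threshold else "0" for s in samples)

def decode_beacon(samples):
    """Decode an LED beacon's temporal code into a URL (or None)."""
    return CODEBOOK.get(samples_to_bits(samples))

print(decode_beacon([0.9, 0.1, 0.8, 0.7]))  # -> https://example.com/menu
```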

IPC Classes  ?

  • G06K 7/14 - Methods or arrangements for sensing record carriers by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
  • G06K 7/10 - Methods or arrangements for sensing record carriers by corpuscular radiation

44.

RADIAL GESTURE NAVIGATION

      
Application Number 18514760
Status Pending
Filing Date 2023-11-20
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Blachly, Ty
  • Boyd, Nathan
  • Giovannini, Donald
  • Jayaram, Krish
  • Spiegel, Evan
  • Wu, William

Abstract

Systems and methods for radial gesture navigation are provided. In example embodiments, user input data is received from a user device. The user input data indicates a continuous physical user interaction associated with a display screen of the user device. An initial point and a current point are detected from the user input data. A radius distance for a circle that includes the current point and is centered about the initial point is determined. An action is selected from among multiple actions based on the radius distance being within a particular range among successive ranges along a straight line that starts at the initial point and extends through the circle. Each range among the successive ranges corresponds to a particular action among the multiple actions. The selected action is performed in response to detecting a completion of the continuous physical user interaction.
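The range-to-action mapping lends itself to a compact sketch: the radius of the circle around the initial touch point indexes into successive fixed-width ranges, each bound to an action. The action names and range width are assumptions.

```python
import math

ACTIONS = ["cancel", "reply", "forward", "save"]  # hypothetical actions
RANGE_WIDTH = 50.0  # pixels per successive range (assumed)

def select_action(initial, current):
    """Pick the action whose range contains the gesture's radius."""
    radius = math.hypot(current[0] - initial[0], current[1] - initial[1])
    index = min(int(radius // RANGE_WIDTH), len(ACTIONS) - 1)
    return ACTIONS[index]

# A drag ending 130 px from the initial point falls in the third range:
print(select_action((100, 100), (100, 230)))  # -> "forward"
```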

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

45.

CUSTOMIZED DIGITAL AVATAR ACCESSORIES

      
Application Number 18516664
Status Pending
Filing Date 2023-11-21
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Blackstock, Jacob Edward
  • Escalante, Diego Muñoz
  • Grantham, Matthew Colin

Abstract

Among other things, embodiments of the present disclosure improve the functionality of electronic messaging software and systems by generating customized images with avatars of different users within electronic messages. For example, users of different mobile computing devices can exchange electronic communications with images generated to include avatars representing themselves as well as their friends, colleagues, and other acquaintances.

IPC Classes  ?

  • H04L 51/046 - Interoperability with other network applications or services
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
  • H04L 51/10 - Multimedia information
  • H04L 51/216 - Handling conversation history, e.g. grouping of messages in sessions or threads
  • H04L 51/52 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
  • H04L 67/10 - Protocols in which an application is distributed across nodes in the network
  • H04L 67/306 - User profiles

46.

AUTO TRIMMING FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number 17941292
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology receives frames of a source media content. The subject technology detects, from the frames of the source media content, a first gesture indicating a cut point at a particular frame of the source media content, the cut point associated with a trimming operation to be performed on the source media content. The subject technology selects a starting frame and an ending frame from the frames based at least in part on the cut point at the particular frame. The subject technology performs the trimming operation based on the starting frame and the ending frame to produce a third set of frames. The subject technology generates a second media content using the third set of frames. The subject technology provides for display at least a portion of the third set of frames of the second media content.
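A sketch of the trim step under one plausible reading of the abstract: the gesture-detected cut point picks the starting and ending frames, and the kept span becomes the third set of frames used for the second media content. The fixed window is an assumption.

```python
def trim(frames, cut_index, window=30):
    """Return the trimmed ("third") set of frames around the cut point."""
    start = max(0, cut_index - window)          # starting frame
    end = min(len(frames), cut_index + window)  # ending frame
    return frames[start:end]

frames = list(range(300))      # stand-in for decoded video frames
third_set = trim(frames, 150)  # cut point detected at frame 150
print(len(third_set))          # -> 60 frames kept for the new content
```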

IPC Classes  ?

  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

47.

3D CURSOR FUNCTIONALITY FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number 17941293
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects a location and a position of a representation of a finger. The subject technology generates a first virtual object based on the location and the position of the representation of the finger. The subject technology detects a first collision event. The subject technology, in response to the first collision event, modifies a set of dimensions of a second virtual object to a second set of dimensions. The subject technology detects a second location and a second position of the representation of the finger. The subject technology detects a second collision event. The subject technology modifies a set of dimensions of a third virtual object to a third set of dimensions. The subject technology renders the third virtual object based on the third set of dimensions within a third scene, the third scene comprising a modified scene from a second scene.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

48.

SHOOTING INTERACTION USING AUGMENTED REALITY CONTENT IN A MESSAGING SYSTEM

      
Application Number 17941301
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology receives a set of frames. The subject technology detects a first gesture corresponding to an open trigger finger gesture. The subject technology receives a second set of frames. The subject technology detects, from the second set of frames, a second gesture corresponding to a closed trigger finger gesture. The subject technology detects a location and a position of a representation of a finger from the closed trigger finger gesture. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders a movement of the first virtual object along a vector away from the location and the position of the representation of the finger within a first scene. The subject technology provides for display the rendered movement of the first virtual object along the vector within the first scene.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

49.

CURSOR FUNCTIONALITY FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number 17941303
Status Pending
Filing Date 2022-09-09
First Publication Date 2024-03-14
Owner Snap Inc. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects a location and a position of a representation of a finger in a set of frames captured by a camera of a client device. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders the first virtual object within a first scene. The subject technology detects a first collision event corresponding to a first collider of the first virtual object intersecting with a second collider of a second virtual object. The subject technology modifies a set of dimensions of the second virtual object to a second set of dimensions. The subject technology renders the second virtual object based on the second set of dimensions within a second scene. The subject technology provides for display the rendered second virtual object within the second scene.
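A sketch of the collider test behind this cursor behavior: a small sphere collider follows the fingertip, and an intersection with an object's collider triggers the dimension change. Sphere colliders and the 1.5x grow factor are assumptions.

```python
def spheres_intersect(c1, r1, c2, r2):
    """True when two sphere colliders overlap."""
    d_sq = sum((a - b) ** 2 for a, b in zip(c1, c2))
    return d_sq <= (r1 + r2) ** 2

def on_cursor_update(finger_pos, obj_pos, obj_dims,
                     cursor_r=0.02, obj_r=0.1):
    """Grow the object's dimensions on a fingertip collision event."""
    if spheres_intersect(finger_pos, cursor_r, obj_pos, obj_r):
        return tuple(d * 1.5 for d in obj_dims)  # second set of dimensions
    return obj_dims

print(on_cursor_update((0.0, 0.0, 0.0), (0.05, 0.0, 0.0),
                       (1.0, 1.0, 1.0)))  # -> (1.5, 1.5, 1.5)
```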

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/20 - Analysis of motion
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

50.

REMOTELY CHANGING SETTINGS ON AR WEARABLE DEVICES

      
Application Number US2023031364
Publication Number 2024/054377
Status In Force
Filing Date 2023-08-29
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

Systems, methods, and computer readable media are described for remotely changing settings on augmented reality (AR) wearable devices. Embodiments are disclosed that enable a user to change settings of an AR wearable device on a user interface (UI) provided by a host client device that can communicate wirelessly with the AR wearable device. The host client device and AR wearable device provide remote procedure calls (RPCs) and an application program interface (API) to access settings and determine if settings have been changed. The API enables the host client device to determine the settings on the AR wearable device without any prior knowledge of the settings on the AR wearable device. The RPCs and the API enable the host client device to automatically update the settings on the AR wearable device when the user changes the settings on the host client device.
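A toy sketch of the discovery-then-update flow: the host first lists the wearable's settings over an RPC (needing no prior knowledge of them), then pushes only the values the user changed. The RPC names and the in-memory device are invented stand-ins, not Snap's API.

```python
class FakeWearable:
    """Stand-in for the AR wearable's RPC surface."""
    def __init__(self):
        self._settings = {"brightness": 7, "voice_control": False}
    def rpc_list_settings(self):
        return dict(self._settings)      # current settings, discovered live
    def rpc_set_setting(self, key, value):
        self._settings[key] = value

def sync(host_ui_changes, wearable):
    """Push changed values from the host UI to the wearable."""
    current = wearable.rpc_list_settings()
    for key, value in host_ui_changes.items():
        if key in current and current[key] != value:
            wearable.rpc_set_setting(key, value)

device = FakeWearable()
sync({"brightness": 4}, device)
print(device.rpc_list_settings())  # -> {'brightness': 4, 'voice_control': False}
```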

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

51.

SELECTING A TILT ANGLE OF AN AR DISPLAY

      
Application Number US2023031487
Publication Number 2024/054381
Status In Force
Filing Date 2023-08-30
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Lucas, Benjamin
  • Meisenholder, David

Abstract

Systems, methods, and computer readable media for selecting a tilt angle of an augmented reality (AR) display of an AR wearable device. Some examples of the present disclosure capture simulation data of gaze fixations while users are performing tasks using applications resident on the AR wearable device. The tilt angle of the AR display is selected based on including more gaze fixations that are within the field of view (FOV) of the AR display than are outside the FOV of the AR display. In some examples, an AR wearable device is manufactured with a fixed vertical tilt angle for the AR display. In some examples, the AR wearable device can dynamically adjust the vertical tilt angle of the AR display based on the applications that a user of the AR wearable device is likely to use or is using.
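A sketch of the selection rule stated in the abstract: pick the tilt angle whose display FOV contains the most recorded gaze fixations. The candidate angles, FOV width, and simulated fixation data are assumptions.

```python
def fixations_in_fov(fixations_deg, tilt_deg, fov_deg=20.0):
    """Count gaze fixations landing inside the display's vertical FOV."""
    half = fov_deg / 2.0
    return sum(1 for f in fixations_deg
               if tilt_deg - half <= f <= tilt_deg + half)

def best_tilt(fixations_deg, candidates=(-15, -10, -5, 0, 5)):
    """Select the tilt angle capturing the most fixations."""
    return max(candidates, key=lambda t: fixations_in_fov(fixations_deg, t))

# Simulated vertical gaze angles in degrees (negative = below the horizon):
print(best_tilt([-12, -9, -8, -3, 1, -11]))  # -> -5
```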

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 1/16 - Constructional details or arrangements
  • G02B 27/01 - Head-up displays

52.

AUTO TRIMMING FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073609
Publication Number 2024/054888
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology receives frames of a source media content. The subject technology detects, from the frames of the source media content, a first gesture indicating a cut point at a particular frame of the source media content, the cut point associated with a trimming operation to be performed on the source media content. The subject technology selects a starting frame and an ending frame from the frames based at least in part on the cut point at the particular frame. The subject technology performs the trimming operation based on the starting frame and the ending frame to produce a third set of frames. The subject technology generates a second media content using the third set of frames. The subject technology provides for display at least a portion of the third set of frames of the second media content.

IPC Classes  ?

  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G11B 27/34 - Indicating arrangements
  • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
  • H04N 23/62 - Control of parameters via user interfaces

53.

CURSOR FUNCTIONALITY FOR AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073635
Publication Number 2024/054906
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Lazarov, Maxim Maximov
  • Mcphee, Andrew James
  • Moreno, Daniel

Abstract

The subject technology detects a location and a position of a representation of a finger in a set of frames captured by a camera of a client device. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders the first virtual object within a first scene. The subject technology detects a first collision event corresponding to a first collider of the first virtual object intersecting with a second collider of a second virtual object. The subject technology modifies a set of dimensions of the second virtual object to a second set of dimensions. The subject technology renders the second virtual object based on the second set of dimensions within a second scene. The subject technology provides for display the rendered second virtual object within the second scene.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

54.

TRIGGER GESTURE FOR SELECTION OF AUGMENTED REALITY CONTENT IN MESSAGING SYSTEMS

      
Application Number US2023073776
Publication Number 2024/054995
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Moreno, Daniel

Abstract

The subject technology detects a first gesture corresponding to an open trigger finger gesture. The subject technology detects a location and a position of a representation of a finger from the open trigger finger gesture. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology detects a first collision event. The subject technology detects a second gesture corresponding to a closed trigger finger gesture. The subject technology selects the second virtual object. The subject technology renders the first virtual object as attached to the second virtual object in response to the selecting. The subject technology provides for display the rendered first virtual object as attached to the second virtual object within a first scene.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

55.

SCULPTING AUGMENTED REALITY CONTENT USING GESTURES IN A MESSAGING SYSTEM

      
Application Number US2023073783
Publication Number 2024/055001
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Kaminski, Kurt
  • Moreno, Daniel

Abstract

The subject technology detects from a set of frames, a first gesture, the first gesture corresponding to a pinch gesture. The subject technology detects a first location and a first position of a first representation of a first finger from the first gesture and a second location and a second position of a second representation of a second finger from the first gesture. The subject technology detects a first collision event corresponding to a first collider and a second collider intersecting with a third collider of a first virtual object. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology modifies the first virtual object to include an additional augmented reality content based at least in part on the first change and the second change.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

56.

GESTURES TO ENABLE MENUS USING AUGMENTED REALITY CONTENT IN A MESSAGING SYSTEM

      
Application Number US2023073786
Publication Number 2024/055004
Status In Force
Filing Date 2023-09-08
Publication Date 2024-03-14
Owner SNAP INC. (USA)
Inventor
  • Goodrich, Kyle
  • Moreno, Daniel

Abstract

The subject technology detects a first location and a first position of a first representation of a first finger and a second location and a second position of a second representation of a second finger. The subject technology detects a first particular location and a first particular position of a first particular representation of a first particular finger and a second particular location and a second particular position of a second particular representation of a second particular finger. The subject technology detects a first change in the first location and the first position and a second change in the second location and the second position. The subject technology detects a first particular change in the first particular location and the first particular position and a second particular change in the second particular location and the second particular position. The subject technology generates a set of virtual objects.

IPC Classes  ?

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements

57.

REMOTELY CHANGING SETTINGS ON AR WEARABLE DEVICES

      
Application Number 17929985
Status Pending
Filing Date 2022-09-06
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

Systems, methods, and computer readable media are described for remotely changing settings on augmented reality (AR) wearable devices. Embodiments are disclosed that enable a user to change settings of an AR wearable device on a user interface (UI) provided by a host client device that can communicate wirelessly with the AR wearable device. The host client device and AR wearable device provide remote procedure calls (RPCs) and an application program interface (API) to access settings and determine if settings have been changed. The API enables the host client device to determine the settings on the AR wearable device without any prior knowledge of the settings on the AR wearable device. The RPCs and the API enable the host client device to automatically update the settings on the AR wearable device when the user changes the settings on the host client device.

IPC Classes  ?

  • G06F 9/54 - Interprogram communication
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

58.

SELECTING A TILT ANGLE OF AN AR DISPLAY

      
Application Number 17930263
Status Pending
Filing Date 2022-09-07
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Lucas, Benjamin
  • Meisenholder, David

Abstract

Systems, methods, and computer readable media for selecting a tilt angle of an augmented reality (AR) display of an AR wearable device. Some examples of the present disclosure capture simulation data of gaze fixations while users are performing tasks using applications resident on the AR wearable device. The tilt angle of the AR display is selected based on including more gaze fixations that are within the field of view (FOV) of the AR display than are outside the FOV of the AR display. In some examples, an AR wearable device is manufactured with a fixed vertical tilt angle for the AR display. In some examples, the AR wearable device can dynamically adjust the vertical tilt angle of the AR display based on the applications that a user of the AR wearable device is likely to use or is using.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

59.

SELECTING AR BUTTONS ON A HAND

      
Application Number 17939296
Status Pending
Filing Date 2022-09-07
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor Crispin, Sterling

Abstract

Systems and methods are provided for performing AR button selection operations on an augmented reality (AR) device. The system displays, by an AR device, a plurality of AR objects on a display region that overlaps a first real-world object, each of the plurality of AR objects being associated with an object selection region. The system computes a first spatial relationship factor for a first AR object of the plurality of AR objects based on a position of the first AR object relative to a position of a second real-world object and adjusts the object selection region of the first AR object based on the first spatial relationship factor. The system activates the first AR object in response to determining that the second real-world object overlaps the object selection region of the first AR object.
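One way to read the spatial relationship factor is as a proximity weight that enlarges a button's selection region as the fingertip (the second real-world object) approaches; a sketch under that assumption follows, with invented gains and radii.

```python
def _dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def selection_radius(base_radius, button_pos, finger_pos,
                     gain=0.5, max_d=0.3):
    """Adjust the selection region by a proximity-based factor."""
    factor = max(0.0, 1.0 - _dist(button_pos, finger_pos) / max_d)
    return base_radius * (1.0 + gain * factor)  # larger when finger is near

def is_activated(button_pos, finger_pos, base_radius=0.03):
    """Activate when the fingertip overlaps the adjusted selection region."""
    return _dist(button_pos, finger_pos) <= selection_radius(
        base_radius, button_pos, finger_pos)

print(is_activated((0.0, 0.0, 0.0), (0.02, 0.0, 0.0)))  # -> True
```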

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

60.

AUTOMATED GIF GENERATION PLATFORM

      
Application Number 18388987
Status Pending
Filing Date 2023-11-13
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Zhou, Kai
  • Au, Kenneth

Abstract

A system and a method for automated animated GIF file generation are described. In one aspect, the method includes accessing an animated GIF file, identifying a plurality of elements displayed in the animated GIF file, applying a variation of one or more elements to the animated GIF file, and generating a variant animated GIF file by applying the variation of the one or more elements to the animated GIF file. The system measures a trending metric of the variant animated GIF file based on a number of times the variant animated GIF file is shared on the communication platform and uses the trending metric as feedback for generating the variant animated GIF file.
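A toy feedback loop in the spirit of the abstract: rank variants by shares per day, keep the best performers as parents, and spawn new variations from them. The metric definition and the tweak vocabulary are assumptions.

```python
import random

def trending_metric(share_count, days_live):
    """Shares per day on the communication platform (assumed definition)."""
    return share_count / max(days_live, 1)

def next_generation(variants):
    """Seed new variations from the best-trending half of the current ones."""
    ranked = sorted(variants, key=lambda v: trending_metric(*v["stats"]),
                    reverse=True)
    parents = ranked[: max(1, len(ranked) // 2)]
    return [{"base": p["base"],
             "tweak": random.choice(["palette", "caption", "speed"]),
             "stats": (0, 0)} for p in parents]

variants = [{"base": "cat.gif", "tweak": None,      "stats": (120, 3)},
            {"base": "cat.gif", "tweak": "palette", "stats": (20, 4)}]
print(next_generation(variants))  # one child seeded from the 40/day variant
```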

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06F 16/903 - Querying
  • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/30 - Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video
  • H04L 67/50 - Network services

61.

PRESENTING OVERVIEW OF PARTICIPANT REACTIONS WITHIN A VIRTUAL CONFERENCING SYSTEM

      
Application Number 18506861
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Lin, Andrew Cheng-Min
  • Lin, Walton

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for presenting an overview of participant reactions to a virtual conference. The program and method provide for a virtual conference between plural participants; provide, for each of the plural participants, display of reaction buttons which are selectable by the participant to indicate different reactions to the virtual conference; receive indication of selections of the reaction buttons by one or more of the plural participants; store an indication of the selections over time in association with recording the virtual conference; generate a graphical overview of reactions to the virtual conference based on the stored indication of the selections; and provide, for a first participant of the plural participants, display of the graphical overview.

IPC Classes  ?

  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
  • H04L 65/4038 - Arrangements for multi-party communication, e.g. for conferences with floor control

62.

FACE REENACTMENT

      
Application Number 18509502
Status Pending
Filing Date 2023-11-15
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Savchenkov, Pavel
  • Matov, Dmitry
  • Mashrabov, Aleksandr
  • Pchelnikov, Alexey

Abstract

Provided are systems and methods for face reenactment. An example method includes receiving a target video that includes at least one target frame, where the at least one target frame includes a target face, receiving a scenario including a series of source facial expressions, determining, based on the target face, a target facial expression of the target face, synthesizing, based on a parametric face model and a texture model, an output face including the target face, where the target facial expression of the target face is modified to imitate a source facial expression of the series of source facial expressions, and generating, based on the output face, a frame of an output video. The parametric face model includes a template mesh pre-generated based on historical images of faces of a plurality of individuals, where the template mesh includes a pre-determined number of vertices.

IPC Classes  ?

  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06Q 30/0251 - Targeted advertisements
  • G06T 11/00 - 2D [Two Dimensional] image generation

63.

MEDIA CONTENT ITEM GENERATION FOR A CONTENT SHARING PLATFORM

      
Application Number 18143331
Status Pending
Filing Date 2023-05-04
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Wehrman, Ian Anthony
  • Goodwin, Giles
  • Iwata, Jared
  • Feingold, Eugene
  • Lemieux, David

Abstract

Systems and methods are provided for determining a set of selectors associated with the publisher identifier, each selector comprising specified content to extract from source data and one or more rules for extracting the specified content. The systems and methods further provide, for each location data in the list of location data: extracting, from the source data, specified content for each selector of at least a subset of the set of selectors based on the one or more rules specified in each selector of the at least the subset of the set of selectors; determining a template to use to generate the media content item, the template comprising regions corresponding to the one or more selectors; populating each region of the template using specified content for the corresponding selector; and generating the media content item from the populated template.
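The selector/template split is concrete enough for a short sketch: each selector pairs a region name with an extraction rule over the source data, and the template is populated region by region. Field names and rules are hypothetical.

```python
SELECTORS = [  # hypothetical selectors: content name + extraction rule
    {"name": "headline", "rule": lambda src: src["title"].strip()},
    {"name": "image",    "rule": lambda src: src["media"][0]},
]
TEMPLATE = {"headline": None, "image": None}  # regions match the selectors

def build_media_item(source_data):
    """Populate each template region with its selector's extracted content."""
    item = dict(TEMPLATE)
    for sel in SELECTORS:
        item[sel["name"]] = sel["rule"](source_data)
    return item

print(build_media_item({"title": " Breaking News ", "media": ["img01.jpg"]}))
# -> {'headline': 'Breaking News', 'image': 'img01.jpg'}
```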

IPC Classes  ?

  • G06F 9/451 - Execution arrangements for user interfaces
  • G06F 8/36 - Software reuse
  • G06F 8/38 - Creation or generation of source code for implementing user interfaces
  • G06F 16/93 - Document management systems
  • G06F 16/955 - Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
  • G06F 16/958 - Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking

64.

REVEALING COLLABORATIVE OBJECT USING COUNTDOWN TIMER

      
Application Number US2023028387
Publication Number 2024/049575
Status In Force
Filing Date 2023-07-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a processor provides users with access to a collaborative object using respective physically remote devices, and associates virtual content received from the users with the collaborative object during a collaboration period. The processor maintains a timer including a countdown indicative of when the collaboration period ends for associating virtual content with the collaborative object. The processor provides the users with access to the collaborative object with associated virtual content at the end of the collaboration period.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]

65.

REAL-WORLD RESPONSIVENESS OF A COLLABORATIVE OBJECT

      
Application Number US2023028460
Publication Number 2024/049576
Status In Force
Filing Date 2023-07-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object with an associated material and added virtual content is provided to users. In one example of the collaborative session, a user selects the associated material of the collaborative object. Physical characteristics are assigned to the collaborative object as a function of the associated material and are perceived by the participants when the collaborative object is manipulated. In one example, the material associated with the collaborative object is metal, wherein the interaction between the users and the collaborative object generates a response of the collaborative object that is indicative of the physical properties of metal, such as its inertia, acoustics, and malleability.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06Q 10/10 - Office automation; Time management
  • G06F 3/16 - Sound input; Sound output
  • G02B 27/01 - Head-up displays

66.

SELECTIVE COLLABORATIVE OBJECT ACCESS BASED ON TIMESTAMP

      
Application Number US2023028468
Publication Number 2024/049577
Status In Force
Filing Date 2023-07-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to added virtual content is selectively made available to participants/users by a collaborative system. The system receives a request from a user to join a session and associates a timestamp with the user corresponding to receipt of the request. Users can edit the collaborative object if the timestamp is within the collaborative duration period and can view the collaborative object if the timestamp is after the collaborative duration period.
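The timestamp gate reduces to a one-line rule, sketched here with an assumed 24-hour collaboration period: join requests inside the period get edit access, later ones view-only.

```python
from datetime import datetime, timedelta

def access_level(join_ts, session_start, duration=timedelta(hours=24)):
    """Edit inside the collaborative duration period, view-only after it."""
    return "edit" if join_ts <= session_start + duration else "view"

start = datetime(2024, 3, 1, 12, 0)
print(access_level(datetime(2024, 3, 1, 18, 0), start))  # -> edit
print(access_level(datetime(2024, 3, 3, 9, 0), start))   # -> view
```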

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06Q 10/10 - Office automation; Time management
  • H04L 9/40 - Network security protocols
  • G02B 27/01 - Head-up displays

67.

SCISSOR HAND GESTURE FOR A COLLABORATIVE OBJECT

      
Application Number US2023028536
Publication Number 2024/049578
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant crops media content by use of a hand gesture to produce an image segment that can be associated with the collaborative object. The hand gesture resembles a pair of scissors, and the camera and processor of the client device track a path of the hand gesture to identify an object within a displayed image and create virtual content of the identified object. The virtual content created by the hand gesture is then associated with the collaborative object.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

68.

PHYSICAL GESTURE INTERACTION WITH OBJECTS BASED ON INTUITIVE DESIGN

      
Application Number US2023028537
Publication Number 2024/049579
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a user interacts with the collaborative object using hand gestures. The virtual content associated with the collaborative object can be accessed with an opening hand gesture, and the virtual content can be hidden with a closing hand gesture. The hand gestures are detected by cameras of a client device used by the user. The collaborative object can be moved and manipulated using a pointing gesture, wherein the collaborative object can be confirmed in a new position by tilting the client device of the user.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

69.

AUTHENTICATING A SELECTIVE COLLABORATIVE OBJECT

      
Application Number US2023028542
Publication Number 2024/049580
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device can be used to provide an authenticating signature to the seal.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06Q 10/10 - Office automation; Time management
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays

70.

COLLABORATIVE OBJECT ASSOCIATED WITH A GEOGRAPHICAL LOCATION

      
Application Number US2023028671
Publication Number 2024/049586
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

71.

CHARACTER AND COSTUME ASSIGNMENT FOR CO-LOCATED USERS

      
Application Number US2023028717
Publication Number 2024/049588
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kim, Daekun
  • Zhang, Lei
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Multi-player co-located AR experiences are augmented by assigning characters and costumes to respective participants (a.k.a. "users" of AR-enabled mobile devices) in multi-player AR sessions for storytelling, play acting, and the like. Body tracking technology and augmented reality (AR) software are used to decorate the bodies of the co-located participants with virtual costumes within the context of the multi-player co-located AR experiences. Tracked bodies are distinguished to determine which body belongs to which user and hence which virtual costume belongs to which tracked body so that corresponding costumes may be assigned for display in augmented reality. A host-guest mechanism is used for networked assignment of characters and corresponding costumes in the co-located multi-player AR session. Body tracking technology is used to move the costume with the body as movement of the assigned body is detected.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06Q 10/10 - Office automation; Time management

72.

GENERATING IMMERSIVE AUGMENTED REALITY EXPERIENCES FROM EXISTING IMAGES AND VIDEOS

      
Application Number US2023030926
Publication Number 2024/049687
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A two-dimensional element is identified from one or more two-dimensional images. A volumetric content item is generated based on the two-dimensional element identified from the one or more two-dimensional images. A display device presents the volumetric content item overlaid on a real-world environment that is within a field of view of a user of the display device.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 15/08 - Volume rendering
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

73.

VOICE CONTROLLED UIS FOR AR WEARABLE DEVICES

      
Application Number US2023031018
Publication Number 2024/049696
Status In Force
Filing Date 2023-08-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Moll, Sharon
  • Gurgul, Piotr

Abstract

Systems, methods, and computer readable media for voice-controlled user interfaces (UIs) for augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable a user to interact with the AR wearable device without using physical user interface devices. An application has a non-voice-controlled UI mode and a voice-controlled UI mode. The user selects the mode of the UI. The application running on the AR wearable device displays UI elements on a display of the AR wearable device. The UI elements have types. Predetermined actions are associated with each of the UI element types. The predetermined actions are displayed with other information and used by the user to invoke the corresponding UI element.

IPC Classes  ?

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 1/16 - Constructional details or arrangements

74.

TIMELAPSE RE-EXPERIENCING SYSTEM

      
Application Number US2023072282
Publication Number 2024/050232
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A system captures, via one or more sensors of a computing device, data of an environment observed by the one or more sensors at a first timeslot, and stores the data in a data store as a first portion of a timelapse memory experience. The system also captures, via the one or more sensors of a computing device, data of the environment observed by the one or more sensors at a second timeslot, and stores the data in a data store as a second portion of the timelapse memory experience. The system additionally associates the timelapse memory experience with a memory experience trigger, wherein the memory experience trigger can initiate a presentation of the timelapse memory experience.
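
As a toy illustration of the capture-then-replay flow (the store layout and trigger naming are assumptions, not the patented design):

    # Illustrative only: accumulate per-timeslot captures under a trigger,
    # then let the trigger initiate a time-ordered presentation.
    from collections import defaultdict

    store = defaultdict(list)  # trigger -> list of (timeslot, data) portions

    def capture(trigger, timeslot, sensor_data):
        store[trigger].append((timeslot, sensor_data))

    def replay(trigger):
        for timeslot, data in sorted(store[trigger], key=lambda p: p[0]):
            print(f"t={timeslot}: {data}")

    capture("front_door", 1, {"image": "morning.jpg"})
    capture("front_door", 2, {"image": "evening.jpg"})
    replay("front_door")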

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

75.

TOUCH-BASED AUGMENTED REALITY EXPERIENCE

      
Application Number US2023072701
Publication Number 2024/050259
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

The present disclosure relates to methods and systems for providing a touch-based augmented reality (AR) experience. During a capture phase, a first user may grip an object. An intensity of a force applied on the object in the grip and/or a duration of the grip may be recorded. A volumetric representation of the first user holding the object may also be captured. During an experience phase, a second user may touch the object, the object may provide haptic feedback (e.g., a vibration) to the second user at an intensity and a duration corresponding to an intensity of the force applied on the object and a duration of the grip of the object. If a volumetric representation of the first user holding the object is captured, touching the object may also cause a presentation of the first user's volumetric body that holds the object.
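
A minimal sketch of the force-to-haptics mapping suggested by the abstract; the 20 N normalization and all names are assumptions:

    # Hypothetical: record grip force and duration during capture, then
    # replay them as a vibration when a second user touches the object.
    recorded = {}

    def capture_grip(object_id, force_newtons, duration_s):
        recorded[object_id] = (force_newtons, duration_s)

    def on_touch(object_id, vibrate):
        force, duration = recorded[object_id]
        # Map force to a 0..1 haptic intensity (20 N assumed as maximum).
        vibrate(intensity=min(force / 20.0, 1.0), duration_s=duration)

    capture_grip("mug", force_newtons=8.0, duration_s=2.5)
    on_touch("mug", vibrate=lambda intensity, duration_s:
             print(f"vibrate at {intensity:.2f} for {duration_s}s"))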

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements

76.

ONE-HANDED ZOOM OPERATION FOR AR/VR DEVICES

      
Application Number US2023072707
Publication Number 2024/050260
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Mahalingam, Anoosh Kruba Chandar
  • Pounds, Jennica
  • Rybin, Andrei
  • Santerre, Pierre-Yves

Abstract

An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and Direct Manipulation of Virtual Objects (DMVO) methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user makes gestures that cause the user's viewpoint of the virtual object to either zoom in or zoom out. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.
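
The overlap gate can be sketched in a few lines of Python; the screen-space bounds test and the zoom step values below are illustrative assumptions:

    # Illustrative: gestures change zoom only while the hand overlaps the
    # virtual object's screen-space bounds; moving away ends the interaction.
    def overlaps(hand_xy, bounds):
        x, y = hand_xy
        x0, y0, x1, y1 = bounds
        return x0 <= x <= x1 and y0 <= y <= y1

    def apply_gesture(zoom, hand_xy, bounds, gesture):
        if not overlaps(hand_xy, bounds):
            return zoom  # interaction ended
        step = {"zoom_in": 1.25, "zoom_out": 0.8}.get(gesture, 1.0)
        return zoom * step

    zoom = apply_gesture(1.0, hand_xy=(0.5, 0.5),
                         bounds=(0.3, 0.3, 0.7, 0.7), gesture="zoom_in")
    print(zoom)  # 1.25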

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form

77.

WRIST ROTATION MANIPULATION OF VIRTUAL OBJECTS

      
Application Number US2023072721
Publication Number 2024/050263
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Mahalingam, Anoosh Kruba Chandar
  • Pounds, Jennica
  • Rybin, Andrei
  • Santerre, Pierre-Yves

Abstract

An Augmented Reality (AR) system is provided. The AR system uses a combination of gesture and Direct Manipulation of Virtual Objects (DMVO) methodologies to provide for the user's selection and modification of virtual objects of an AR experience. The user indicates that they want to interact with a virtual object of the AR experience by moving their hand to overlap the virtual object. While keeping their hand in an overlapping position, the user rotates their wrist and the virtual object is rotated as well. To end the interaction, the user moves their hand such that their hand is no longer overlapping the virtual object.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

78.

CODED VISION SYSTEM

      
Application Number 18388977
Status Pending
Filing Date 2023-11-13
First Publication Date 2024-03-07
Owner Snap Inc. (USA)
Inventor
  • Charlton, Ebony James
  • Cansizoglu, Omer
  • Ouimet, Kirk
  • Boyd, Nathan Kenneth

Abstract

A system and method for presentation of computer vision (e.g., augmented reality, virtual reality) using user data and a user code is disclosed. A client device can detect an image feature (e.g., scannable code) in one or more images. The image feature is determined to be linked to a user account. User data from the user account can then be used to generate one or more augmented reality display elements that can be anchored to the image feature in the one or more images.

IPC Classes  ?

  • G06T 11/60 - Editing figures and text; Combining figures or text
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • H04L 51/52 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
  • H04L 67/306 - User profiles
  • H04L 67/52 - Network services specially adapted for the location of the user terminal

79.

EXTENDING USER INTERFACES OF MOBILE APPS TO AR EYEWEAR

      
Application Number US2023028257
Publication Number 2024/049565
Status In Force
Filing Date 2023-07-20
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Gurgul, Piotr
  • Moll, Sharon

Abstract

An architecture is provided for packaging visual overlay-based user interfaces (UIs) into mobile device applications to work as user interface extensions that allow certain flows and logic to be displayed on an eyewear device when connected to the mobile device application. The extension of the UIs of the mobile device applications to the display of the eyewear device allows for inexpensive experimentation with augmented reality (AR) UIs for eyewear devices and allows for reusing of business logic across mobile devices and associated eyewear devices. For example, a mobile device application for maps or navigation may be extended to show directions on an associated eyewear device once the destination is chosen in the navigation application on the mobile device. In this example, the business logic would still live in the navigation application on the mobile device but the user would see AR directions overlaid on a display of the eyewear device.

IPC Classes  ?

  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G02B 27/01 - Head-up displays
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • G06F 1/16 - Constructional details or arrangements
  • G06F 9/451 - Execution arrangements for user interfaces

80.

SELECTIVE COLLABORATIVE OBJECT ACCESS

      
Application Number US2023028377
Publication Number 2024/049573
Status In Force
Filing Date 2023-07-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G02B 27/01 - Head-up displays

81.

TIMELAPSE OF GENERATING A COLLABORATIVE OBJECT

      
Application Number US2023028664
Publication Number 2024/049585
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Cho, Youjean
  • Ji, Chen
  • Liu, Fannie
  • Monroy-Hernández, Andrés
  • Tsai, Tsung-Yu
  • Vaish, Rajan

Abstract

A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., which may be recorded using a feature such as Lens Cloud, available from Snap Inc. of Santa Monica, California). A timestamp is associated with each item of received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

82.

CO-LOCATED FULL-BODY GESTURES

      
Application Number US2023028708
Publication Number 2024/049587
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kim, Daekun
  • Zhang, Lei
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

A method for detecting full-body gestures by a mobile device includes a host mobile device detecting the tracked body of a co-located participant in a multi-party session. When the participant's tracked body provides a full-body gesture, the host's mobile device recognizes that there is a tracked body providing a full-body gesture. The host mobile device iterates through the list of participants in the multi-party session and finds the closest participant mobile device with respect to the screen-space position of the head of the gesturing participant. The host mobile device then obtains the user ID of the closest participant mobile device and broadcasts the recognized full-body gesture event to all co-located participants in the multi-party session, along with the obtained user ID. Each participant's mobile device may then handle the gesture event as appropriate for the multi-party session. For example, a character or costume may be assigned to a gesturing participant.
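
A minimal sketch of the attribution step, assuming 2D screen-space positions and Euclidean distance (details not specified by the abstract):

    # Hypothetical host-side logic: attribute the gesture to the participant
    # device nearest the gesturing body's head in screen space, then
    # broadcast the event together with that user ID.
    import math

    def closest_participant(head_xy, participants):
        return min(participants,
                   key=lambda p: math.dist(head_xy, p["screen_xy"]))

    def on_gesture(gesture, head_xy, participants, broadcast):
        user = closest_participant(head_xy, participants)
        broadcast({"event": gesture, "user_id": user["id"]})

    session = [{"id": "u1", "screen_xy": (120, 80)},
               {"id": "u2", "screen_xy": (460, 95)}]
    on_gesture("arms_raised", head_xy=(455, 90),
               participants=session, broadcast=print)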

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

83.

AUTHORING TOOLS FOR CREATING INTERACTIVE AR EXPERIENCES

      
Application Number US2023028720
Publication Number 2024/049589
Status In Force
Filing Date 2023-07-26
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Kim, Daekun
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Described are authoring tools for creating interactive AR experiences. The story-authoring application enables a user with little or no programming skills to create an interactive story that includes recording voice commands for advancing to the next scene, inserting and manipulating virtual objects in a mixed-reality environment, and recording a variety of interactions with connected IoT devices. The story creation interface is presented on the display as a virtual object in an AR environment.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/14 - Digital output to display device
  • G10L 15/26 - Speech to text systems

84.

VIRTUAL AR INTERFACES FOR CONTROLLING IOT DEVICES USING MOBILE DEVICE ORIENTATION SENSORS

      
Application Number US2023028776
Publication Number 2024/049592
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Cho, Youjean
  • Kim, Daekun
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Described are virtual AR interfaces for generating a virtual rotational interface for the purpose of controlling connected IoT devices using the inertial measurement unit (IMU) of a portable electronic device. The IMU control application enables a user of a portable electronic device to activate a virtual rotational interface overlay on a display and adjust a feature of a connected IoT product by rotating a portable electronic device. The device IMU moves a slider on the virtual rotational interface. The IMU control application sends a control signal to the IoT product which executes an action in accordance with the slider position. The virtual rotational interface is presented on the display as a virtual object in an AR environment. The IMU control application detects the device orientation (in the physical environment) and in response presents a corresponding slider element on the virtual rotational interface (in the AR environment).
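
The orientation-to-slider mapping might look like the following sketch; the ±45 degree roll range and the brightness payload are assumptions:

    # Illustrative: map device roll from the IMU onto a 0..100 slider and
    # forward the value to a connected IoT product.
    def roll_to_slider(roll_deg, max_roll=45.0):
        clamped = max(-max_roll, min(max_roll, roll_deg))
        return round((clamped + max_roll) / (2 * max_roll) * 100)

    def on_imu_update(roll_deg, send_control):
        send_control({"device": "lamp", "brightness": roll_to_slider(roll_deg)})

    on_imu_update(22.5, send_control=print)  # brightness 75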

IPC Classes  ?

  • H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

85.

INTERACTION RECORDING TOOLS FOR CREATING INTERACTIVE AR STORIES

      
Application Number US2023028788
Publication Number 2024/049594
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Cho, Youjean
  • Kim, Daekun
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Recording tools for creating interactive AR experiences. An interaction recording application enables a user with little or no programming skills to perform and record user behaviors that are associated with reactions between story elements such as virtual objects and connected IoT devices. The user behaviors include a range of actions, such as speaking a trigger word and apparently touching a virtual object. The corresponding reactions include starting to record a subsequent scene and executing actions between story elements. The trigger recording interface is presented on the display as an overlay relative to the physical environment.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G02B 27/01 - Head-up displays
  • G16Y 10/75 - Information technology; Communication

86.

RECORDING FOLLOWING BEHAVIORS BETWEEN VIRTUAL OBJECTS AND USER AVATARS IN AR EXPERIENCES

      
Application Number US2023028882
Publication Number 2024/049596
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Zhang, Lei
  • Robinson, Ava
  • Kim, Daekun
  • Cho, Youjean
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

Described are recording tools for generating following behaviors and creating interactive AR experiences. The following recording application enables a user with little or no programming skills to virtually connect virtual objects to other elements, including virtual avatars representing fellow users, thereby creating an interactive story in which multiple elements are apparently and persistently connected. The following interface includes methods for selecting objects and instructions for connecting a virtual object to a target object. In one example, the recording application presents on the display a virtual tether between the objects until a connecting action is detected. The following interface is presented on the display as an overlay, in the foreground relative to the physical environment.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays

87.

VIRTUAL INTERFACES FOR CONTROLLING IOT DEVICES

      
Application Number US2023028885
Publication Number 2024/049597
Status In Force
Filing Date 2023-07-27
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kim, Daekun
  • Zhang, Lei
  • Cho, Youjean
  • Robinson, Ava
  • Tham, Yu Jiang
  • Vaish, Rajan
  • Monroy-Hernández, Andrés

Abstract

A virtual interface application presented in augmented reality (AR) is described for controlling Internet of Things (IoT) products. The virtual interface application enables a user of a portable electronic device to activate a virtual control interface overlay on a display, receive a selection from the user using her hands or feet, and send a control signal to a nearby IoT product which executes an action in accordance with the selection. The virtual control interface is presented on the display as a virtual object in an AR environment. The virtual interface application includes a foot tracking tool for detecting an intersection between the foot location (in the physical environment) and the virtual surface position (in the AR environment). When an intersection is detected, the virtual interface application sends a control signal with instructions to the IoT product.
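
A toy version of the intersection test, assuming axis-aligned button volumes in metres (the real geometry handling is not described here):

    # Hypothetical: a tracked foot position entering a virtual button's
    # volume triggers the corresponding IoT control signal.
    def inside(point, box):
        return all(lo <= c <= hi for c, (lo, hi) in zip(point, box))

    VIRTUAL_BUTTONS = {
        "light_on": ((0.0, 0.5), (0.0, 0.5), (0.0, 0.1)),  # x, y, z ranges
    }

    def on_foot_update(foot_xyz, send_control):
        for command, box in VIRTUAL_BUTTONS.items():
            if inside(foot_xyz, box):
                send_control({"command": command})

    on_foot_update((0.2, 0.3, 0.05), send_control=print)  # {'command': 'light_on'}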

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network

88.

MULTISENSORIAL PRESENTATION OF VOLUMETRIC CONTENT

      
Application Number US2023031066
Publication Number 2024/049700
Status In Force
Filing Date 2023-08-24
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

Input indicative of a selection of volumetric content for presentation is received. The volumetric content comprises a volumetric representation of one or more elements of a real-world three-dimensional space. In response to the input, device state data associated with the volumetric content is accessed. The device state data describes a state of one or more network-connected devices associated with the real-world three-dimensional space. The volumetric content is presented. The presentation of the volumetric content includes presentation of the volumetric representation of the one or more elements overlaid on the real-world three-dimensional space by a display device and configuring the one or more network-connected devices using the device state data.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 15/08 - Volume rendering

89.

CONTEXTUAL MEMORY EXPERIENCE TRIGGERS SYSTEM

      
Application Number US2023072274
Publication Number 2024/050229
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A system monitors an environment via one or more sensors included in a computing device and applies a trigger to detect that a memory experience is stored in a data store based on the monitoring. The system creates an augmented reality memory experience, a virtual reality memory experience, or a combination thereof, based on the trigger if the memory experience is detected. The system additionally projects the augmented reality memory experience, the virtual reality memory experience, or the combination thereof, via the computing device.

IPC Classes  ?

  • G06F 3/06 - Digital input from, or digital output to, record carriers

90.

SOCIAL MEMORY RE-EXPERIENCING SYSTEM

      
Application Number US2023072277
Publication Number 2024/050231
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A system monitors a user environment via one or more sensors included in a computing device and detects, via a trigger, that event data is stored in a data store based on the monitoring. The system further detects one or more participants in the event data and invites the one or more participants to share augmented reality event data and/or virtual reality event data. The system also creates, based on the event data, augmented reality event data and/or virtual reality event data, and presents the augmented reality event data and/or the virtual reality event data to the one or more participants in a synchronous mode and/or an asynchronous mode, via the computing device.

IPC Classes  ?

  • G06Q 50/10 - Services
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

91.

MULTI-PERSPECTIVE AUGMENTED REALITY EXPERIENCE

      
Application Number US2023072557
Publication Number 2024/050245
Status In Force
Filing Date 2023-08-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

The present disclosure relates to methods and systems for providing a multi-perspective augmented reality experience. A volumetric video of a three-dimensional space is captured. The volumetric video of the three-dimensional space includes a volumetric representation of a first user within the three-dimensional space. The volumetric video is displayed by a display device worn by a second user, and the second user sees the volumetric representation of the first user within the three-dimensional space. Input indicative of an interaction (e.g., entering or leaving) of the second user with the volumetric representation of the first user is detected. Based on detecting the input indicative of the interaction, the display device switches to a display of a recorded perspective of the first user. Thus, by interacting with a volumetric representation of the first user in a volumetric video, the second user views the first user's perspective of the three-dimensional space.
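
Reduced to a state machine, the perspective switch could look like this sketch (the state names and event vocabulary are hypothetical):

    # Illustrative: entering a participant's volumetric representation
    # switches the viewer to that participant's recorded perspective;
    # leaving switches back to the volumetric video.
    def next_view(current_view, event, user_id):
        if event == "enter":
            return f"recorded_perspective:{user_id}"
        if event == "leave":
            return "volumetric_video"
        return current_view

    view = next_view("volumetric_video", "enter", "first_user")
    print(view)  # recorded_perspective:first_user
    print(next_view(view, "leave", "first_user"))  # volumetric_video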

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/20 - Perspective computation
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 15/08 - Volume rendering
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • G02B 27/01 - Head-up displays

92.

CONTROLLING AND EDITING PRESENTATION OF VOLUMETRIC CONTENT

      
Application Number US2023072568
Publication Number 2024/050246
Status In Force
Filing Date 2023-08-21
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

A display device presents volumetric content comprising a volumetric video. The volumetric video comprises a volumetric representation of one or more elements of a three-dimensional space. Input indicative of a control operation associated with the presentation of the volumetric video is received. The presentation of the volumetric video by the display device is controlled by executing the control operation. While the control operation is being executed, the volumetric representation of the one or more elements of the three-dimensional space is displayed from multiple perspectives based on movement of a user.

IPC Classes  ?

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

93.

MIXING AND MATCHING VOLUMETRIC CONTENTS FOR NEW AUGMENTED REALITY EXPERIENCES

      
Application Number US2023072718
Publication Number 2024/050262
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony
  • Vaish, Rajan

Abstract

A volumetric content presentation system includes a head-worn display device, which includes one or more processors, and a memory storing instructions that, when executed by the one or more processors, configure the display device to access AR content items that correspond to either real-world objects or virtual objects, mix and match these AR content items, and present volumetric content that includes these mixed and matched AR content items overlaid on a real-world environment to create a new AR scene that a user can experience.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

94.

MULTI-DIMENSIONAL EXPERIENCE PRESENTATION USING AUGMENTED REALITY

      
Application Number US2023072726
Publication Number 2024/050264
Status In Force
Filing Date 2023-08-23
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Vaish, Rajan
  • Kratz, Sven
  • Monroy-Hernández, Andrés
  • Smith, Brian Anthony

Abstract

The present disclosure relates to methods and systems for providing a presentation of an experience (e.g., a journey) to a user using augmented reality (AR). During a capture phase, persons in the journey may take videos or pictures using their smartphones, GoPros, and/or smart glasses. A drone may also take videos or pictures during the journey. During an experience phase, an AR topographical rendering of the real-world environment of the journey may be rendered on a tabletop, highlighting/animating a path persons took in the journey. The persons may be rendered as miniature avatars/dolls overlaid on the representation of the real-world environment. When the user clicks on a point in the presentation of the journey, a perspective (e.g., the videos or pictures) at that point is presented.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

95.

3D SPACE CARVING USING HANDS FOR OBJECT CAPTURE

      
Application Number US2023073217
Publication Number 2024/050460
Status In Force
Filing Date 2023-08-31
Publication Date 2024-03-07
Owner SNAP INC. (USA)
Inventor
  • Micusik, Branislav
  • Evangelidis, Georgios
  • Wolf, Daniel

Abstract

A method for carving a 3D space using hand tracking is described. In one aspect, a method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
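
A minimal sketch of the carving step, assuming the region is a box bounded by the tracked hand pixels and their depths (the margin value is an assumption):

    # Illustrative: bound the 3D reconstruction region by the image
    # coordinates and sensed depths of the tracked hand pixels.
    def carve_region(hand_pixels, margin=0.05):
        # hand_pixels: list of (u, v, depth_m) samples from the tracker.
        us = [u for u, _, _ in hand_pixels]
        vs = [v for _, v, _ in hand_pixels]
        ds = [d for _, _, d in hand_pixels]
        return {"u": (min(us), max(us)), "v": (min(vs), max(vs)),
                "depth": (min(ds) - margin, max(ds) + margin)}

    samples = [(310, 240, 0.42), (330, 255, 0.45), (300, 260, 0.40)]
    print(carve_region(samples))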

IPC Classes  ?

  • G06T 7/50 - Depth or shape recovery
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G02B 27/01 - Head-up displays

96.

System to generate contextual queries

      
Application Number 16732068
Grant Number 11921773
Status In Force
Filing Date 2019-12-31
First Publication Date 2024-03-05
Grant Date 2024-03-05
Owner SNAP INC. (USA)
Inventor Adler, Manny Jerrold

Abstract

A contextual query system is configured to perform operations that include: causing display of a graphical user interface at a client device, the graphical user interface including a display of image data that comprises a set of image features; generating a query based on the set of image features of the image data; accessing media content based on the query at a repository, the repository comprising a collection of media content; and causing display of a presentation of the media content within the graphical user interface at the client device.

IPC Classes  ?

  • G06F 16/532 - Query formulation, e.g. graphical querying
  • G06F 16/535 - Filtering based on additional data, e.g. user or group profiles
  • G06F 16/538 - Presentation of query results
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
  • G06F 16/583 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
  • G06F 16/587 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location

97.

Normalized brightness control for user perception of visual media

      
Application Number 18157403
Grant Number 11922901
Status In Force
Filing Date 2023-01-20
First Publication Date 2024-03-05
Grant Date 2024-03-05
Owner Snap Inc. (USA)
Inventor Fawcett, Rudd

Abstract

A system and method for controlling the perceived brightness level in a display device based on the brightness of individual content items, to provide a consistent viewing experience. This is a method of visual content-based brightness control: rather than relying on the adjustment of a display device's settings or controls, such as a general setting for the brightness of the display screen, it evaluates the visual content being presented and adjusts the presentation layers used to display that content. Adjusting parameter(s) of one or more of these presentation layers thus provides control over the brightness level of the displayed content as a function of the brightness of the content.
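
One way to realize a presentation-layer adjustment is sketched below; the Rec. 601 luma formula is standard, while the target luma and the black-overlay approach are assumptions for illustration:

    # Illustrative: dim a bright content item by raising the opacity of a
    # black overlay layer, instead of changing the display's global setting.
    def mean_luma(pixels):
        # pixels: iterable of (r, g, b) values in 0..255.
        lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
        return sum(lumas) / len(lumas)

    def overlay_opacity(pixels, target_luma=128.0):
        luma = mean_luma(pixels)
        # Opacity that scales perceived brightness down toward the target.
        return 1.0 - target_luma / luma if luma > target_luma else 0.0

    bright_image = [(240, 240, 240)] * 4
    print(f"overlay opacity: {overlay_opacity(bright_image):.2f}")  # 0.47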

IPC Classes  ?

98.

Emotion recognition for workforce analytics

      
Application Number 16667366
Grant Number 11922356
Status In Force
Filing Date 2019-10-29
First Publication Date 2024-03-05
Grant Date 2024-03-05
Owner SNAP INC. (USA)
Inventor
  • Shaburov, Victor
  • Monastyrshyn, Yurii

Abstract

Methods and systems for videoconferencing include generating work quality metrics based on emotion recognition of an individual such as a call center agent. The work quality metrics allow for workforce optimization. One example method includes the steps of receiving a video including a sequence of images, detecting an individual in one or more of the images, locating feature reference points of the individual, aligning a virtual face mesh to the individual in one or more of the images based at least in part on the feature reference points, dynamically determining over the sequence of images at least one deformation of the virtual face mesh, determining that the at least one deformation refers to at least one facial emotion selected from a plurality of reference facial emotions, and generating quality metrics including at least one work quality parameter associated with the individual based on the at least one facial emotion.

IPC Classes  ?

  • G06Q 10/0639 - Performance analysis of employees; Performance analysis of enterprise or organisation operations
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G10L 25/63 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use for comparison or discrimination for estimating an emotional state
  • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 17/26 - Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
  • H04N 7/15 - Conference systems
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/4788 - Supplemental services, e.g. displaying phone caller identification or shopping application communicating with other users, e.g. chatting

99.

HAND-TRACKING STABILIZATION

      
Application Number 17822634
Status Pending
Filing Date 2022-08-26
First Publication Date 2024-02-29
Owner Snap Inc. (USA)
Inventor Lucas, Benjamin

Abstract

An Augmented Reality (AR) system provides stabilization of hand-tracking input data. The AR system provides for display a user interface of an AR application. The AR system captures, using one or more cameras of the AR system, video frame tracking data of a gesture being made by a user while the user interacts with the AR user interface. The AR system generates skeletal 3D model data of a hand of the user based on the video frame tracking data that includes one or more skeletal 3D model features corresponding to recognized visual landmarks of portions of the hand of the user. The AR system generates targeting data based on the skeletal 3D model data where the targeting data identifies a virtual 3D object of the AR user interface. The AR system filters the targeting data using a targeting filter component and provides the filtered targeting data to the AR application.
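
As a stand-in for the (unspecified) targeting filter, exponential smoothing of a landmark position illustrates the jitter-damping idea:

    # Illustrative only: smooth a skeletal landmark across frames so the
    # derived targeting data does not flicker between virtual objects.
    class ExpSmoother:
        def __init__(self, alpha=0.3):
            self.alpha, self.state = alpha, None

        def update(self, xyz):
            if self.state is None:
                self.state = xyz
            else:
                self.state = tuple(self.alpha * n + (1 - self.alpha) * s
                                   for n, s in zip(xyz, self.state))
            return self.state

    f = ExpSmoother()
    for raw in [(0.10, 0.20, 0.50), (0.14, 0.18, 0.52), (0.11, 0.21, 0.49)]:
        print(f.update(raw))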

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 20/64 - Three-dimensional objects

100.

PRESENTING CAPTURED SCREEN CONTENT WITHIN A VIRTUAL CONFERENCING SYSTEM

      
Application Number 17823688
Status Pending
Filing Date 2022-08-31
First Publication Date 2024-02-29
Owner Snap Inc. (USA)
Inventor Lin, Andrew Cheng-Min

Abstract

Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for presenting captured screen content within a virtual conferencing system. The program and method provide, in association with designing a room for virtual conferencing, a first interface for configuring at least one participant video element which is assignable to a respective participant video feed; receive, via the first interface, an indication of user input for setting first properties for the at least one participant video element; provide, in association with designing the room, a second interface for configuring presentation of screen content captured during virtual conferencing; receive, via the second interface, an indication of user input for setting second properties for the presentation of the screen content captured during virtual conferencing; and provide, in association with virtual conferencing, display of the room based on the first properties and the second properties.

IPC Classes  ?

  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range