Meta View, Inc.

United States of America

1-10 of 10 results for Meta View, Inc. (Patents, United States - USPTO)

Date
  • 2019: 5
  • Before 2019: 5

IPC Class
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer: 9
  • G06T 19/00 - Manipulating 3D models or images for computer graphics: 8
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance: 6
  • G02B 27/01 - Head-up displays: 3
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups: 2

1.

Systems and methods for reducing processing load when simulating user interaction with virtual objects in an augmented reality space and/or evaluating user interaction with virtual objects in an augmented reality space

Application Number 15442534
Grant Number 10489931
Status In Force
Filing Date 2017-02-24
First Publication Date 2019-11-26
Grant Date 2019-11-26
Owner Meta View, Inc. (USA)
Inventor
  • Kinstner, Zachary R.
  • Lo, Raymond Chun Hing

Abstract

Systems and methods for reducing processing load when simulating user interaction with virtual objects in an interactive space and/or evaluating user interaction with virtual objects in an interactive space are described herein. Interactions may include interactions between one or more real-world objects and one or more virtual objects. A real-world object may be detected and/or modeled in a virtual world as a collection of point charges.
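The point-charge model mentioned in this abstract can be sketched as follows. This is an illustrative Python sketch assuming an inverse-square force law; the function names, the choice of force law, and the epsilon regularization are assumptions, not details taken from the patent.

```python
import math

def coulomb_like_force(charge_pos, charge_q, point, eps=1e-6):
    """Illustrative inverse-square repulsion exerted by one point
    charge on a query point (e.g., a vertex of a virtual object)."""
    dx = point[0] - charge_pos[0]
    dy = point[1] - charge_pos[1]
    dz = point[2] - charge_pos[2]
    r2 = dx * dx + dy * dy + dz * dz + eps  # eps avoids division by zero
    r = math.sqrt(r2)
    scale = charge_q / r2
    return (scale * dx / r, scale * dy / r, scale * dz / r)

def net_force(charges, point):
    """Sum the contributions of every point charge modeling the
    real-world object (e.g., a sampled hand surface)."""
    fx = fy = fz = 0.0
    for pos, q in charges:
        cx, cy, cz = coulomb_like_force(pos, q, point)
        fx += cx
        fy += cy
        fz += cz
    return (fx, fy, fz)
```

With this representation, evaluating the influence of a detected hand on a virtual object reduces to summing a small number of analytic force terms rather than running a full collision simulation.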

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

2.

Systems and methods to present virtual content in an interactive space

Application Number 15582419
Grant Number 10481755
Status In Force
Filing Date 2017-04-28
First Publication Date 2019-11-19
Grant Date 2019-11-19
Owner Meta View, Inc. (USA)
Inventor
  • Ngo, Kevin
  • Vigraham, Saranyan

Abstract

A system configured to present virtual content in an interactive space may comprise one or more of a light source, an optical element, one or more physical processors, non-transitory electronic storage, and/or other components. The light source may be configured to emit light. The optical element may be configured to provide the light emitted from the light source to an eye of the user. The non-transitory electronic storage may be configured to store virtual content information defining virtual content. The virtual content may include one or more of a virtual presentation area, one or more virtual tools, one or more virtual objects, and/or other virtual content. The virtual presentation area may be provided for generating and/or displaying presentations of virtual content. A presentation may include a set of scenes. An individual scene may include one or more virtual objects posed on the virtual presentation area.

IPC Classes

  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06T 15/50 - Lighting effects
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

3.

Systems and methods to provide views of virtual content in an interactive space

Application Number 15955243
Grant Number 10475246
Status In Force
Filing Date 2018-04-17
First Publication Date 2019-11-12
Grant Date 2019-11-12
Owner Meta View, Inc. (USA)
Inventor
  • Gribetz, Meron
  • Kawar, Yazan
  • Frank, Rebecca

Abstract

A system configured to provide views of virtual content in an interactive space may comprise one or more of a light source, an optical element, one or more physical processors, non-transitory electronic storage, and/or other components. The optical element may be configured to provide light emitted from the light source into one or more eyes of a user. The non-transitory electronic storage may be configured to store virtual content information defining virtual content. The virtual content may include one or more of a virtual gallery, one or more virtual objects, and/or other virtual content. The virtual gallery may comprise a set of supports. The virtual gallery may be configured to simulate removable engagement of individual virtual objects to individual supports.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

4.

System and method for managing interactive virtual frames for virtual objects in a virtual environment

Application Number 16253570
Grant Number 10699490
Status In Force
Filing Date 2019-01-22
First Publication Date 2019-10-24
Grant Date 2020-06-30
Owner Meta View, Inc. (USA)
Inventor
  • Kinstner, Zachary R.
  • Frank, Rebecca B.
  • Gribetz, Yishai

Abstract

The methods, systems, techniques, and components described herein allow interactions with virtual objects in a virtual environment, such as a Virtual Reality (VR) environment or Augmented Reality (AR) environment, to be modeled accurately. More particularly, the methods, systems, techniques, and components described herein allow interactive virtual frames to be created for virtual objects in a virtual environment. The virtual frames may be built using line primitives that form frame boundaries based on shape boundaries of virtual objects enclosed by the virtual frame. An area of interactivity defined by the virtual frame may allow users to interact with the virtual object in the virtual environment.
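A minimal Python sketch of building a rectangular interaction frame from line primitives around a shape boundary. The margin padding, the axis-aligned rectangle, and all function names are illustrative assumptions; the patent does not specify them here.

```python
def frame_lines(vertices, margin=0.05):
    """Illustrative construction of a rectangular frame around a
    virtual object's 2D shape boundary, returned as four line
    primitives (pairs of corner points)."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    lo = (min(xs) - margin, min(ys) - margin)
    hi = (max(xs) + margin, max(ys) + margin)
    corners = [lo, (hi[0], lo[1]), hi, (lo[0], hi[1])]
    # Connect consecutive corners into four boundary line segments.
    return [(corners[i], corners[(i + 1) % 4]) for i in range(4)]

def inside_frame(point, vertices, margin=0.05):
    """Hit test for the area of interactivity the frame defines."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    return (min(xs) - margin <= point[0] <= max(xs) + margin
            and min(ys) - margin <= point[1] <= max(ys) + margin)
```

The padded bounds give users a slightly larger target than the object's own silhouette, which is one way an "area of interactivity" around an enclosed object could work.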

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06T 17/10 - Volume description, e.g. cylinders, cubes or using CSG [Constructive Solid Geometry]

5.

System and method for providing views of virtual content in an augmented reality environment

Application Number 16034803
Grant Number 10521966
Status In Force
Filing Date 2018-07-13
First Publication Date 2019-06-13
Grant Date 2019-12-31
Owner Meta View, Inc. (USA)
Inventor
  • Gribetz, Meron
  • Frank, Rebecca B.

Abstract

A system configured for providing views of virtual content in an augmented reality environment may comprise one or more of a light source, an optical element, one or more physical processors, non-transitory electronic storage, and/or other components. The light source may be configured to emit light. The optical element may be configured to reflect light emitted from the light source into one or more eyes of a user. The non-transitory electronic storage may be configured to store virtual content information defining virtual content. The virtual content may include one or more of an annular dock, one or more virtual objects, and/or other virtual content. The annular dock may comprise a set of sockets. The annular dock may be configured to simulate removable engagement of individual virtual objects to individual sockets. The light source may be controlled to generate views of the annular dock to be perceived as surrounding the user.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

6.

System and method for facilitating virtual interactions with a three-dimensional virtual environment in response to sensor input into a control device having sensors

Application Number 15990321
Grant Number 10521028
Status In Force
Filing Date 2018-05-25
First Publication Date 2018-09-27
Grant Date 2019-12-31
Owner Meta View, Inc. (USA)
Inventor Gribetz, Yishai

Abstract

Various implementations provide for a three-dimensional trackpad in which sensors and a three-dimensional physical region may be used to interact with a three-dimensional virtual environment. The methods, systems, techniques, and components described herein may facilitate interactions with virtual objects in a three-dimensional virtual environment in response to sensor input into a control device having one or more sensors implemented thereon. The control device may be coupled to a display that may be configured to display the three-dimensional virtual environment. In various implementations, the sensor(s) capture physical movement of a user interaction element (a hand, a stylus, a physical object, etc.) within a specified three-dimensional physical region. The physical movement may be translated into a virtual interaction with the three-dimensional virtual environment. A virtual action in the three-dimensional virtual environment may be identified and displayed.
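The physical-to-virtual translation described in this abstract can be sketched as a clamped linear mapping from the specified three-dimensional physical region into virtual coordinates. The function name and the purely linear mapping are illustrative assumptions, not the patented method.

```python
def map_to_virtual(p, phys_min, phys_max, virt_min, virt_max):
    """Illustrative linear mapping of a sensed position p inside the
    specified 3D physical region into virtual-environment coordinates."""
    out = []
    for k in range(3):
        span = phys_max[k] - phys_min[k]
        t = (p[k] - phys_min[k]) / span if span else 0.0
        t = min(max(t, 0.0), 1.0)  # clamp to the sensed region
        out.append(virt_min[k] + t * (virt_max[k] - virt_min[k]))
    return tuple(out)
```

Capturing the movement of a hand or stylus then reduces to sampling its position over time and mapping each sample through this function before identifying the corresponding virtual action.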

IPC Classes

  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/0489 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G02B 27/01 - Head-up displays

7.

System and method for modifying virtual objects in a virtual environment in response to user interactions

Application Number 15996281
Grant Number 10438419
Status In Force
Filing Date 2018-06-01
First Publication Date 2018-09-27
Grant Date 2019-10-08
Owner Meta View, Inc. (USA)
Inventor
  • Kinstner, Zachary R.
  • Frank, Rebecca B.
  • Gribetz, Yishai

Abstract

The methods, systems, techniques, and components described herein allow interaction volumes of virtual objects in a virtual environment, such as a Virtual or Augmented Reality environment, to be modified based on user interactions taken on virtual frames created for those virtual objects. A user interaction element of a virtual frame may receive a user interaction. The user interaction may comprise one or more instructions to modify the size, shape, or other visual property of the virtual object. For example, the user interaction may comprise one or more instructions to change a size of the virtual object while maintaining a scale of the virtual object. In response to the user interaction, visual properties of the virtual frame and/or the virtual object may be modified. Interaction volumes of component elements of the virtual frame as well as interaction volumes of the virtual object may be modified in response to the user interaction.
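A minimal sketch, assuming the "change a size while maintaining a scale" interaction means uniform scaling about a pivot, with the interaction volume modified by the same factor so hit-testing stays consistent with the visual change. All names here are hypothetical.

```python
def scale_object(vertices, factor, pivot=(0.0, 0.0, 0.0)):
    """Illustrative uniform scaling: resize a virtual object about a
    pivot while preserving its proportions."""
    return [tuple(pivot[k] + factor * (v[k] - pivot[k]) for k in range(3))
            for v in vertices]

def scaled_interaction_volume(volume, factor):
    """Modify the interaction volume (an axis-aligned min/max box)
    by the same factor as the object it belongs to."""
    lo, hi = volume
    return (tuple(factor * c for c in lo), tuple(factor * c for c in hi))
```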

IPC Classes

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

8.

System and method for tracking a human hand in an augmented reality environment

Application Number 15962852
Grant Number 10698496
Status In Force
Filing Date 2018-04-25
First Publication Date 2018-08-23
Grant Date 2020-06-30
Owner Meta View, Inc. (USA)
Inventor Gribetz, Yishai

Abstract

A system configured for tracking a human hand in an augmented reality environment may comprise a distancing device, one or more physical processors, and/or other components. The distancing device may be configured to generate output signals conveying position information. The position information may include positions of surfaces of real-world objects, including surfaces of a human hand. Feature positions of one or more hand features of the hand may be determined through iterative operations of determining estimated feature positions of individual hand features from estimated feature positions of other ones of the hand features.
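The iterative feature-position estimation described here can be sketched as a constraint-relaxation loop in which each hand feature is re-estimated from the features connected to it. The bone-length constraint, the step size, and the function name are illustrative assumptions, not the patented procedure.

```python
import math

def relax_features(positions, bones, iters=50, step=0.5):
    """Illustrative fixed-point refinement: each hand feature is nudged
    so its distance to each connected feature approaches the expected
    bone length. `bones` maps an index pair (i, j) to a rest length."""
    pos = [list(p) for p in positions]
    for _ in range(iters):
        for (i, j), rest in bones.items():
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            dist = math.sqrt(sum(d * d for d in dx)) or 1e-9
            err = (dist - rest) / dist
            for k in range(3):
                # Move both endpoints symmetrically toward the rest length.
                corr = step * 0.5 * err * dx[k]
                pos[i][k] += corr
                pos[j][k] -= corr
    return pos
```

Each pass shrinks the constraint error geometrically, so a handful of iterations yields mutually consistent feature positions even when individual estimates start far apart.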

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/46 - Extraction of features or characteristics of the image
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06K 9/32 - Aligning or centering of the image pick-up or image-field

9.

System and method for providing views of virtual content in an augmented reality environment

Application Number 15263313
Grant Number 10026231
Status In Force
Filing Date 2016-09-12
First Publication Date 2018-07-17
Grant Date 2018-07-17
Owner META VIEW, INC. (USA)
Inventor
  • Gribetz, Meron
  • Frank, Rebecca B.

Abstract

A system configured for providing views of virtual content in an augmented reality environment may comprise one or more of a light source, an optical element, one or more physical processors, non-transitory electronic storage, and/or other components. The light source may be configured to emit light. The optical element may be configured to reflect light emitted from the light source into one or more eyes of a user. The non-transitory electronic storage may be configured to store virtual content information defining virtual content. The virtual content may include one or more of an annular dock, one or more virtual objects, and/or other virtual content. The annular dock may comprise a set of sockets. The annular dock may be configured to simulate removable engagement of individual virtual objects to individual sockets. The light source may be controlled to generate views of the annular dock to be perceived as surrounding the user.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus

10.

System and method for simulating user interaction with virtual objects in an interactive space

Application Number 15885260
Grant Number 10037629
Status In Force
Filing Date 2018-01-31
First Publication Date 2018-06-07
Grant Date 2018-07-31
Owner META VIEW, INC. (USA)
Inventor
  • Kinstner, Zachary R.
  • Lo, Raymond Chun Hing
  • Frank, Rebecca B.

Abstract

Systems and methods for simulating user interaction with virtual objects in an augmented reality environment are provided. Three-dimensional point cloud information from a three-dimensional volumetric imaging sensor may be obtained. An object position of a virtual object may be determined. Individual potential force vectors for potential forces exerted on the virtual object may be determined. An individual potential force vector may be defined by one or more of a magnitude, a direction, and/or other information. An aggregate scalar magnitude of the individual potential force vectors may be determined. An aggregate potential force vector may be determined by aggregating the magnitudes and directions of the individual potential force vectors. It may be determined whether the potential forces exerted on the virtual object are conflicting.
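The aggregation and conflict test described in this abstract can be sketched as follows. Comparing the magnitude of the vector sum against the sum of the individual magnitudes is one plausible reading of "conflicting" forces; the threshold value and function name are assumptions, not details from the patent.

```python
import math

def aggregate(forces, conflict_ratio=0.5):
    """Aggregate per-point potential force vectors and flag a conflict
    when opposing directions largely cancel each other out."""
    # Aggregate scalar magnitude: sum of individual magnitudes.
    scalar_sum = sum(math.sqrt(fx * fx + fy * fy + fz * fz)
                     for fx, fy, fz in forces)
    # Aggregate potential force vector: component-wise sum.
    vx = sum(f[0] for f in forces)
    vy = sum(f[1] for f in forces)
    vz = sum(f[2] for f in forces)
    vector_mag = math.sqrt(vx * vx + vy * vy + vz * vz)
    # If the vector sum is much smaller than the scalar sum, the
    # individual forces point in opposing directions.
    conflicting = scalar_sum > 0 and vector_mag / scalar_sum < conflict_ratio
    return (vx, vy, vz), conflicting
```

Two equal and opposite forces yield a near-zero vector sum but a large scalar sum, so the ratio test flags them as conflicting; aligned forces pass through unchanged.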

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 17/10 - Volume description, e.g. cylinders, cubes or using CSG [Constructive Solid Geometry]