Traditional head-mounted display experiences require a handheld controller (e.g. thumbstick, touchpad, gamepad, keyboard, etc.) to manipulate audio controls. The invention is a hands-free, body-based navigation technology that puts the participant's body in direct control of movement through sonic space. Participants lean towards a sound source to make it louder and lean away to make it quieter. In some embodiments, the more a participant leans, the faster the volume changes. Because the interactions were designed to respond to natural bearing and balancing instincts, movement coordination is intuitive and easy to learn.
G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
A63F 13/428 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
A63F 13/803 - Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
H04N 13/204 - Image signal generators using stereoscopic image cameras
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
G06F 3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
G06T 19/00 - Manipulating 3D models or images for computer graphics
G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
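The lean-based audio interaction described in the abstract above can be sketched as a simple proportional mapping. All function names, constants, and the clamping behaviour here are illustrative assumptions, not details from the patent text:

```python
import math

def lean_toward(head_offset, source_dir):
    """Project the head's lean offset onto the direction of a sound source.

    head_offset: (x, z) displacement of the head from its neutral position.
    source_dir:  unit (x, z) vector pointing at the sound source.
    Returns a signed lean amount: positive = leaning toward the source.
    """
    return head_offset[0] * source_dir[0] + head_offset[1] * source_dir[1]

def update_volume(volume, head_offset, source_dir, dt, rate=2.0):
    """Raise volume while leaning toward a source and lower it while leaning
    away. The rate of change scales with how far the participant leans,
    matching the 'more lean, faster change' behaviour in some embodiments."""
    lean = lean_toward(head_offset, source_dir)
    volume += rate * lean * dt
    return min(1.0, max(0.0, volume))  # clamp gain to [0, 1]

# Leaning 0.2 m toward a source for one 0.1 s frame nudges the volume up.
v = update_volume(0.5, (0.2, 0.0), (1.0, 0.0), dt=0.1)
```

A per-frame update like this gives continuous, balance-driven mixing: holding a lean steadily raises or lowers the level rather than snapping it.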
Locomotion-based motion sickness has long been a complaint amongst virtual reality gamers and drone pilots. Traditional head-mounted display experiences require a handheld controller (e.g. thumbstick, touchpad, gamepad, keyboard, etc.) for locomotion. Teleportation compromises immersive presence, and smooth navigation leads to sensory imbalances that can cause dizziness and nausea (even when using room-scale sensor systems). Designers have therefore had to choose between comfort and immersion. The invention is a hands-free, body-based navigation technology that puts the participant's body in direct control of movement through virtual space. Participants lean forward to advance in space; lean back to reverse; tip left or right to strafe/sidestep; and rotate to look around. In some embodiments, the more a participant leans, the faster they go. Because the interactions were designed to respond to natural bearing and balancing instincts, movement coordination is intuitive and vection-based cybersickness is reduced.
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
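The locomotion mapping in the abstract above (lean forward/back to move, tip sideways to strafe, rotate to look around, with speed proportional to lean) can be sketched roughly as follows. The dead zone, maximum lean angle, and speed gain are assumptions chosen for illustration:

```python
import math

DEAD_ZONE = math.radians(3)   # ignore small postural sway
MAX_LEAN = math.radians(30)   # lean angle that yields full speed
MAX_SPEED = 3.0               # metres per second at full lean

def lean_to_speed(angle):
    """Map a lean angle to a signed speed; more lean -> faster movement."""
    if abs(angle) < DEAD_ZONE:
        return 0.0
    scale = min(abs(angle) / MAX_LEAN, 1.0)
    return math.copysign(scale * MAX_SPEED, angle)

def step(position, yaw, pitch, roll, dt):
    """Advance the participant one frame: lean forward/back (pitch) to move,
    tip sideways (roll) to strafe, and rotate (yaw) to set the heading."""
    forward = lean_to_speed(pitch)
    strafe = lean_to_speed(roll)
    x, z = position
    x += (forward * math.sin(yaw) + strafe * math.cos(yaw)) * dt
    z += (forward * math.cos(yaw) - strafe * math.sin(yaw)) * dt
    return (x, z)

# A 15-degree forward lean moves the participant at half of MAX_SPEED.
pos = step((0.0, 0.0), yaw=0.0, pitch=math.radians(15), roll=0.0, dt=1.0)
```

Keeping the dead zone small but non-zero lets the participant stand at rest without drifting, while the proportional region preserves the "more lean, faster" behaviour.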
3.
Head-mounted display for navigating virtual and augmented reality
Locomotion-based motion sickness has long been a complaint amongst virtual reality gamers and drone pilots. Traditional head-mounted display experiences require a handheld controller (e.g. thumbstick, touchpad, gamepad, keyboard, etc.) for locomotion. Teleportation compromises immersive presence, and smooth navigation leads to sensory imbalances that can cause dizziness and nausea (even when using room-scale sensor systems). Designers have therefore had to choose between comfort and immersion. The invention is a hands-free, body-based navigation technology that puts the participant's body in direct control of movement through virtual space. Participants lean forward to advance in space; lean back to reverse; tip left or right to strafe/sidestep; and rotate to look around. In some embodiments, the more a participant leans, the faster they go. Because the interactions were designed to respond to natural bearing and balancing instincts, movement coordination is intuitive and vection-based cybersickness is reduced.
G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
A63F 13/428 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
A63F 13/803 - Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
H04N 13/204 - Image signal generators using stereoscopic image cameras
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
G06F 3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
G06T 19/00 - Manipulating 3D models or images for computer graphics
G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
A63F 13/92 - Video game devices specially adapted to be hand-held while playing
4.
Head-mounted display for navigating a virtual environment
In some embodiments, a head-mounted display apparatus with a visual display and one or more sensors may make navigation of virtual environments more natural. The invention enables a participant to pivot, tip and aim the apparatus to orient and move through virtual space hands-free.
G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
A63F 13/428 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
A63F 13/803 - Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
H04N 13/204 - Image signal generators using stereoscopic image cameras
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot
G06F 3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
G06T 19/00 - Manipulating 3D models or images for computer graphics
G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
A63F 13/92 - Video game devices specially adapted to be hand-held while playing
A63F 13/525 - Changing parameters of virtual cameras
5.
Remote controlled vehicle with augmented reality overlay
In some embodiments, extemporaneous control of remote objects can be made more natural using the invention, enabling a participant to pivot, tip and aim a head-mounted display apparatus to control a remote-controlled toy or full-sized vehicle, for example, hands-free. If the vehicle is outfitted with a camera, then the participant may see the remote location from a first-person proprioceptive perspective.
G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
A63H 30/04 - Electrical arrangements using wireless transmission
A63F 13/92 - Video game devices specially adapted to be hand-held while playing
A63F 13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
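The remote-vehicle control described above (pivot, tip and aim the display to steer) amounts to converting head orientation into drive commands. A minimal sketch, assuming pitch controls throttle and yaw controls steering; the function name, sign conventions, and gains are all hypothetical:

```python
import math

def hmd_to_vehicle_command(pitch, yaw,
                           max_throttle=1.0,
                           max_steer=math.radians(25)):
    """Convert HMD pitch/yaw (radians) into a (throttle, steering) pair
    suitable for sending to the vehicle over a wireless link.

    Tipping the display forward (negative pitch) drives forward; rotating
    the head sets a clamped steering angle.
    """
    throttle = max(-max_throttle,
                   min(max_throttle, -pitch / math.radians(20)))
    steering = max(-max_steer, min(max_steer, yaw))
    return throttle, steering

# Tipping the display 10 degrees forward yields half throttle, straight ahead.
cmd = hmd_to_vehicle_command(pitch=-math.radians(10), yaw=0.0)
```

Clamping both outputs keeps an exaggerated head movement from commanding more than the vehicle's physical limits, which matters more for a full-sized vehicle than for a toy.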
In some embodiments, extemporaneous control of remote objects can be made more natural using the invention, enabling a participant to pivot, tip and aim a handheld display device to control a remote-controlled toy or full-sized vehicle, for example. If the vehicle is outfitted with a camera, then the participant may see the remote location from a first-person proprioceptive perspective.
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/45 - Controlling the progress of the video game
A63F 13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
A63F 13/92 - Video game devices specially adapted to be hand-held while playing
7.
Head-mounted display for navigating a virtual environment
In some embodiments, a head-mounted apparatus with a visual display and one or more sensors may make navigation of virtual environments more natural. The invention enables a participant to pivot, tip and aim the apparatus to orient and move through virtual space hands-free.
G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
A63F 13/525 - Changing parameters of virtual cameras
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/428 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
8.
Remote controlled vehicle with a head-mounted display
In some embodiments, extemporaneous control of remote objects can be made more natural using the invention, enabling a participant to pivot, tip and aim a head-mounted display apparatus to control a remote-controlled toy or full-sized vehicle, for example, hands-free. If the vehicle is outfitted with a camera, then the participant may see the remote location from a first-person proprioceptive perspective.
G06T 19/00 - Manipulating 3D models or images for computer graphics
A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/803 - Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
In some embodiments, extemporaneous control of remote objects can be made more natural using the invention, enabling a participant to pivot, tip and aim a handheld display device to control a remote-controlled toy or full-sized vehicle, for example. If the vehicle is outfitted with a camera, then the participant may see the remote location from a first-person proprioceptive perspective.
G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
A63H 30/04 - Electrical arrangements using wireless transmission
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/45 - Controlling the progress of the video game
10.
Head-mounted display apparatus for navigating a virtual environment
In some embodiments, a head-mounted apparatus with a visual display and one or more sensors may make navigation of virtual environments more natural. The invention enables a participant to pivot, tip and aim the apparatus to orient and move through virtual space hands-free.
G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
G06F 3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
G06T 19/00 - Manipulating 3D models or images for computer graphics
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/45 - Controlling the progress of the video game
A63H 30/04 - Electrical arrangements using wireless transmission
11.
Remote controlled vehicle with a head-mounted display apparatus
In some embodiments, extemporaneous control of remote objects can be made more natural using the invention, enabling a participant to pivot, tip and aim a head-mounted display apparatus to control a remote-controlled toy or full-sized vehicle, for example, hands-free. If the vehicle is outfitted with a camera, then the participant may see the remote location from a first-person proprioceptive perspective.
Departing from one-way linear cinema played on a single rectangular screen, this multi-channel virtual environment involves a cinematic paradigm that undoes habitual ways of framing things, employing architectural concepts in a polylinear video/sound construction to create a type of experience that allows the world to reveal itself and permits discovery on the part of participants. Techniques are disclosed for peripatetic navigation through virtual space with a handheld computing device. The techniques leverage human spatial memory to form a proprioceptive sense of location; allow a participant to easily navigate amongst a plurality of simultaneously playing videos and to center in front of individual video panes in said space, making it comfortable for a participant to rest in a fixed posture and orientation while selectively viewing any one of the video streams; and provide spatialized 3D audio cues that invite awareness of other content unfolding simultaneously in the virtual environment.
G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
G06F 1/16 - Constructional details or arrangements
H04N 5/445 - Receiver circuitry for displaying additional information
G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
G06F 3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
G06T 19/00 - Manipulating 3D models or images for computer graphics
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
A63F 13/45 - Controlling the progress of the video game
A63H 30/04 - Electrical arrangements using wireless transmission
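The spatialized 3D audio cue described in the last abstract can be sketched as a per-pane gain that falls off with distance and with angular offset from the participant's facing direction, so that off-axis video streams remain audible as quiet positional cues. The function name and falloff constants below are illustrative assumptions, not details from the patent text:

```python
import math

def pane_gain(listener_pos, listener_yaw, pane_pos, ref_dist=1.0):
    """Return a 0..1 audio gain for one video pane's stream.

    listener_pos: (x, z) position of the participant in the virtual space.
    listener_yaw: facing direction in radians (0 = +z axis).
    pane_pos:     (x, z) position of the video pane.
    """
    dx = pane_pos[0] - listener_pos[0]
    dz = pane_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dz)
    # Inverse-distance falloff, clamped so a centred participant hears
    # the stream at full level.
    distance_gain = ref_dist / max(dist, ref_dist)
    # Panes behind or beside the participant are attenuated but never
    # silent, so other content stays perceptible as a spatial cue.
    angle = abs(math.atan2(dx, dz) - listener_yaw)
    angle = min(angle, 2 * math.pi - angle)  # wrap to [0, pi]
    facing_gain = 0.3 + 0.7 * (0.5 * (1 + math.cos(angle)))
    return distance_gain * facing_gain

# Standing 1 m in front of a pane and facing it gives full gain; the same
# pane directly behind the participant drops to the 0.3 floor.
front = pane_gain((0.0, 0.0), 0.0, (0.0, 1.0))
behind = pane_gain((0.0, 0.0), 0.0, (0.0, -1.0))
```

Keeping a non-zero floor on the facing term is what lets the environment invite awareness of simultaneously unfolding content rather than muting everything outside the current view.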