Sony Interactive Entertainment Inc.

Japan


1-100 of 1,451 results for Sony Interactive Entertainment Inc.
Collection: Patent (World - WIPO)
Date
  • New (last 4 weeks): 24
  • 2024 April (MTD): 22
  • 2024 March: 11
  • 2024 February: 25
  • 2024 January: 20
IPC Class
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer: 257
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators: 118
  • G06T 19/00 - Manipulating 3D models or images for computer graphics: 107
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells: 70
  • A63F 13/86 - Watching games played by other players: 67

1.

INPUT DEVICE

      
Application Number JP2023035955
Publication Number 2024/084944
Status In Force
Filing Date 2023-10-02
Publication Date 2024-04-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Igarashi, Takeshi
  • Tadano, Katsuhisa
  • Okuyama, Isao

Abstract

Provided is an input device that enables a user to quickly recognize functions assigned to operation buttons. A controller (2000) has: a main body (2100); a plurality of second operation buttons (2120, 3200) that are attached to the main body and have an upper surface portion that can be pressed downward; and mark members (2400, 3400) that can be attached to and detached from the plurality of second operation buttons. Each mark member has a top portion (2401, 3401) on which a mark is indicated for the user to recognize the functions assigned to the plurality of second operation buttons.

IPC Classes

  • G06F 3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
  • G05G 1/01 - Arrangements of two or more controlling members with respect to one another
  • G05G 1/02 - Controlling members for hand-actuation by linear movement, e.g. push buttons

2.

INPUT DEVICE AND OPERATING BUTTON

      
Application Number JP2023035957
Publication Number 2024/084946
Status In Force
Filing Date 2023-10-02
Publication Date 2024-04-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Morimoto, So
  • Tadano, Katsuhisa

Abstract

Provided are an input device that facilitates the work of mounting an operating button, and an operating button. A controller (2000) comprises: a first magnetic structure which includes at least two magnetic poles that are provided in at least one of a button receiving portion (2301) and an operating button (2301) and are spaced apart in a first direction; and a second magnetic structure which includes a magnetic body (2152) or a magnet that is provided in the other of the button receiving portion and the operating button, and that is opposed to the at least two magnetic poles. The operating button comprises an engagement portion (2151) that is positioned in a second direction intersecting the first direction with respect to the at least two magnetic poles, and that is engaged with the button receiving portion.

IPC Classes

  • H01H 13/14 - Operating parts, e.g. push-button
  • A63F 13/24 - Constructional details thereof, e.g. game controllers with detachable joystick handles
  • G05G 1/01 - Arrangements of two or more controlling members with respect to one another
  • G05G 1/02 - Controlling members for hand-actuation by linear movement, e.g. push buttons

3.

INPUT DEVICE AND OPERATION BUTTON

      
Application Number JP2023035956
Publication Number 2024/084945
Status In Force
Filing Date 2023-10-02
Publication Date 2024-04-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Morimoto, So
  • Tadano, Katsuhisa

Abstract

Provided is a novel input device that enables comfortable operation for a user who has difficulty feeling comfortable when operating conventional input devices. A controller (2000) comprises: a body (2100); a first operation button (2110) attached to the body; and a second operation button (2120E) attached to the body at a position different from the position of the first operation button. The first operation button has an upper surface part serving as an operation part with which the user performs an operation. The second operation button has an upper surface part (2123E) serving as an operation part with which the user performs an operation, a portion of the upper surface part having an overhang (2132E) that covers a portion of the upper surface part of the first operation button.

IPC Classes

  • H01H 13/14 - Operating parts, e.g. push-button
  • A63F 13/24 - Constructional details thereof, e.g. game controllers with detachable joystick handles
  • G05G 1/01 - Arrangements of two or more controlling members with respect to one another
  • G05G 1/02 - Controlling members for hand-actuation by linear movement, e.g. push buttons
  • H01H 13/705 - Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch having a plurality of operating members associated with different sets of contacts, e.g. keyboard with contacts carried by or formed from layers in a multilayer structure, e.g. membrane switches characterised by construction, mounting or arrangement of operating parts, e.g. push-buttons or keys
  • H01H 21/00 - Switches operated by an operating part in the form of a pivotable member acted upon directly by a solid body, e.g. by a hand

4.

METERING BITS IN VIDEO ENCODER TO ACHIEVE TARGETED DATA RATE AND VIDEO QUALITY LEVEL

      
Application Number US2023076743
Publication Number 2024/086483
Status In Force
Filing Date 2023-10-12
Publication Date 2024-04-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Mehta, Milan

Abstract

For stability of a bit rate for groups of pictures (GOPs), a rate buffer bit controller feedback loop and a proportional integral derivative (PID) bit controller feedback loop (700) may be used to maintain at least one video buffer.
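The bit-metering feedback loop described above can be illustrated with a minimal proportional-integral-derivative (PID) sketch that nudges a quantization parameter toward a buffer-fullness target. The class name, gains, and buffer model below are assumptions for illustration, not the patented implementation.

```python
# Minimal PID sketch: steer per-frame bit spend toward a target buffer
# fullness. Gains, names, and the buffer model are illustrative assumptions.

class PidBitController:
    def __init__(self, kp=0.5, ki=0.05, kd=0.1, target_fullness=0.5):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.target = target_fullness
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, buffer_fullness):
        """Return a quantization-parameter delta from the buffer error."""
        error = buffer_fullness - self.target    # positive: buffer too full
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        # Positive output -> raise QP (spend fewer bits) to drain the buffer.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = PidBitController()
print(controller.step(0.8) > 0)   # over-full buffer pushes QP up: True
```

The integral term is what gives the loop its long-run stability across a group of pictures: transient frame-size spikes average out while sustained drift from the target is corrected.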

IPC Classes

  • H04N 19/172 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
  • H04N 19/124 - Quantisation
  • H04N 19/174 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a slice, e.g. a line of blocks or a group of blocks
  • H04N 19/147 - Data rate or code amount at the encoder output according to rate distortion criteria
  • H04N 19/152 - Data rate or code amount at the encoder output by measuring the fullness of the transmission buffer

5.

HAPTIC ASSET GENERATION FOR ECCENTRIC ROTATING MASS (ERM) FROM LOW FREQUENCY AUDIO CONTENT

      
Application Number US2023076369
Publication Number 2024/081588
Status In Force
Filing Date 2023-10-09
Publication Date 2024-04-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Pontiga, Christopher M.
  • Kwun, Arthur

Abstract

Computer game developers can implicitly create haptic assets from audio assets. A low pass filter passes (302) only audio assets with frequencies less than a threshold to a mapping module. The audio assets are then mapped (304) to haptic assets that can be output (306) by an ERM (208/700) of a computer game controller (206). The haptic output can be in synchronization with play of the audio assets on speakers.
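The pipeline the abstract describes (low-pass filter the audio, then map the result to motor output) can be sketched as follows. The single-pole filter, the 100 Hz cutoff, and the RMS-to-intensity mapping are illustrative assumptions, not the patented design.

```python
import math

# Sketch of an audio-to-ERM pipeline: low-pass the audio, then map the
# surviving low-frequency energy to a 0..1 motor drive level. Filter type,
# cutoff, and gain are illustrative assumptions.

def low_pass(samples, cutoff_hz=100.0, sample_rate=48000.0):
    """Single-pole IIR low-pass: attenuates content above roughly cutoff_hz."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def to_erm_intensity(samples):
    """Map filtered energy (RMS) to a 0..1 ERM drive level."""
    rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    return min(1.0, rms * 4.0)   # illustrative gain, clamped to full drive

# A 30 Hz rumble survives the filter; an 8 kHz tone is attenuated away.
n, sr = 4800, 48000.0
rumble = [math.sin(2 * math.pi * 30 * t / sr) for t in range(n)]
hiss = [math.sin(2 * math.pi * 8000 * t / sr) for t in range(n)]
print(to_erm_intensity(low_pass(rumble)) > to_erm_intensity(low_pass(hiss)))  # True
```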

IPC Classes

  • G05G 9/047 - Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • A63F 13/20 - Input arrangements for video game devices
  • A63F 13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

6.

ALTERING AUDIO AND/OR PROVIDING NON-AUDIO CUES ACCORDING TO LISTENER'S AUDIO DEPTH PERCEPTION

      
Application Number US2023076305
Publication Number 2024/081567
Status In Force
Filing Date 2023-10-07
Publication Date 2024-04-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Dorn, Victoria

Abstract

The 3D audio perception of a listener such as a computer gamer is tested "stereoscopically" and the results input to a source of audio such as a computer game. Audio (802) from the source of audio (such as a head-mounted display of a computer game system or speaker outputting audio from a game console) may be altered (810) to account for the listener's measured 3D audio acuity. In addition, or alternatively, visual or haptic cues may be provided (814) to alert the listener of 3D audio events.

IPC Classes

  • G06F 16/60 - Information retrieval; Database structures therefor; File system structures therefor of audio data
  • G10L 13/02 - Methods for producing synthetic speech; Speech synthesisers

7.

GROUP CONTROL OF COMPUTER GAME USING AGGREGATED AREA OF GAZE

      
Application Number US2023076306
Publication Number 2024/081568
Status In Force
Filing Date 2023-10-07
Publication Date 2024-04-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Karp, Sarah

Abstract

Groups of people control a computer game using teamwork. This can be done by eye tracking (400) of each person to detect where each person is looking on screen at objects such as game control objects. The control action of the object looked at by the most people (404) in a "heat map" style of data collection is implemented (408) by the game.
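The majority-gaze aggregation described above can be sketched in a few lines; the object identifiers and helper name are hypothetical.

```python
from collections import Counter

# Sketch of the "heat map" aggregation in the abstract: each tracked player
# contributes the on-screen object they are gazing at, and the control action
# of the most-looked-at object is the one the game implements.

def aggregate_gaze(gaze_targets):
    """gaze_targets: one object id per player; returns (winner, vote count)."""
    return Counter(gaze_targets).most_common(1)[0]

# Five players; three are looking at the "jump" control object.
winner, votes = aggregate_gaze(["jump", "fire", "jump", "jump", "fire"])
print(winner, votes)   # jump 3
```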

IPC Classes

  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

8.

CUSTOMIZABLE VIRTUAL REALITY SCENES USING EYE TRACKING

      
Application Number US2023076394
Publication Number 2024/081598
Status In Force
Filing Date 2023-10-09
Publication Date 2024-04-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Karimi, Sepideh

Abstract

Eye tracking (1100) of the wearer of a virtual reality headset is used to customize/personalize (1102) VR video. Based on eye tracking, the VR scene may present different types of trees (302, 304, 306) for different gaze directions. As another example, a VR scene can be augmented with additional objects (502) when the gaze is directed at a particular related object. A friend's gaze-dependent personalization may be imported (1104) into the wearer's system to increase companionship and user engagement. Customized options can be recorded and sold to other players.

IPC Classes

  • A63F 13/525 - Changing parameters of virtual cameras
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

9.

REAL WORLD SIMULATION FOR META-VERSE

      
Application Number US2023076553
Publication Number 2024/081704
Status In Force
Filing Date 2023-10-11
Publication Date 2024-04-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Rao, Akshay, V.

Abstract

Methods and systems for reconstructing a game world of a video game include tracking the status of game objects in the game world to detect wear on one or more game objects exceeding a predefined threshold. An option to rebuild the one or more game objects is provided to a user, and tools to rebuild the one or more game objects are provided in response to the user selecting the option. The rebuilt game objects are used during subsequent gameplay of the video game.

IPC Classes

  • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor

10.

HAPTIC FINGERPRINT OF USER'S VOICE

      
Application Number US2023076578
Publication Number 2024/081720
Status In Force
Filing Date 2023-10-11
Publication Date 2024-04-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Dorn, Victoria

Abstract

To enhance the sensory experience of voice, in some cases at a later time than the speech was spoken (300) to enable reliving emotions and experiences, vocal sounds captured by a microphone are processed (304) by a computer game controller API. The API plays back (306) the vocal sounds at a later time in haptic format on the controller. The vocal sounds may be computer game dialogue, party chat, or vocal sounds of the user as demanded by the computer game.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound input; Sound output
  • G10L 25/48 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G10L 25/00 - Speech or voice analysis techniques not restricted to a single one of groups

11.

METHOD AND SYSTEM FOR AUTO-PLAYING PORTIONS OF A VIDEO GAME

      
Application Number US2023075615
Publication Number 2024/076882
Status In Force
Filing Date 2023-09-29
Publication Date 2024-04-11
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Rudi, Olga

Abstract

A method for providing an auto-play mode option to a user during gameplay of a video game includes accessing, by a server, a user play model, which incorporates extracted features related to gameplay by the user and classification of the extracted features. The accessing of the model is triggered at a current time during gameplay. The method also includes identifying, by the server, predicted interactive activity that is predicted to occur ahead of the current time of gameplay. The method further includes identifying, by the server, at least part of the predicted interactive activity to be anticipated grinding content (AGC). The method also includes providing a notification, by the server, to a display screen of a user device, where the notification identifies the AGC in upcoming gameplay and provides the user with an option to use the auto-play mode during gameplay of the AGC.

IPC Classes

  • A63F 13/5375 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers
  • A63F 13/497 - Partially or entirely replaying previous game actions

12.

SIGNAL PROCESSING CIRCUIT, SIGNAL PROCESSING METHOD, AND PROGRAM

      
Application Number JP2022037227
Publication Number 2024/075198
Status In Force
Filing Date 2022-10-05
Publication Date 2024-04-11
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Mizuno Masayoshi
  • Arai Kiyotsugu

Abstract

Provided is a signal processing circuit for processing event signals generated by an event-based vision sensor (EVS), the signal processing circuit comprising a memory for storing program code and a processor for executing operations according to the program code, wherein the operations include: detecting at least one line segment or curve formed by a set of in-block positions of event signals generated in blocks into which an EVS detection area is divided; and correcting at least one of a first line segment or a first curve detected in the first block, or a second line segment or a second curve detected in a second block adjacent to the first block, so that a first end point of the first line segment or the first curve overlaps a second end point of the second line segment or the second curve.
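The endpoint-correction step can be illustrated with a small geometric sketch: the two nearest endpoints of segments detected in adjacent blocks are snapped to their midpoint so that they overlap. The helper name and the midpoint rule are assumptions for illustration, not the patented correction.

```python
# Sketch of the endpoint correction from the abstract: when a segment detected
# in one block should join a segment detected in the adjacent block, move the
# closest pair of endpoints to their shared midpoint. Purely illustrative.

def snap_endpoints(seg_a, seg_b):
    """Each segment is ((x0, y0), (x1, y1)). Returns corrected segments."""
    def d2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    # Find the pair of endpoints (one from each segment) that are closest.
    pairs = [(i, j) for i in range(2) for j in range(2)]
    i, j = min(pairs, key=lambda ij: d2(seg_a[ij[0]], seg_b[ij[1]]))
    mid = tuple((seg_a[i][k] + seg_b[j][k]) / 2 for k in range(2))
    a, b = list(seg_a), list(seg_b)
    a[i], b[j] = mid, mid
    return tuple(a), tuple(b)

# Segments from two adjacent blocks that almost meet near x = 8.
a, b = snap_endpoints(((0, 0), (7.9, 4.0)), ((8.1, 4.2), (16, 8)))
print(a[1] == b[0])   # True: the two segments now share an endpoint
```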

IPC Classes

  • H04N 23/60 - Control of cameras or camera modules
  • G06T 1/60 - Memory management
  • H04N 23/54 - Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
  • H04N 25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof

13.

TEXT MESSAGE OR APP FALLBACK DURING NETWORK FAILURE IN A VIDEO GAME

      
Application Number US2023074008
Publication Number 2024/076819
Status In Force
Filing Date 2023-09-12
Publication Date 2024-04-11
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Lin, Frank
  • Coles, David
  • Sundaram, Chockalingam, Ravi
  • Madhavan, Coimbatore, Ravi

Abstract

A method for managing gameplay of a video game is provided, including: executing a session of a video game by a cloud gaming resource; streaming video generated by the session over a network to a client device associated with a player of the video game, to enable gameplay of the session by the player; detecting a loss of network connectivity between the client device and the session; and, responsive to detecting the loss of network connectivity, initiating transmission of updates regarding the session, via an alternative communication channel, to a secondary device associated with the player.

IPC Classes

  • A63F 13/358 - Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

14.

SYSTEMS AND METHODS FOR APPLYING A MODIFICATION MICROSERVICE TO A GAME INSTANCE

      
Application Number US2023075608
Publication Number 2024/076881
Status In Force
Filing Date 2023-09-29
Publication Date 2024-04-11
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Colenbrander, Roelof, Roderick

Abstract

A method for implementing a modification microservice with a game cloud system is described. The method includes executing a game instance of a game. The game instance is executed using a plurality of microservices assembled for the game instance. The method further includes accessing a modification microservice engineered to be executed with the game instance. The modification microservice adds a compute capability to the game instance. The modification microservice is executed outside of a server system in which the plurality of microservices is assembled for the game instance. Also, the modification microservice is accessed by one or more application programming interface (API) calls that obtain results data from said execution of the modification microservice. The one or more API calls are managed via a modification interface that manages the access to the modification microservice and use of the results data by the game instance.

IPC Classes

  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
  • A63F 13/335 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/73 - Authorising game programs or game devices, e.g. checking authenticity

15.

SYSTEMS AND METHODS FOR INTEGRATING REAL-WORLD CONTENT IN A GAME

      
Application Number US2023075725
Publication Number 2024/076919
Status In Force
Filing Date 2023-10-02
Publication Date 2024-04-11
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Rao, Sharath

Abstract

A method for integration of real-world content into a game is described. The method includes receiving a request to play the game and accessing overlay multimodal data generated from a portion of real-world multimodal data received as user generated content (RGC). The overlay multimodal data relates to authored multimodal data generated for the game. The method includes replacing the authored multimodal data in one or more scenes of the game with the overlay multimodal data.

IPC Classes

  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition

16.

RAPID GENERATION OF 3D HEADS WITH NATURAL LANGUAGE

      
Application Number US2023074151
Publication Number 2024/073247
Status In Force
Filing Date 2023-09-14
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Aquino, Mager, Kamel
  • Raymond, Jade

Abstract

Two dimensional images are converted (302) to a 3D neural radiance field (NeRF), which is modified (402) based on text input to resemble the type of character demanded by the text. An open-source "CLIP" model scores (404) how well an image matches a line of text to produce a final 3D NeRF, which may be converted (408) to a polygonal mesh and imported into a computer simulation such as a computer game.

IPC Classes

  • G06N 3/006 - Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
  • G06N 3/042 - Knowledge-based neural networks; Logical representations of neural networks
  • G06N 5/02 - Knowledge representation; Symbolic representation
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06N 20/00 - Machine learning
  • G06F 40/40 - Processing or translation of natural language

17.

SYSTEMS AND METHODS FOR MANAGING VIRTUAL WORLD IN CLOUD-BASED GAMING

      
Application Number US2023033162
Publication Number 2024/072663
Status In Force
Filing Date 2023-09-19
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Azmandian, Mahdi

Abstract

A cloud-based gaming system generates first and second instances of a virtual world of an online game for first and second players, respectively. First and second video streams of the first and second instances of the virtual world, respectively, are transmitted to the first and second players, respectively. The second video stream includes a ghosted version of a feature within the first instance of the virtual world. A request is received from the second player to merge the first and second instances of the virtual world. With the first player's approval, a merged instance of the virtual world is automatically generated by the cloud-gaming system as a combination of the first and second instances of the virtual world. Third and fourth video streams of the merged instance of the virtual world are transmitted to the first and second players, respectively, in lieu of the first and second video streams, respectively.

IPC Classes

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/47 - Controlling the progress of the video game involving branching, e.g. choosing one of several possible scenarios at a given point in time
  • A63F 13/493 - Resuming a game, e.g. after pausing, malfunction or power failure
  • A63F 13/69 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions

18.

USING VECTOR GRAPHICS TO CREATE 3D CONTENT

      
Application Number US2023072967
Publication Number 2024/073203
Status In Force
Filing Date 2023-08-27
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Krishnamurthy, Sudha

Abstract

Deep learning techniques such as vector graphics (300) are used to create 3D content and assets for metaverse applications. Vector graphics is a scalable format that provides rich 3D content. A vector graphics encoder (302) such as a deep neural network such as a recurrent neural network (RNN) or transformer receives (400) vector graphics and generates (402) an encoded output. The encoded output is decoded (404) by a 3D decoder such as another deep neural network that outputs 2D graphics for comparison with the original image. Loss is computed (408) between the original and the output of the 3D decoder. The loss is back propagated (410) to train the vector graphics encoder to generate 3D content.


19.

CUSTOMIZED DIGITAL HUMANS AND PETS FOR META VERSE

      
Application Number US2023073253
Publication Number 2024/073215
Status In Force
Filing Date 2023-08-31
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Krishnamurthy, Sudha

Abstract

Deep learning is used to dynamically adapt virtual humans (300) in metaverse applications. The adaptation can be according to user preferences (400). In addition or alternatively, virtual humans and pets (302) can be adapted for metaverse applications based on demographics (408) of the user. The user's personal demographics may be used to establish (410) the costume, skin color, emotion, voice, and behavior of the virtual humans. Similar considerations may be used to adapt virtual pets to the user's experience of the metaverse.

IPC Classes

  • A63F 13/655 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G06N 20/00 - Machine learning
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

20.

OUTPUTTING BRAILLE OR SUBTITLES USING COMPUTER GAME CONTROLLER

      
Application Number US2023074147
Publication Number 2024/073246
Status In Force
Filing Date 2023-09-14
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Rudi, Olga

Abstract

To help a computer game player in understanding a computer game, upon pausing (300, 500) the game, visual subtitles may be presented (304). In addition, or alternatively, Braille representing subtitles may be output (506) as a series of vibrations on a touch pad of the controller. When the person's finger reaches the edge of the touch pad, a new series of Braille subtitles may be presented (510). Depending on where the player is in reading the subtitles and how fast the player reads them, the game video may be slowed down (310) from normal speed.

IPC Classes

  • G09B 21/00 - Teaching, or communicating with, the blind, deaf or mute
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

21.

GAME ASSET OPTIMIZATION OVER NETWORK AT OPTIMIZER SERVER

      
Application Number US2023074514
Publication Number 2024/073269
Status In Force
Filing Date 2023-09-18
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Starich, Michael

Abstract

A method includes receiving, from a device over a network at an optimizer server, a plurality of game assets of a video game. The method includes generating at least one combined game asset to represent the plurality of game assets. The method includes sending the at least one combined game asset to the device for use in the video game.

IPC Classes  ?

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • G06T 17/00 - 3D modelling for computer graphics
  • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

22.

SIGNAL PROCESSING CIRCUIT, SIGNAL PROCESSING METHOD, AND PROGRAM

      
Application Number JP2022036235
Publication Number 2024/069805
Status In Force
Filing Date 2022-09-28
Publication Date 2024-04-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Mizuno Masayoshi
  • Arai Kiyotsugu

Abstract

Provided is a signal processing circuit that processes event signals generated by an event-based vision sensor (EVS). The circuit comprises a memory that stores program code and a processor that executes operations according to the program code. The EVS detection region is divided into blocks, and for the event signals generated in each block the operations detect a relationship between positions within the block: a first method is used if the ratio of the eigenvalues of the variance-covariance matrix of the positions exceeds a threshold value, and a second method, different from the first, is used if the ratio does not exceed the threshold value.
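The eigenvalue-ratio test above can be sketched in a few lines, assuming 2D event positions within a block. The method names and the threshold value are placeholders, not details from the publication:

```python
import math

def eig_ratio_2x2(points):
    """Eigenvalue ratio (largest/smallest) of the 2x2 variance-covariance
    matrix of (x, y) event positions."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    lo, hi = (tr - disc) / 2, (tr + disc) / 2
    return hi / max(lo, 1e-12)

def choose_method(points, threshold=5.0):
    """Pick the detection method per the abstract: the first method when
    the eigenvalue ratio exceeds the threshold, otherwise the second."""
    return "first_method" if eig_ratio_2x2(points) > threshold else "second_method"
```

A large ratio means the event positions spread along one direction (e.g., an edge moving through the block), which is when the first method applies.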

IPC Classes  ?

  • H04N 23/60 - Control of cameras or camera modules

23.

SYSTEMS AND METHODS FOR MODIFYING USER SENTIMENT FOR PLAYING A GAME

      
Application Number US2023073680
Publication Number 2024/064529
Status In Force
Filing Date 2023-09-07
Publication Date 2024-03-28
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Palacios, Jorge Arroyo

Abstract

A method for modifying user sentiment is described. The method includes analyzing behavior of a group of players during a play of a game. The behavior of the group of players is indicative of a sentiment of the group of players during the play of the game. The method includes accessing a non-player character (NPC) during the play of the game. The NPC has a characteristic that influences a change in the sentiment of the group of players. The method includes placing the NPC into one or more scenes of the game during the play of the game for a period of time until the change in the sentiment of the group of players is determined. The change in the sentiment of the group of players is determined by analyzing the behavior of the group of players during said play of the game.

IPC Classes  ?

  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

24.

AI PLAYER MODEL GAMEPLAY TRAINING AND HIGHLIGHT REVIEW

      
Application Number US2023074451
Publication Number 2024/064614
Status In Force
Filing Date 2023-09-18
Publication Date 2024-03-28
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Arimatsu, Kazuyuki
  • Kaushik, Lakshmish

Abstract

Methods and systems for engaging an AI player to play a video game on behalf of a user include creating the AI player using at least some of the attributes of the user, training the AI player using inputs provided by the user during game play of the video game, and providing the AI player access to the video game for game play. The access allows the AI player to provide inputs to the video game that substantially mimic the play style of the user. Control of the game play of the video game can be transitioned to the user at any time during the AI player's game play. The user can also control the game play of the AI player from a video recording of the game play.

IPC Classes  ?

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/493 - Resuming a game, e.g. after pausing, malfunction or power failure
  • A63F 13/497 - Partially or entirely replaying previous game actions
  • A63F 13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/86 - Watching games played by other players

25.

MULTI-ORDER OPTIMIZED AMBISONICS ENCODING

      
Application Number US2023073250
Publication Number 2024/059438
Status In Force
Filing Date 2023-08-31
Publication Date 2024-03-21
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Sangston, Brandon

Abstract

A technique for encoding Ambisonics audio includes inputting audio to multiple Ambisonics encoders producing respective Ambisonics soundfields. Prior to mixing the soundfields, each soundfield is weighted to mitigate artifacts from order-truncation. After weighting, the soundfields are mixed to produce Ambisonics audio. Accordingly, an apparatus includes at least one processor configured with instructions which are executable to receive mono audio sources with direction and target Ambisonics order respectively and send respective mono audio with respective direction to a respective Ambisonics encoder to cause the encoder to output a respective soundfield of respective Ambisonics order.
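The encode-weight-mix pipeline described above can be sketched as follows, using 2D (circular-harmonic) Ambisonics for brevity. The channel layout, the per-order weights, and the function names are illustrative assumptions rather than details from the filing:

```python
import math

def encode_2d_ambisonics(sample, azimuth, order):
    """Encode one mono sample into 2D (circular-harmonic) Ambisonics
    channels up to `order`: [W, X1, Y1, X2, Y2, ...]."""
    ch = [sample]  # order 0 (W)
    for n in range(1, order + 1):
        ch.append(sample * math.cos(n * azimuth))
        ch.append(sample * math.sin(n * azimuth))
    return ch

def mix_weighted(soundfields, weights, max_order):
    """Weight each soundfield per order before mixing, to soften
    order-truncation artifacts. `weights[n]` scales all channels of
    order n; the weighting scheme itself is a placeholder."""
    out = [0.0] * (2 * max_order + 1)
    for sf in soundfields:
        for i, v in enumerate(sf):
            n = (i + 1) // 2  # channel index -> Ambisonics order
            out[i] += v * weights[n]
    return out
```

Applying the per-order weights before the mix, as the abstract describes, lets each source's soundfield be shaped according to its target order.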

IPC Classes  ?

  • G10L 19/008 - Multichannel audio signal coding or decoding using interchannel correlation to reduce redundancy, e.g. joint-stereo, intensity-coding or matrixing
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic
  • H04N 21/2368 - Multiplexing of audio and video streams

26.

OPERATION DEVICE

      
Application Number JP2022034561
Publication Number 2024/057481
Status In Force
Filing Date 2022-09-15
Publication Date 2024-03-21
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Mori, Hideki
  • Koda, Yuta
  • Takahashi, Yoshihisa
  • Shiono, Koichi

Abstract

Provided is an operation device (10) capable of expressing a pseudo weight. This operation device (10) includes: a plurality of link shafts (SF); a plurality of node mechanism units (ND) that form a grid with the plurality of link shafts (SF), each of the node mechanism units (ND) respectively holding one end of two or more link shafts (SF) among the plurality of link shafts (SF) in a manner that allows changing of the orientations of the two or more link shafts (SF); a placement table (90) on which the plurality of node mechanism units (ND) are placed; and a pulling mechanism (80) that pulls the node mechanism units (ND) in a direction for returning to a predetermined reference position on the placement table (90).

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

27.

INTERACTIVE AUGMENTED REALITY TROPHY SYSTEM

      
Application Number US2023028614
Publication Number 2024/058859
Status In Force
Filing Date 2023-07-25
Publication Date 2024-03-21
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Sutton, Ryan

Abstract

Interactive display of virtual trophies includes scanning a surface for one or more location anchor points. A trophy rack location is determined using the location anchor points. A trophy rack mesh is applied over an image frame of the surface using the determined trophy rack location. One or more trophy models are displayed over the trophy rack mesh with a display device. Trophy rack layout information is generated from the one or more trophy models and the trophy rack mesh, and finally the trophy rack layout information is stored or transmitted.

IPC Classes  ?

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 15/20 - Perspective computation
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/04 - Texture mapping
  • G06T 15/08 - Volume rendering

28.

MULTI-ORDER OPTIMIZED AMBISONICS DECODING

      
Application Number US2023073252
Publication Number 2024/059439
Status In Force
Filing Date 2023-08-31
Publication Date 2024-03-21
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Sangston, Brandon

Abstract

Ambisonics audio such as may be used for computer simulations such as computer games is improved by using multi-order optimizations that frame an optimization problem that minimizes a cost function (602) across a subset of Ambisonics orders for a chosen Ambisonics order "N". In a simple form, this cost function minimizes error across all orders (0 <= n <= N), and additional weighting (604) is applied to emphasize or de-emphasize particular orders. The cost functions and optimization criteria may be different for binaural and speaker outputs.
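The optimization described above can be written compactly. A minimal formalization, with symbols chosen here for illustration ($D$ a decoder matrix, $E_n$ the order-$n$ encoding components, $T_n$ the corresponding binaural or speaker targets, $w_n$ the per-order weights) rather than taken from the filing:

```latex
J(D) \;=\; \sum_{n=0}^{N} w_n \,\bigl\lVert D\,E_n - T_n \bigr\rVert^{2},
\qquad
D^{\star} \;=\; \arg\min_{D} J(D)
```

Choosing different targets $T_n$ and weights $w_n$ for binaural versus speaker output yields the different cost functions and optimization criteria the abstract mentions.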

IPC Classes  ?

  • G10L 19/00 - Speech or audio signal analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
  • G06F 16/60 - Information retrieval; Database structures therefor; File system structures therefor of audio data

29.

OPERATION DEVICE

      
Application Number JP2022033877
Publication Number 2024/053087
Status In Force
Filing Date 2022-09-09
Publication Date 2024-03-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Mori, Hideki
  • Matsugami, Hiroya
  • Nishidate, Masaomi
  • Takahashi, Yoshihisa
  • Shiono, Koichi
  • Kaneko, Hirofumi

Abstract

Provided is an operation device (10) capable of expressing haptic perception. The operation device (10) comprises: a plurality of link shafts (SF); a plurality of node mechanism parts (ND) forming a lattice shape with the plurality of link shafts (SF), each of the node mechanism parts (ND) holding one end of at least two or more link shafts (SF) among the plurality of link shafts (SF) so that it is possible to change the orientation of the two or more link shafts (SF); and a vibration unit that vibrates the operation device (10) according to the state of at least one of the plurality of node mechanism parts (ND).

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

30.

TRACKING HISTORICAL GAME PLAY OF ADULTS TO DETERMINE GAME PLAY ACTIVITY AND COMPARE TO ACTIVITY BY A CHILD, TO IDENTIFY AND PREVENT CHILD FROM PLAYING ON AN ADULT ACCOUNT

      
Application Number US2023073592
Publication Number 2024/054877
Status In Force
Filing Date 2023-09-06
Publication Date 2024-03-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Osman, Steven

Abstract

Methods and systems for warning of misuse of a user account of an adult user include tracking use of the user account. Interactions at the user account are monitored, and when the content accessed is adult content and the user is determined to be a child, an alert is provided informing the adult user that the child is accessing age-inappropriate content.

IPC Classes  ?

  • G06F 21/31 - User authentication
  • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules

31.

AI STREAMER WITH FEEDBACK TO AI STREAMER BASED ON SPECTATORS

      
Application Number US2023072339
Publication Number 2024/050236
Status In Force
Filing Date 2023-08-16
Publication Date 2024-03-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Arimatsu, Kazuyuki

Abstract

A method is provided, including: executing a session of a video game; executing an artificial intelligence (AI) player that performs gameplay in the session of the video game; streaming video of the AI player's gameplay over a network to one or more spectator devices for viewing by one or more spectators respectively associated to the one or more spectator devices; receiving, over the network from the one or more spectator devices, feedback data indicating reactions of the one or more spectators to the video of the AI player's gameplay; and adjusting the gameplay by the AI player based on the feedback data.

IPC Classes  ?

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/86 - Watching games played by other players
  • A63F 13/87 - Communicating with other players during game play, e.g. by e-mail or chat

32.

CONTROLLER BUTTON LABELING

      
Application Number US2023071883
Publication Number 2024/050214
Status In Force
Filing Date 2023-08-08
Publication Date 2024-03-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Black, Glenn

Abstract

An accessibility computer game controller includes a central control button (402) on a round base (400) and peripheral control buttons (404) on the base surrounding the central control button. The peripheral control buttons can have distinct sizes and shapes. Removable button labels (1112) can be applied on top of or underneath the buttons to aid in button identification.

IPC Classes  ?

  • A63F 13/24 - Constructional details thereof, e.g. game controllers with detachable joystick handles
  • A63F 13/245 - Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
  • A63F 13/90 - Constructional details or arrangements of video game devices not provided for in groups  or , e.g. housing, wiring, connections or cabinets
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/20 - Input arrangements for video game devices
  • A63F 13/92 - Video game devices specially adapted to be hand-held while playing
  • A63F 13/98 - Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers

33.

DUAL CAMERA TRACKING SYSTEM

      
Application Number US2023072873
Publication Number 2024/050280
Status In Force
Filing Date 2023-08-24
Publication Date 2024-03-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Zhao, Frank

Abstract

A dual camera tracking system includes a main imager (200) and an auxiliary imager (204), the output of which is used to alter the aim and/or focus of the main imager. Both imagers may be mounted on a common housing (210). In embodiments, the common housing may be a head-mounted display (HMD) for a computer simulation such as a computer game.

IPC Classes  ?

  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor
  • H04N 23/67 - Focus control based on electronic image sensor signals
  • H04N 23/69 - Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
  • G02B 27/01 - Head-up displays

34.

SMOOTH SWITCHOVER OF COMPUTER GAME CONTROL

      
Application Number US2023071881
Publication Number 2024/044470
Status In Force
Filing Date 2023-08-08
Publication Date 2024-02-29
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Azmandian, Mahdi
  • Arroyo Palacios, Jorge

Abstract

Techniques are described for smooth switchover of computer game control. The current state of game input is communicated (1800) to a new player assuming control. The new player is allowed time (1802) to catch up to the game. New player control is detected (1804), and any errors are communicated (1806) to the new player. If there are differences between the old control scheme and that of the new player, they are reconciled (1808). The outgoing player is adjusted (1810) to the transition.

IPC Classes  ?

  • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
  • A63F 13/25 - Output arrangements for video game devices
  • A63F 13/45 - Controlling the progress of the video game

35.

INFORMATION PROCESSING DEVICE AND IMAGE GENERATION METHOD

      
Application Number JP2023026528
Publication Number 2024/042929
Status In Force
Filing Date 2023-07-20
Publication Date 2024-02-29
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Tokunaga Nodoka
  • Matsunaga Kiyobumi
  • Fujihara Masahiro

Abstract

An estimation processing unit 220 derives posture information indicating the posture of an HMD worn on the head of a user. A game image generation unit 230 uses the posture information relating to the HMD to generate a content image of a three-dimensional virtual reality space displayed on the HMD. A system image generation unit 240 generates a system image for configuring settings relating to a camera image to be distributed together with the content image in a state where the HMD is worn on the head of the user.

IPC Classes  ?

  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
  • A63F 13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
  • A63F 13/25 - Output arrangements for video game devices
  • A63F 13/525 - Changing parameters of virtual cameras
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • G09G 5/377 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position

36.

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, METHOD FOR CONTROLLING INFORMATION PROCESSING DEVICE, AND PROGRAM

      
Application Number JP2022032117
Publication Number 2024/042687
Status In Force
Filing Date 2022-08-25
Publication Date 2024-02-29
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Mori, Hideki
  • Nishidate, Masaomi
  • Matsugami, Hiroya

Abstract

This information processing device is connected to a first type of display device, and is communicably connected to another information processing device which is connected to a display device of a different type than the first type of display device. The information processing device comprises a processor. The processor accesses setting information for a virtual space shared with the other information processing device, and executes a process for arranging a virtual character in the virtual space represented by the setting information. In the process for arranging the virtual character, the virtual character is arranged by using a parameter determined according to the type of the display device connected to the information processing device.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics

37.

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, METHOD FOR CONTROLLING INFORMATION PROCESSING DEVICE, AND PROGRAM

      
Application Number JP2022032118
Publication Number 2024/042688
Status In Force
Filing Date 2022-08-25
Publication Date 2024-02-29
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Mori, Hideki
  • Nishidate, Masaomi
  • Matsugami, Hiroya

Abstract

An information processing device is connected to a display device that captures an image of at least one user candidate present in its periphery and sets one person selected from the captured user candidates as a user. The information processing device comprises a processor that obtains an image of the user candidates captured by the display device and sends the image to another information processing device, accepts from the other information processing device information identifying one user candidate selected from the user candidates captured by the display device, and controls the display device to set the user candidate identified by the accepted information as the user.

IPC Classes  ?

  • A63F 13/25 - Output arrangements for video game devices
  • A63F 13/48 - Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session

38.

COMMUNICATION SYSTEM, RECEPTION DEVICE, TRANSMISSION DEVICE, PROGRAM, AND COMMUNICATION METHOD

      
Application Number JP2022032264
Publication Number 2024/042722
Status In Force
Filing Date 2022-08-26
Publication Date 2024-02-29
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Tanikawa, Masakazu

Abstract

Provided are a communication system, reception device, transmission device, program, and communication method capable of high-quality communication. The communication system comprises: a transmission device that includes a transmission-side processor, a transmission-side memory, and first and second transmission-side communication IFs, and that transmits frame data; and a reception device that includes a reception-side processor, a reception-side memory, and first and second reception-side communication IFs. In the multiple-IF communication state, in which both the first and second reception-side communication IFs are used to receive the frame data, the reception-side processor generates a control signal for switching to a single-IF communication state in which one of the first and second reception-side communication IFs, selected on the basis of communication quality, is used to receive the frame data. In the single-IF communication state, the reception-side processor generates a control signal for switching back to the multiple-IF communication state on the basis of the communication quality of whichever of the first and second reception-side communication IFs is receiving the frame data.

IPC Classes  ?

39.

INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD

      
Application Number JP2023028693
Publication Number 2024/038787
Status In Force
Filing Date 2023-08-07
Publication Date 2024-02-22
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Hayakawa Yuto
  • Minamino Takanori

Abstract

In the present invention, a play area detection unit of an image generation device detects a play area 420 on the basis of an image captured by a stereo camera of a head-mounted display, and a user is prompted to confirm the play area. A play area editing unit receives an editing operation in which a new boundary line 424 of the play area is drawn, and detects an edited play area 426. If the size of a bounding rectangle 428 of the edited play area exceeds an upper limit, the play area editing unit cuts the play area 426 at a cut line 430 perpendicular to an adjustment axis x and thus determines a final play area 432.
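The final cutting step, clipping the edited play area at a line perpendicular to the adjustment axis, can be sketched as a standard half-plane polygon clip. A minimal Sutherland-Hodgman sketch, assuming the play area is a simple polygon of (x, y) vertices and the cut line is vertical (x = x_cut); all names are illustrative:

```python
def intersect(a, b, x_cut):
    """Point where segment a-b crosses the vertical line x = x_cut."""
    t = (x_cut - a[0]) / (b[0] - a[0])
    return (x_cut, a[1] + t * (b[1] - a[1]))

def clip_halfplane(polygon, x_cut, keep_left=True):
    """Sutherland-Hodgman clip of a polygon against x = x_cut, keeping
    the chosen side, so the bounding rectangle no longer exceeds the
    upper limit along the adjustment axis."""
    inside = (lambda p: p[0] <= x_cut) if keep_left else (lambda p: p[0] >= x_cut)
    out = []
    for i, a in enumerate(polygon):
        b = polygon[(i + 1) % len(polygon)]
        if inside(a):
            out.append(a)
            if not inside(b):
                out.append(intersect(a, b, x_cut))  # exit the kept side
        elif inside(b):
            out.append(intersect(a, b, x_cut))      # re-enter the kept side
    return out
```

Clipping a 4 x 2 rectangle at x = 3, for example, yields a 3 x 2 play area whose bounding rectangle satisfies the limit.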

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

40.

MOBILE GAME TRAINER USING CONSOLE CONTROLLER

      
Application Number US2023026261
Publication Number 2024/039445
Status In Force
Filing Date 2023-06-26
Publication Date 2024-02-22
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Juenger, Elizabeth
  • Sutton, Ryan
  • Grossi, Gary
  • Grimm, Jason
  • Wu, Mingtao
  • Tsuchikawa, Yuji

Abstract

A system and method provide application assistance with a mobile device. An application runs on a computer system, and a challenging application state of the application is detected. Assistance for the challenging application state may be determined, wherein the assistance includes display of one or more assistance frames on the mobile device. Information regarding the determined one or more assistance frames is sent to the mobile device.

IPC Classes  ?

  • A63F 13/5375 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
  • A63F 13/422 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
  • A63F 13/497 - Partially or entirely replaying previous game actions
  • A63F 13/87 - Communicating with other players during game play, e.g. by e-mail or chat
  • H04N 21/431 - Generation of visual interfaces; Content or additional data rendering
  • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs

41.

IMAGE BASED AVATAR CUSTOMIZATION

      
Application Number US2023026262
Publication Number 2024/039446
Status In Force
Filing Date 2023-06-26
Publication Date 2024-02-22
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Sutton, Ryan

Abstract

Image-based customization comprises extracting feature parameters of a subject in a digital image with one or more neural networks trained with a machine learning algorithm configured to determine feature parameters of the subject. The feature parameters are then applied to a virtual model of the subject. Aspects of the present disclosure relate to importation of real-world objects into a virtual application. More specifically, aspects of the present disclosure are related to scanning and importation of real-world objects and human body features into an application for avatar customization.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
  • G06T 15/04 - Texture mapping
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 17/00 - 3D modelling for computer graphics

42.

GAME PLATFORM FEATURE DISCOVERY

      
Application Number US2023026263
Publication Number 2024/039447
Status In Force
Filing Date 2023-06-26
Publication Date 2024-02-22
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Sutton, Ryan
  • Grimm, Jason
  • Uppuluri, Satish
  • Juenger, Elizabeth
  • Grossi, Gary
  • Tsuchikawa, Yuji
  • Wu, Mingtao
  • Parsons, Brian

Abstract

Feature discovery includes determining what a user is doing or trying to do with respect to a computer platform from situational awareness information relating to the user's use of the platform. Feature discovery logic is applied to the situational awareness information and personalized user information to determine (a) when to present information to the user regarding a platform feature or features relevant to what the user is doing or trying to do, (b) what information to present to the user regarding the feature(s), and (c) how to best present the information to the user with a user interface. After the user interface presents the information regarding the platform feature(s), the feature discovery logic, personalized user information, or situational awareness information is updated according to the user's response to presentation of the information.

IPC Classes  ?

  • G06F 11/34 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation
  • G06F 9/451 - Execution arrangements for user interfaces
  • H04L 67/53 - Network services using third party service providers
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 7/00 - Methods or arrangements for processing data by operating upon the order or content of the data handled

43.

AUTOMATED DETECTION OF VISUAL IMPAIRMENT AND ADJUSTMENT OF SETTINGS FOR VISUAL IMPAIRMENT

      
Application Number US2023026264
Publication Number 2024/039448
Status In Force
Filing Date 2023-06-26
Publication Date 2024-02-22
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Uppuluri, Satish
  • Grimm, Jason
  • Juenger, Elizabeth
  • Sutton, Ryan

Abstract

A method, system and computer program product for automated visual setting importation is disclosed. A first application running on a first device requests a vision setting for a second application. A vision setting of the first application running on a first device is determined to correspond to the vision setting for the second application. The vision setting for the second application is then applied to the corresponding vision setting of the first application.

IPC Classes  ?

  • G09B 21/00 - Teaching, or communicating with, the blind, deaf or mute
  • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions

44.

CUSTOMIZED SELECTIVE ATTENUATION OF GAME AUDIO

      
Application Number US2023026265
Publication Number 2024/039449
Status In Force
Filing Date 2023-06-26
Publication Date 2024-02-22
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Uppuluri, Satish
  • Grimm, Jason

Abstract

In customized audio attenuation a computer system generates audible sounds in one or more frequency ranges from electronic signals. An audiogram for a listener is inferred from the listener's response to the audible sounds and an attenuation profile is determined from the audiogram. The attenuation profile includes an attenuation level for each of the one or more frequency ranges. Each attenuation level is inversely related to the listener's sensitivity to hearing sounds in the corresponding frequency range. Subsequent signals or data corresponding to subsequent audible sounds in the one or more frequency ranges are generated. The attenuation profile is applied to the subsequent signals to generate attenuated signals and the attenuated signals are transmitted to an audio transducer.
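
A minimal sketch of the attenuation scheme described above, assuming a 0-to-1 per-band sensitivity scale and dB-valued attenuation levels (both representational assumptions, not specified in the abstract):

```python
def build_attenuation_profile(audiogram, max_db=30.0):
    """audiogram: {band_name: sensitivity in 0..1}. Per the abstract, each
    band's attenuation level is inversely related to the listener's
    sensitivity in that band: lower sensitivity -> more attenuation."""
    return {band: max_db * (1.0 - s) for band, s in audiogram.items()}

def apply_profile(band_levels_db, profile):
    """Attenuate subsequent per-band signal levels (in dB) by the profile."""
    return {b: lvl - profile.get(b, 0.0) for b, lvl in band_levels_db.items()}

profile = build_attenuation_profile({"low": 1.0, "mid": 0.5, "high": 0.0})
attenuated = apply_profile({"low": 60.0, "mid": 60.0, "high": 60.0}, profile)
```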

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

45.

OPERATION DEVICE, INFORMATION PROCESSING SYSTEM, AND COMPUTER PROGRAM

      
Application Number JP2022030535
Publication Number 2024/034043
Status In Force
Filing Date 2022-08-10
Publication Date 2024-02-15
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Mizuno Tomomasa

Abstract

A controller 6 is an operation device through which a user inputs an operation for an application (e.g., game) executed on an information processing device. The controller 6 comprises an operation button 76, function buttons 88, a storage unit 92, and a processor 96. The storage unit 92 of the controller 6 stores configuration information relating to an operation on the controller 6. When an operation is input for the operation button 76 together with an operation for the function buttons 88, the processor 96 of the controller 6 switches the configuration information to be applied to an operation on the controller 6.

IPC Classes

  • G06F 3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment

46.

USE OF AI TO MONITOR USER CONTROLLER INPUTS AND ESTIMATE EFFECTIVENESS OF INPUT SEQUENCES WITH RECOMMENDATIONS TO INCREASE SKILL SET

      
Application Number US2023071958
Publication Number 2024/036230
Status In Force
Filing Date 2023-08-09
Publication Date 2024-02-15
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Patel, Benaisha
  • Fryer-Mccolloch, Morgan
  • Bean, Celeste

Abstract

Methods and systems for providing assistance to a user for playing a video game includes identifying attributes of inputs of the user from prior gameplays of the video game. The attributes are analyzed to identify input capabilities of the user. Skills required to progress in the video game are identified and hints are provided to the user to guide the user to obtain certain ones of the skills. The obtained skills assist the user to progress in the video game.

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/5375 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth

47.

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, AND COMPUTER PROGRAM

      
Application Number JP2022030536
Publication Number 2024/034044
Status In Force
Filing Date 2022-08-10
Publication Date 2024-02-15
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ogawa Tomohiro
  • Nishida Makoto
  • Ogiso Toru

Abstract

An information processing device 10 comprises an acquisition unit and a display control unit. The acquisition unit of the information processing device 10 acquires setting information that is stored in a controller 6 and that pertains to operation of the controller 6. When it is detected that a prescribed operation using a first button that is provided on the controller 6 has been input during execution of an application, the display control unit of the information processing device 10 displays, on the display device 4, information pertaining to the setting information that has been acquired by the acquisition unit and that pertains to the operation of the controller 6.

IPC Classes

  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment
  • G06F 3/0338 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks

48.

INFORMATION PROCESSING SYSTEM

      
Application Number JP2022030537
Publication Number 2024/034045
Status In Force
Filing Date 2022-08-10
Publication Date 2024-02-15
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ogawa Tomohiro
  • Nishida Makoto
  • Ogiso Toru

Abstract

This information processing system 1 comprises a controller 6 and a notification unit. The controller 6 stores a plurality of items of configuration information that relate to an operation on the controller 6 and can be selected by a user. The notification unit of the information processing system 1 has a plurality of types of feedback systems that, when the items of configuration information applied to the operation on the controller 6 are switched, present an indication of the switching to the user. The plurality of types of feedback systems may provide information that is sensed through different types of senses.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]

49.

INCORPORATING CAMERA THROUGH OR AUGMENTED REALITY VISION INTO A HEADSET

      
Application Number US2023027796
Publication Number 2024/035521
Status In Force
Filing Date 2023-07-14
Publication Date 2024-02-15
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Sun, Hee Gyung
  • Rottler, Benjamin Andrew

Abstract

Incorporating camera through (CT) or augmented-reality (AR) vision into a display of a headset, including: displaying the CT or AR vision on a view screen of the display of the headset, when the CT or AR vision is selected by a user; prompting the user to enter or select at least one object to be highlighted on the view screen in a CT or AR vision mode; and highlighting the at least one selected object on the view screen.

IPC Classes

50.

NOISE, ACTIVE NOISE CANCELATION, AND FILTERING TO IMPROVE PRIVACY FOR HMD MOTION SENSORS

      
Application Number US2023071285
Publication Number 2024/030837
Status In Force
Filing Date 2023-07-28
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Rudi, Olga

Abstract

To protect a user's privacy by reducing a malicious developer's ability to eavesdrop on unwitting HMD users through speech or speaker recognition applied to signals from a motion sensor (406) in the HMD (200, 300), a microphone (408) can record ambient sound and voice, which is subtracted (606) from the motion sensor data before the sensor data is made available to the game developer. Additionally, ANC (active noise cancellation) techniques can be adapted to cancel noise from a motion sensor's data. In another technique, a band-pass filter (412) removes frequencies in the sensor signals within the voice range. A third technique blends (800) statistical noise into the motion sensor signal before passing it to game developers to obfuscate the user's speech.

IPC Classes

  • G02B 27/01 - Head-up displays
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for determining or recording eye movement
  • H04R 1/10 - Earpieces; Attachments therefor
  • G06F 3/16 - Sound input; Sound output
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form

51.

SMS, PHONE AND VIDEO CALL SUPPORT WHILE GAMING

      
Application Number US2023071293
Publication Number 2024/030839
Status In Force
Filing Date 2023-07-30
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Azmandian, Mahdi
  • Arroyo Palacios, Jorge

Abstract

Privacy of a conversation between a computer game player and an entity such as a person or virtual assistant is preserved (204) by various techniques so that other players of the computer game who are not part of the conversation cannot apprehend the conversation. The conversation may be voice or text but is not an online chat associated with the computer game (200).

IPC Classes

  • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
  • G06F 21/50 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems

52.

TUNABLE FILTERING OF VOICE-RELATED COMPONENTS FROM MOTION SENSOR

      
Application Number US2023071377
Publication Number 2024/030876
Status In Force
Filing Date 2023-07-31
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Azmandian, Mahdi
  • Arroyo Palacios, Jorge

Abstract

To protect a user's privacy by reducing a malicious developer's ability to eavesdrop on unwitting HMD users through speech or speaker recognition applied to signals from a motion sensor (406) in the HMD (200, 300), a microphone (408) can record ambient sound and voice, which is subtracted (606) from the motion sensor data before the sensor data is made available to the game developer. In another technique, a band-pass filter (412) removes frequencies in the sensor signals within the voice range. A third technique blends (800) statistical noise into the motion sensor signal before passing it to game developers to obfuscate the user's speech. The amount by which voice components in the motion signal are eliminated (500) or obfuscated (504) can be tuned (900) by a person or an app.
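
A toy sketch of the band-filtering and noise-blending techniques, with a tunable strength knob as the abstract describes. The voice-band limits, the dictionary spectrum representation, and the noise scale are all illustrative assumptions:

```python
import random

VOICE_BAND_HZ = (85.0, 300.0)  # assumed voice-frequency range, for illustration

def sanitize_motion_spectrum(spectrum, strength=1.0, noise_scale=0.0, rng=None):
    """spectrum: {freq_hz: amplitude}. Attenuates voice-band bins by
    `strength` (0 = untouched, 1 = fully removed) and optionally blends
    Gaussian noise into every bin to obfuscate residual speech."""
    rng = rng or random.Random(0)
    out = {}
    for freq, amp in spectrum.items():
        if VOICE_BAND_HZ[0] <= freq <= VOICE_BAND_HZ[1]:
            amp *= 1.0 - strength  # notch out the voice range
        out[freq] = amp + rng.gauss(0.0, noise_scale)
    return out

# Full-strength filtering with no added noise removes only the voice-band bin.
clean = sanitize_motion_spectrum({100.0: 2.0, 400.0: 1.0}, strength=1.0)
```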

IPC Classes

  • G10L 15/20 - Speech recognition techniques specially adapted for robustness in adverse environments, e.g. in noise or of stress induced speech
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 21/0208 - Noise filtering
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
  • G06N 20/20 - Ensemble learning
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks

53.

GESTURE TRAINING FOR SKILL ADAPTATION AND ACCESSIBILITY

      
Application Number US2023071382
Publication Number 2024/030878
Status In Force
Filing Date 2023-07-31
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Azmandian, Mahdi
  • Arroyo Palacios, Jorge
  • Coimbatore Madhavan, Ravi

Abstract

A system that teaches players gestures, for instance during the introduction (300) of the game, and asks (302) the player to invoke the gesture. Rather than asking the player to repeat over and over until the player succeeds, the game looks for commonality in the player's attempts, and after a small number of attempts, the game can learn (306) how that player interprets the gesture given the player's own ability. The game can then adapt itself (308-312) to look for that pattern to trigger the action.
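
One plausible reading of "looking for commonality" is averaging a few attempts into a template and then matching future input against it with a tolerance. The feature-vector representation and the tolerance value are assumptions for illustration:

```python
def learn_gesture_template(attempts):
    """attempts: equal-length feature vectors (e.g. sampled wrist angles)
    from a player's tries. The template is the element-wise mean, i.e. the
    commonality across the attempts."""
    n = len(attempts)
    return [sum(values) / n for values in zip(*attempts)]

def matches(template, sample, tolerance=0.25):
    """True when every feature of `sample` is within `tolerance` of the
    learned template, so the game can trigger the associated action."""
    return all(abs(t - s) <= tolerance for t, s in zip(template, sample))

template = learn_gesture_template([[0.0, 1.0], [0.2, 0.8]])
```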

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

54.

MULTI-USER CROSS-DEVICE SYNCHRONIZATION OF METAVERSE INSTANTIATION

      
Application Number US2023070232
Publication Number 2024/030723
Status In Force
Filing Date 2023-07-14
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Benedetto, Warren

Abstract

A method including receiving a request to establish a multi-player session for users to enable participation in a metaverse. The method including determining whether an application generating the metaverse is installed on local devices of the users selected for participation in the metaverse by the users. The method including launching a corresponding local instance of the application when the application is installed on a corresponding local device. The method including launching a corresponding cloud instance of the application on a cloud based streaming server when the application is not installed on the corresponding local device. The method including determining that each instance of the application for the users has been launched, wherein each instance for the users is a local instance or a cloud instance. The method including enabling a start of the multi-player session when the instances of the application for the users have been launched.
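
The launch decision reduces to a per-user branch on whether the application is installed locally; a minimal sketch (function and key names are assumed):

```python
def plan_instances(users, locally_installed):
    """For each user, launch a local instance when the application is
    installed on their device, otherwise a cloud-streamed instance. The
    session may start once every user has an instance of either kind."""
    return {
        user: "local" if locally_installed.get(user, False) else "cloud"
        for user in users
    }

plan = plan_instances(["ann", "bo"], {"ann": True})
```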

IPC Classes

  • A63F 13/48 - Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory

55.

IMPROVING ACCURACY OF INTERACTIONS FOR GAZE-ENABLED AR OBJECTS WHEN IN MOTION

      
Application Number US2023070287
Publication Number 2024/030725
Status In Force
Filing Date 2023-07-14
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Benedetto, Warren

Abstract

Methods and systems for providing an augmented reality overlay associated with a real-world object include detecting a gaze target of a user viewing the real-world environment using a pair of AR glasses by tracking a gaze of the user. Position parameters affecting the gaze of the user are tracked, and one or more attributes of the gaze target are selectively corrected to allow the user to maintain their gaze on the gaze target. An AR trigger element associated with the gaze target is triggered in response to the gaze of the user. The AR trigger element provides additional information related to the gaze target selected by the user.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the other groups
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06Q 20/32 - Payment architectures, schemes or protocols characterised by the use of specific devices using wireless devices

56.

EYE TRACKING FOR ACCESSIBILITY AND VISIBILITY OF CRITICAL ELEMENTS AS WELL AS PERFORMANCE ENHANCEMENTS

      
Application Number US2023071294
Publication Number 2024/030840
Status In Force
Filing Date 2023-07-30
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Osman, Steven

Abstract

A map of a person's spatial vision abilities, including areas of low acuity and areas of high acuity, may be generated from medical records (200) or from a calibration phase (304). During presentation of a computer simulation such as a computer game, the map is provided (502) to a foveated renderer to optimize (504) which areas should be rendered most crisply. Content placement may also be optimized so that critical game elements, for instance text that must be readable or treasures and special pickups that need to be seen clearly, are moved into regions (602) of the player's field of view where the person has higher acuity.
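
Under the assumption that the acuity map reduces to a region-to-score table, choosing where to place a critical element is a max lookup; region names and the 0..1 scale are illustrative:

```python
def best_region(acuity_map):
    """acuity_map: {region_name: acuity score, higher = sharper vision}.
    Critical elements (text, pickups) are placed in the region the player
    sees most crisply."""
    return max(acuity_map, key=acuity_map.get)

region = best_region({"center": 0.2, "upper_left": 0.9, "lower_right": 0.6})
```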

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

57.

HAPTICS SUPPORT FOR UI NAVIGATION

      
Application Number US2023071295
Publication Number 2024/030841
Status In Force
Filing Date 2023-07-30
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Uppuluri, Satish

Abstract

Haptic support for UI navigation is provided so that in addition to visual cues such as a cursor moving onscreen and audio cues such as sound effects, tactile feedback is provided through haptic generators in an input device (200) such as a computer simulation controller. As the cursor moves right, for example, a haptic generator (212) on the right side (216) of the controller may be activated to generate a tactile sensation on the right side of the controller, and vice versa.
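
The left/right mapping the abstract describes can be sketched as a function from horizontal cursor movement to per-side haptic intensity (the full-scale constant and units are assumptions):

```python
def haptics_for_cursor(dx, full_scale=100.0):
    """Map a horizontal cursor movement dx (pixels per frame, positive =
    rightward) to (left, right) haptic intensities in 0..1: rightward
    movement drives the right-side actuator, leftward the left-side one."""
    magnitude = min(abs(dx) / full_scale, 1.0)
    return (magnitude, 0.0) if dx < 0 else (0.0, magnitude)
```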

IPC Classes

  • G05G 5/03 - Means for enhancing the operator's awareness of the arrival of the controlling member at a command or datum position; Providing feel, e.g. means for creating a counterforce
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/0362 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0483 - Interaction with page-structured environments, e.g. book metaphor
  • G06F 3/0485 - Scrolling or panning
  • A63F 13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
  • A63F 13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

58.

IMPROVING FIDELITY OF MOTION SENSOR SIGNAL BY FILTERING VOICE AND HAPTIC COMPONENTS

      
Application Number US2023071379
Publication Number 2024/030877
Status In Force
Filing Date 2023-07-31
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Azmandian, Mahdi
  • Arroyo Palacios, Jorge

Abstract

To improve the fidelity of a motion sensor (406), voice-induced components in signals from the motion sensor as well as haptic-induced components in signals from the motion sensor are canceled (504, 1106) prior to outputting the final motion signal to an app (410) requiring knowledge of device motion, such as motion of a HMD for a computer game.

IPC Classes

  • A63F 13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

59.

INTERACTION DEVICE

      
Application Number JP2022029570
Publication Number 2024/028962
Status In Force
Filing Date 2022-08-01
Publication Date 2024-02-08
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Hirata, Shinichi
  • Ando, Toshiyuki
  • Nakamura, Hitoshi

Abstract

Provided is an interaction device comprising: a deformable exterior body 110; a radar sensor 12 arranged on one surface side inside the exterior body, whose detection range extends from that surface through the interior of the exterior body to beyond the surface on the opposite side, and which detects the position of a target within this detection range; and a detection target body 111 arranged inside the exterior body 110 within the detection range of the radar sensor 12, which moves in accordance with deformation of the exterior body 110.

IPC Classes

  • G06F 3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

60.

INFORMATION PROCESSING DEVICE, AND GAME PLAY CONTROL METHOD

      
Application Number JP2023026527
Publication Number 2024/024611
Status In Force
Filing Date 2023-07-20
Publication Date 2024-02-01
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Ono Tsuyoshi

Abstract

An executing unit 116 executes a game on the basis of operation inputs from a user. As a general rule, a restriction processing unit 112 restricts game play by the user when a timing at which game play by the user is to be restricted is reached. When the timing at which the game play by the user is to be restricted is reached, the restriction processing unit 112 does not restrict the game play by the user if the executing unit 116 is executing an activity for which a termination condition has been set.

IPC Classes

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

61.

REPORTING AND CROWD-SOURCED REVIEW WHETHER GAME ACTIVITY IS APPROPRIATE FOR USER

      
Application Number US2023069822
Publication Number 2024/026198
Status In Force
Filing Date 2023-07-07
Publication Date 2024-02-01
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Osman, Steven
  • Bean, Celeste

Abstract

A method performed for evaluating activity in a video game, including: executing a multi-player session of a video game; during the multi-player session, receiving flag event data from a first player device, the flag event data indicating that a first player has flagged a gameplay incident occurring during the multi-player session as potentially inappropriate; responsive to receiving the flag event data, then sending a request to a plurality of second player devices, wherein responsive to said request each of the plurality of second player devices renders a voting interface to obtain voting input from each of a plurality of second players regarding whether the gameplay incident is inappropriate; receiving said voting input from the plurality of second player devices, and responsive to said voting input identifying a threshold amount of the plurality of second players considering the gameplay incident to be inappropriate, then administering a penalty for the gameplay incident.
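
The crowd-review step boils down to a vote-threshold check; a sketch with an assumed fractional threshold (the patent does not fix a value):

```python
def incident_is_inappropriate(votes, threshold=0.6):
    """votes: booleans from the second players' voting interfaces (True =
    'inappropriate'). A penalty is administered when the fraction of True
    votes reaches the threshold; 0.6 is an illustrative choice."""
    if not votes:
        return False
    return sum(votes) / len(votes) >= threshold

verdict = incident_is_inappropriate([True, True, False, True, False])
```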

IPC Classes

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players

62.

IMPAIRED PLAYER ACCESSIBILITY WITH OVERLAY LOGIC PROVIDING HAPTIC RESPONSES FOR IN-GAME EFFECTS

      
Application Number US2023069996
Publication Number 2024/026205
Status In Force
Filing Date 2023-07-11
Publication Date 2024-02-01
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Luizello, Alessandra
  • Osman, Steven

Abstract

A method including executing game logic of a video game to generate a plurality of video frames for a game play of the video game by a player, wherein game state data is generated during the executing the game logic. The method including identifying an effect that is generated for at least one video frame in the plurality of video frames based on the game state data. The method including translating the effect to a haptic response presentable to the player simultaneous with the at least one video frame, wherein the haptic response is communicating a gaming experience that is rendered by the effect in the at least one video frame.

IPC Classes

  • A63F 13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
  • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone

63.

USER SENTIMENT DETECTION TO IDENTIFY USER IMPAIRMENT DURING GAME PLAY PROVIDING FOR AUTOMATIC GENERATION OR MODIFICATION OF IN-GAME EFFECTS

      
Application Number US2023070000
Publication Number 2024/026206
Status In Force
Filing Date 2023-07-11
Publication Date 2024-02-01
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Dorn, Victoria

Abstract

A method including executing game logic of a video game to generate a plurality of video frames for a game play of the video game by a player, wherein game state data is generated during the executing the game logic. The method including determining a current context in the game play of the video game based on the game state data. The method including determining a user sentiment of the player towards the current context in the game play of the video game. The method including determining that the user sentiment of the player is inconsistent with an expected user sentiment for the current context. The method including generating an in-game effect for the current context. The method including presenting the in-game effect simultaneous with one or more video frames associated with the current context in the game play of the video game.

IPC Classes

  • A63F 13/5375 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

64.

METHODS AND SYSTEM FOR PREDICTING DURATION OF MULTI-PLAYER GAME SESSION

      
Application Number US2023070026
Publication Number 2024/020301
Status In Force
Filing Date 2023-07-12
Publication Date 2024-01-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Dorn, Victoria

Abstract

A method for generating game duration metrics includes assembling a gaming session for a game, with the gaming session identifying a first player and a second player, and accessing a database that includes an historical duration metric for playing the game by each of the first player and the second player. The method also includes accessing contextual data related to the gaming session to be played by the first player and the second player, and generating a prediction of time metrics for the first player and the second player. The predicted time metrics provide an estimated duration time for play of the gaming session. The method further includes executing the gaming session for the first player and the second player. As the gaming session is progressed by the first player and the second player, the duration time for the gaming session is updated periodically.
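
A naive version of the duration predictor, where the aggregation rule (average per player, max across players) and the contextual scaling are assumptions for illustration:

```python
def predict_session_minutes(histories, context_factor=1.0):
    """histories: one list of past session durations (minutes) per player.
    Average each player's history, take the max across players (a shared
    session lasts as long as its longest-playing member needs), then scale
    by a contextual factor such as game mode."""
    per_player_avg = [sum(h) / len(h) for h in histories]
    return max(per_player_avg) * context_factor

estimate = predict_session_minutes([[30, 40], [60]])
```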

IPC Classes

  • A63F 13/795 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for providing a buddy list
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

65.

INTENT IDENTIFICATION FOR DIALOGUE SUPPORT

      
Application Number US2023023872
Publication Number 2024/019818
Status In Force
Filing Date 2023-05-30
Publication Date 2024-01-25
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Patel, Benaisha
  • Luizello, Alessandra
  • Rudi, Olga

Abstract

Systems and methods of intent identification for customized dialogue support in virtual environments are provided. Dialogue intent models stored in memory may each specify one or more intents each associated with a dialogue filter. Input data may be received from a user device of a user. Such input data may be captured during an interactive session of an interactive title that provides a virtual environment to the user device. The input data may be analyzed based on the intent models in response to a detected dialogue trigger and may be determined to correspond to one of the stored intents. The dialogue filter associated with the determined intent may be applied to a plurality of available dialogue outputs associated with the detected dialogue trigger. A customized dialogue output may be generated in accordance with a filtered subset of the available dialogue outputs.

IPC Classes  ?

  • A63F 13/87 - Communicating with other players during game play, e.g. by e-mail or chat
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 25/69 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use for evaluating synthetic or decoded voice signals

66.

GENERATING CUSTOMIZED SUMMARIES OF VIRTUAL ACTIONS AND EVENTS

      
Application Number US2023023974
Publication Number 2024/019820
Status In Force
Filing Date 2023-05-31
Publication Date 2024-01-25
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Bartolome, Angela
  • Dan, Supriti
  • Luluquisin, Genie-Rose
  • Patel, Benaisha
  • Bean, Celeste

Abstract

A method and system for generating a customized summary of virtual actions and events. Gameplay data sent over a communication network from a client device of a player engaged in a current activity of an interactive content title within a current gameplay session may be monitored. A trigger in the monitored gameplay data is detected and associated with a request for a summary that encapsulates actions and events of past gameplay associated with the trigger. A subset of the actions and events of the past gameplay for the summary is selected based on one or more selected customized tags associated with the trigger. The summary is generated based on the selected subset of the actions and events and provided to the client device for presentation.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
  • G09B 19/22 - Games, e.g. card games

67.

SPECTATOR PARTICIPATION IN ESPORTS EVENTS

      
Application Number US2023024036
Publication Number 2024/019822
Status In Force
Filing Date 2023-05-31
Publication Date 2024-01-25
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Bean, Celeste
  • Luizello, Alessandra
  • Luluquisin, Genie-Rose
  • Patel, Benaisha
  • Rudi, Olga

Abstract

The present technology provides solutions for spectator-based interactions in a virtual esports environment. A method can include establishing an interactive session associated with the virtual esports environment, the interactive session including a plurality of user devices, wherein a subset of the user devices is associated with designated players, and another subset of the user devices is associated with designated spectators; receiving data from one of the user devices over a communication network, the data indicating a spectator action performed by a spectator associated with the user device; identifying that the spectator action is associated with one of the players in the virtual esports environment; and modifying the virtual esports environment based on the spectator action and the association with the identified player.

IPC Classes  ?

  • A63F 13/31 - Communication aspects specific to video games, e.g. between several handheld game devices at close range
  • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
  • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
  • A63F 13/86 - Watching games played by other players
  • H04N 21/2187 - Live feed
  • A63F 13/5372 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
  • A63F 13/25 - Output arrangements for video game devices

68.

CROWD-SOURCED ESPORTS STREAM PRODUCTION

      
Application Number US2023024039
Publication Number 2024/019823
Status In Force
Filing Date 2023-05-31
Publication Date 2024-01-25
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Walker, Victoria
  • Fryer-Mcculloch, Morgan
  • Powell, Brielle
  • Rudi, Olga
  • Osman, Steven
  • Bartolome, Angela
  • Mccoy, Charles

Abstract

The present technology provides solutions for crowd-sourcing stream productions for a virtual esports environment. A method can include generating a virtual environment associated with an interactive session that includes a plurality of spectator devices, wherein each of the spectator devices is presented with a different display based on a corresponding vantage point located within the virtual environment; receiving a plurality of media captures from the spectator devices, wherein each of the media captures is captured from the corresponding vantage point of the spectator device within the virtual environment; selecting one of the media captures based on a comparison of visibility of an asset in the virtual environment; and streaming the selected media capture to a primary display on a requesting device.
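
The selection step can be illustrated with a toy visibility comparison; scoring each spectator capture by the target asset's screen coverage is an assumed stand-in for the patent's "comparison of visibility of an asset", and all names are invented.

```python
def select_capture(captures: list) -> dict:
    """Pick the capture whose vantage point gives the asset the most coverage."""
    return max(captures, key=lambda c: c["asset_coverage"])

# Each entry stands for a media capture from one spectator's vantage point,
# with a precomputed fraction of the frame occupied by the asset of interest.
captures = [
    {"spectator": "A", "asset_coverage": 0.12},
    {"spectator": "B", "asset_coverage": 0.47},
    {"spectator": "C", "asset_coverage": 0.31},
]
best = select_capture(captures)   # capture streamed to the primary display
```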

IPC Classes  ?

  • A63F 13/86 - Watching games played by other players
  • A63F 13/5252 - Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • H04N 21/218 - Source of audio or video content, e.g. local disk arrays
  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
  • H04N 21/472 - End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification or for manipulating displayed content

69.

IMAGE INFORMATION PROCESSING DEVICE, IMAGE INFORMATION PROCESSING METHOD, AND PROGRAM

      
Application Number JP2022028392
Publication Number 2024/018605
Status In Force
Filing Date 2022-07-21
Publication Date 2024-01-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Kimura, Atsushi

Abstract

An image information processing device according to the present invention receives a series of image data obtained by capturing images in a three-dimensional space via a moving camera, and estimates map information of an imaged object and camera orientation information from when each piece of image data was captured. The image information processing device includes a processor that: estimates the map information and the camera orientation information at a plurality of points; sets, as relevant points, a group of projection points projected onto mutually different pieces of the image data obtained by capturing images of shared selection points in the captured three-dimensional space; compares the brightness of the projection points related to the relevant points; and executes an optimization process to optimize the estimated map information. In the optimization process, the processor recognizes objects, in the three-dimensional space, captured in each of the plurality of pieces of image data, and uses a prescribed method to select selection points from the recognized objects that were each captured in all of the plurality of pieces of image data.

IPC Classes  ?

  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images

70.

VIDEO DISPLAY CONTROL DEVICE, CONTROL METHOD THEREFOR, AND PROGRAM

      
Application Number JP2023005202
Publication Number 2024/018668
Status In Force
Filing Date 2023-02-15
Publication Date 2024-01-25
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Matsumura, Michiko
  • Aoki, Sachiyo
  • Ikeda, Takakazu
  • Hatasawa, Yasunari
  • Aizawa, Takahiro
  • Ogiso, Toru
  • Morishita, Kaoru

Abstract

Provided is a video display control device that extracts one or more candidate video output settings with which display by a connected display device is possible, and for each of the one or more candidate video output settings, outputs a video signal based on the candidate video output setting to inspect whether or not display by the display device is possible.

IPC Classes  ?

  • H04N 17/04 - Diagnosis, testing or measuring for television systems or their details for receivers
  • G09G 5/12 - Synchronisation between the display unit and other units, e.g. other display units, video-disc players

71.

CUSTOMIZED DIALOGUE SUPPORT

      
Application Number US2023023866
Publication Number 2024/019817
Status In Force
Filing Date 2023-05-30
Publication Date 2024-01-25
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Fryer-Mcculloch, Morgan
  • Dorn, Victoria
  • Powell, Brielle
  • Osman, Steve
  • Norton, Geoff

Abstract

Systems and methods for customized dialogue support in virtual environments are provided. Dialogue maps stored in memory may specify dialogue triggers each associated with a corresponding dialogue instruction. Data regarding an interactive session associated with a user device may be monitored based on one or more of the stored dialogue maps. The presence of one of the dialogue triggers specified by the one or more dialogue maps may be detected based on the monitored data. Customized dialogue output may be generated in response to the detected dialogue trigger and based on the dialogue instruction corresponding to the detected dialogue trigger. The customized dialogue output may be provided to the interactive session in real-time with detection of the detected dialogue trigger.

IPC Classes  ?

  • G10L 17/24 -  the user being prompted to utter a password or a predefined phrase
  • G10L 21/16 - Transforming into a non-visible representation
  • G06F 3/16 - Sound input; Sound output

72.

CONTEXTUAL SCENE ENHANCEMENT

      
Application Number US2023023875
Publication Number 2024/019819
Status In Force
Filing Date 2023-05-30
Publication Date 2024-01-25
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Walker, Victoria
  • Rudi, Olga
  • Chen, Aslanta
  • Nitta-Hill, Phoenix
  • Mccoy, Charles
  • Osman, Steven

Abstract

Certain aspects of the present disclosure include systems and techniques for generating content that indicates a sensation associated with audio. One example method generally includes monitoring audio to be played during display of an associated portion of an interactive content stream provided over a communication network to at least one viewing device during an interactive session, and analyzing, via a machine learning component, the audio to determine a sensation associated with at least a portion of the audio. The method may also include determining an effect indicating the sensation, wherein the effect is associated with one or more output devices associated with the at least one viewing device, and outputting an indication of the effect to the associated output devices, wherein the effect is configured to be output along with the audio in real-time with the display of the associated portion of the interactive content stream.

IPC Classes  ?

  • H04N 21/233 - Processing of audio elementary streams
  • H04N 21/8545 - Content authoring for generating interactive applications
  • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronizing decoder's clock; Client middleware

73.

VALIDATING A REAL-WORLD OBJECT'S DIGITAL TWIN

      
Application Number US2023069693
Publication Number 2024/015704
Status In Force
Filing Date 2023-07-06
Publication Date 2024-01-18
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Benedetto, Warren

Abstract

Methods and systems for providing access to a digital twin of a real-world object for use in a metaverse system include receiving a request to validate the real-world object. The real-world object is validated using a plurality of physical attributes of the real-world object. Upon successful validation, the digital twin is unlocked so as to allow usage of the digital twin in the metaverse system.
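
The validate-then-unlock flow might look like the following sketch; the attribute names, the registered profile, and the tolerance rule are illustrative assumptions rather than the patent's actual validation scheme.

```python
# Hypothetical registered profile of physical attributes for one object.
REGISTERED = {"mass_g": 250.0, "height_mm": 120.0, "color": "red"}

def validate_object(measured: dict, tolerance: float = 0.05) -> bool:
    """Every numeric attribute must fall within the relative tolerance,
    and every categorical attribute must match exactly."""
    for key, expected in REGISTERED.items():
        value = measured.get(key)
        if isinstance(expected, float):
            if value is None or abs(value - expected) > tolerance * expected:
                return False
        elif value != expected:
            return False
    return True

def request_twin(measured: dict) -> dict:
    """Unlock the digital twin only when validation succeeds."""
    return {"unlocked": validate_object(measured)}

granted = request_twin({"mass_g": 252.0, "height_mm": 119.0, "color": "red"})
denied = request_twin({"mass_g": 310.0, "height_mm": 119.0, "color": "red"})
```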

IPC Classes  ?

  • G06Q 30/06 - Buying, selling or leasing transactions
  • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity

74.

DYNAMIC ADJUSTMENT OF IN-GAME THEME PRESENTATION BASED ON CONTEXT OF GAME ACTIVITY

      
Application Number US2023069821
Publication Number 2024/015716
Status In Force
Filing Date 2023-07-07
Publication Date 2024-01-18
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Walker, Victoria
  • Rudi, Olga

Abstract

A method for enabling visualization of audio content of a video game is provided, including the following operations: executing a session of a video game; activating an audio visualization theme for the session of the video game; responsive to activating the audio visualization theme, then dynamically adjusting gameplay video of the session to include visual elements indicative of audio content of the session; wherein said visual elements are rendered in a designated border region of the gameplay video based on the audio content of the session.

IPC Classes  ?

  • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition

75.

SYSTEMS AND METHODS FOR REDUCED FRICTION REPORTING OF DISRUPTIVE BEHAVIOR INCIDENTS IN CLOUD-BASED GAMING

      
Application Number US2023023778
Publication Number 2024/015150
Status In Force
Filing Date 2023-05-26
Publication Date 2024-01-18
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Bartolome, Angela
  • Bean, Celeste
  • Rudi, Olga

Abstract

A disruptive behavior incident flag notification is received from a game player that marks a time during play of a video game corresponding to a potential disruptive behavior incident. A video clip for the disruptive behavior incident flag notification is automatically generated and stored with the disruptive behavior incident flag notification in association with an account of the game player. The game player accesses and reviews the disruptive behavior incident flag notification and corresponding generated video clip. The game player applies a validation indicator to the disruptive behavior incident flag notification. The validation indicator for the disruptive behavior incident flag notification as received from the game player is used to prioritize the disruptive behavior incident flag notification and corresponding generated video clip for platform moderator review. An incident reporting score for the game player is updated based on a platform moderator report for the disruptive behavior incident flag notification.

IPC Classes  ?

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/46 - Computing the game score
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • A63F 13/86 - Watching games played by other players

76.

INFORMATION PROCESSING DEVICE, CONTROL METHOD FOR INFORMATION PROCESSING DEVICE, AND PROGRAM

      
Application Number JP2023024508
Publication Number 2024/009919
Status In Force
Filing Date 2023-06-30
Publication Date 2024-01-11
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Aizawa, Takahiro
  • Aoki, Sachiyo
  • Okabayashi, Taichi
  • Matsumura, Michiko
  • Hatasawa, Yasunari
  • Ikeda, Takakazu
  • Koge, Masahiro
  • Ogiso, Toru

Abstract

This information processing device is connected to video output equipment and outputs a video signal to said video output equipment. The information processing device comprises a processor and a storage unit. Said processor executes an application, receives a setting of an automatic low latency mode, and holds said setting in the storage unit. When the video signal is to be output at a variable refresh rate according to the processing by the application that has been executed, the processor performs operations on the assumption that the automatic low latency mode is ON regardless of the setting of the automatic low latency mode that has been held in the storage unit.
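
The stated control rule reduces to a small decision function; the name and signature below are illustrative, not from the patent.

```python
def effective_allm(stored_allm_on: bool, output_is_vrr: bool) -> bool:
    """Decide the operative auto-low-latency-mode state for the current output."""
    if output_is_vrr:
        return True          # VRR output: behave as though ALLM is ON
    return stored_allm_on    # otherwise honor the setting held in storage
```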

IPC Classes  ?

  • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or inside the home
  • H04N 21/443 - OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB

77.

USE OF MACHINE LEARNING TO TRANSFORM SCREEN RENDERS FROM THE PLAYER VIEWPOINT

      
Application Number US2023068707
Publication Number 2024/006634
Status In Force
Filing Date 2023-06-20
Publication Date 2024-01-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Black, Glenn
  • Bean, Celeste
  • Bashkirov, Sergey

Abstract

Machine learning (206) is used to transform screen renders from the viewpoint (202) of the player's character to be from the viewpoint (208) of other non-player characters (NPCs) in the room. One or more neural networks are trained (302) using game images captured (300) during a human play session, and then subsequently the neural networks are used to create realistic video from the NPC viewpoints. To avoid rendering multiple viewpoints simultaneously, a single viewpoint is rendered, and neural networks are used to transform it for the viewpoints of other NPCs in the area. A group of NPCs may be treated as a batch and a single viewpoint transformed to multiple viewpoints in a single inference pass. For cloud gaming, the game video can be rendered and sent immediately for the player, but the actions of the NPCs are delayed for a frame while the neural network generates the behavior.

IPC Classes  ?

  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G06N 20/00 - Machine learning

78.

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM

      
Application Number JP2022026456
Publication Number 2024/004196
Status In Force
Filing Date 2022-07-01
Publication Date 2024-01-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Yamamoto, Toru
  • Ihara, Koji
  • Ichikawa, Keisuke
  • Fujikawa, Tomoya

Abstract

This information processing device acquires setting information in which content of automatic processing and a condition for execution of the automatic processing are associated with each other, and executes the automatic processing when the condition for execution is met.
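
The device described here behaves like a rule engine that pairs an execution condition with the content of the automatic processing; a minimal sketch with assumed example rules follows.

```python
# Hypothetical setting information: each entry associates an execution
# condition with the automatic processing to perform when it is met.
settings = [
    {"condition": lambda state: state["battery"] < 20,
     "action": lambda state: state.update(power_save=True)},
    {"condition": lambda state: state["idle_minutes"] >= 10,
     "action": lambda state: state.update(screen_off=True)},
]

def run_automatic_processing(state: dict) -> dict:
    """Execute each automatic process whose execution condition is met."""
    for setting in settings:
        if setting["condition"](state):
            setting["action"](state)
    return state

state = run_automatic_processing({"battery": 15, "idle_minutes": 3})
```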

IPC Classes  ?

  • A63F 13/422 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

79.

GAZE TRACKING FOR USER INTERFACE

      
Application Number US2023026251
Publication Number 2024/006221
Status In Force
Filing Date 2023-06-26
Publication Date 2024-01-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Sun, Hee Gyung
  • Rottler, Benjamin, Andrew

Abstract

Gaze tracking to track gaze of a user within a user view screen including: combining the gaze tracking with head movement tracking, wherein head movements of the head movement tracking provide a rough estimate of the direction of the gaze of the user, while eye movements of the gaze tracking provide fine-tuning of the direction of the gaze of the user within the user view screen; dividing the user view screen into a plurality of gaze zones when gaze zone estimation is turned on; and combining the gaze tracking with the gaze zone estimation to select a gaze zone from the plurality of gaze zones as the direction of the gaze of the user.
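
The combination of a coarse head-based estimate, a fine eye-based offset, and gaze-zone selection can be sketched as follows; the screen dimensions and the 3x3 zone grid are assumed values for illustration.

```python
SCREEN_W, SCREEN_H = 1920, 1080   # assumed view-screen resolution
ZONE_COLS, ZONE_ROWS = 3, 3       # assumed gaze-zone grid

def gaze_point(head_px, eye_offset_px):
    """Coarse head-movement estimate plus fine eye-movement offset, in pixels,
    clamped to the screen bounds."""
    x = min(max(head_px[0] + eye_offset_px[0], 0), SCREEN_W - 1)
    y = min(max(head_px[1] + eye_offset_px[1], 0), SCREEN_H - 1)
    return x, y

def gaze_zone(point):
    """Map the fine-tuned gaze point to one of the grid's gaze zones."""
    col = point[0] * ZONE_COLS // SCREEN_W
    row = point[1] * ZONE_ROWS // SCREEN_H
    return row * ZONE_COLS + col

p = gaze_point((960, 540), (500, -400))   # head toward center, eyes up-right
zone = gaze_zone(p)                        # selected gaze zone index
```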

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/70 - Determining position or orientation of objects or cameras

80.

IMAGE TRANSMISSION DEVICE AND IMAGE TRANSMISSION METHOD

      
Application Number JP2022026211
Publication Number 2024/004134
Status In Force
Filing Date 2022-06-30
Publication Date 2024-01-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Ohba Akio

Abstract

This image transmission device sets a viewscreen 150 that is curved around a visual line 162 of a user 160 as its central axis, and creates an original image 152 in which the magnification of the image increases toward its center, by varying the density at which color information is acquired in the three-dimensional space of the object to be displayed, according to the angle formed relative to the central axis. A display control unit sets a viewscreen 154, determines pixel values by sampling from appropriate positions in the original image 152, and creates a display image 156 having distortion corresponding to an ocular lens.

IPC Classes  ?

  • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs
  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 21/2662 - Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
  • H04N 21/431 - Generation of visual interfaces; Content or additional data rendering

81.

ENHANCED SCREEN SHARE FOR ONLINE COMPUTER APPLICATIONS

      
Application Number US2023022854
Publication Number 2024/005987
Status In Force
Filing Date 2023-05-19
Publication Date 2024-01-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Agoston, Steve
  • Starich, Michael
  • Watson, William
  • Kimura, Joseph
  • Percy, Corbin
  • Rao, Akshay

Abstract

In a method and system for viewer interaction with streaming media, a scene from a media presentation is displayed with a broadcasting device that includes an operating system level broadcaster interface overlay. The overlay generates an image frame from the scene of the media presentation. The image frame is sent to a viewer device. Viewer interaction parameters are received from a viewing device and a viewer interaction is displayed over a subsequent scene of the media presentation with the broadcaster interface overlay. On a viewing device an image frame of the media presentation is received from the broadcasting device. An operating system level viewer interface overlay is generated over the image frame and the image frame of the media presentation is displayed. Viewer interaction parameters are generated from a viewer interaction with the operating system level viewer interface overlay and sent to the broadcasting device.

IPC Classes  ?

  • H04H 60/37 - Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying segments of broadcast information, e.g. scenes or extracting programme ID
  • H04N 5/272 - Means for inserting a foreground image in a background image, i.e. inlay, outlay
  • H04N 21/241 - Operating system [OS] processes, e.g. server setup
  • H04N 21/443 - OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/4725 - End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification or for manipulating displayed content for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
  • H04N 21/8545 - Content authoring for generating interactive applications
  • H04H 60/07 - Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linkage to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information characterised by processes or methods for the generation
  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
  • H04L 65/403 - Arrangements for multi-party communication, e.g. for conferences
  • H04L 65/611 - Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for multicast or broadcast
  • H04N 1/32 - Circuits or arrangements for control or supervision between transmitter and receiver
  • H04N 5/45 - Picture in picture
  • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
  • H04N 21/236 - Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator ] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream

82.

VIRTUAL REALITY/AUGMENTED REALITY SYSTEMS CONFIGURABLE WITH MULTIPLE TYPES OF CONTROLLERS

      
Application Number US2023026250
Publication Number 2024/006220
Status In Force
Filing Date 2023-06-26
Publication Date 2024-01-04
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Sun, Hee Gyung
  • Rottler, Benjamin, Andrew

Abstract

Managing controller connection, including: determining whether at least one controller is connected to a game system, and performing and repeating the following until transitioned into a one-handed operation or two-handed operation of a user: (a) if more than one controller connection is detected, transitioning into the two-handed operation; (b) if no controller connection is detected, requesting the user to connect the at least one controller; (c) if connection of only a first controller is detected, requesting the user to connect a second controller; (d) if connection of the second controller is not detected, transitioning into the one-handed operation, wherein the transition into the one-handed operation is made after determining and deciding to continue with only the first controller.
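
The connection-management loop reads naturally as a small state machine; the sketch below returns either a settled mode or a prompt to keep looping, with mode names and prompt strings invented for illustration.

```python
def resolve_mode(connected: int, user_declines_second: bool = False):
    """Return ('mode', ...) when settled, or ('prompt', ...) to keep looping."""
    if connected >= 2:
        return ("mode", "two-handed")
    if connected == 0:
        return ("prompt", "connect at least one controller")
    # exactly one controller connected
    if user_declines_second:
        return ("mode", "one-handed")   # continue with only the first controller
    return ("prompt", "connect a second controller")
```

A host loop would call `resolve_mode` after each connection change until a `("mode", ...)` result is reached.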

IPC Classes  ?

  • G05G 13/00 - Manually-actuated control mechanisms provided with two or more controlling members and also two or more controlled members
  • G05G 1/01 - Arrangements of two or more controlling members with respect to one another
  • G05G 9/00 - Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously
  • G05G 9/047 - Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

83.

AUDIO PLAYBACK DEVICE, AUDIO PLAYBACK METHOD, AND AUDIO PLAYBACK PROGRAM

      
Application Number JP2023022046
Publication Number 2024/004651
Status In Force
Filing Date 2023-06-14
Publication Date 2024-01-04
Owner
  • SONY GROUP CORPORATION (Japan)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Onishi, Takuto
  • Koizumi, Masahiko
  • Sugai, Chihiro
  • Endo, Taiki
  • Kitahara, Keiichi
  • Sato, Satsuki

Abstract

An audio playback device (100), according to the present disclosure, comprises: a reception unit (132) that receives, from a user, a request to play back a second audio signal, which is an audio signal that differs from a first audio signal that is the original audio signal of content; and a playback unit (133) that, upon reception of the request by the reception unit, localizes the second audio signal to an arbitrary position in an acoustic space including azimuth and height directions, and outputs the first audio signal and the second audio signal in parallel.

IPC Classes  ?

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

84.

INFORMATION PROCESSING DEVICE, SYSTEM, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING PROGRAM, AND COMPUTER SYSTEM

      
Application Number JP2022023579
Publication Number 2023/242893
Status In Force
Filing Date 2022-06-13
Publication Date 2023-12-21
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Iwaki Hideaki
  • Miyada Naoyuki

Abstract

Provided is an information processing device comprising: a detection unit that detects a target on the basis of a first image acquired using a frame-based vision sensor; a setting unit that sets, in the first image, at least one region of interest that includes at least a part of the target; a counting unit that counts, on the basis of an event signal generated by an event-based sensor, the event quantity of the event signal in a region of focus corresponding to the region of interest; an image generation unit that constructs a second image on the basis of the event signal if the event quantity counted by the counting unit satisfies a prescribed condition; and a calculation unit that calculates a motion vector of the region of focus in the second image.
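
The counting and conditional-reconstruction steps might be sketched as follows; the event tuple layout `(x, y, timestamp, polarity)`, the sparse-image accumulation, and the threshold are assumptions for illustration only.

```python
def count_events_in_roi(events, roi):
    """Count events falling inside the region of focus (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = roi
    return sum(1 for (x, y, _, _) in events if x0 <= x < x1 and y0 <= y < y1)

def maybe_build_second_image(events, roi, threshold=3):
    """Reconstruct a second image only when the event quantity in the
    region of focus satisfies the prescribed condition (here: a threshold)."""
    if count_events_in_roi(events, roi) < threshold:
        return None                      # not enough activity to reconstruct
    image = {}
    x0, y0, x1, y1 = roi
    for (x, y, _, pol) in events:
        if x0 <= x < x1 and y0 <= y < y1:
            image[(x, y)] = image.get((x, y), 0) + (1 if pol else -1)
    return image

events = [(5, 5, 0.1, True), (6, 5, 0.2, True), (7, 6, 0.3, False), (50, 50, 0.4, True)]
img = maybe_build_second_image(events, roi=(0, 0, 10, 10))
```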

IPC Classes

  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • G06T 7/20 - Analysis of motion

85.

OPERATION DEVICE, CONTROL METHOD THEREFOR, INFORMATION PROCESSING APPARATUS, AND PROGRAM

      
Application Number JP2022023846
Publication Number 2023/242962
Status In Force
Filing Date 2022-06-14
Publication Date 2023-12-21
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Hong, Sulgi
  • Yunoki, Hirotomo
  • Miyazaki, Yoshio

Abstract

An operation device (15) comprises an operation member (20) capable of a tilting operation and a control circuit (151) that controls the operation member (20), wherein, in accordance with the position of the operation member (20), the control circuit (151) restricts the direction and/or angle in which the operation member (20) can be tilted.

IPC Classes

  • G06F 3/0338 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

86.

INFORMATION PROCESSING DEVICE, CONTROLLER DISPLAY METHOD AND COMPUTER PROGRAM

      
Application Number JP2023019468
Publication Number 2023/238678
Status In Force
Filing Date 2023-05-25
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Nomura Masanori
  • Mulase Yurika
  • Suzuki Kyo
  • Watanabe Shoji
  • Ohara Shizuka
  • Yonetomi Shoi

Abstract

A captured image acquisition unit 212 acquires a captured image that depicts the area in front of a user wearing a head-mounted display (HMD 100). An estimation processing unit 230 estimates the position of an input device 16 on the basis of a captured image depicting a controller (input device 16) that the user grips by inserting a hand into a curved section and grasping a grip part. A display control unit 276 displays the captured image depicting the area in front of the user on the HMD 100. The display control unit 276 also displays, together with the captured image, an object suggesting the area to be gripped by the user, on the basis of the estimated position of the input device 16.

IPC Classes

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

87.

SYNCHRONOUS DYNAMIC VISION SENSOR LED AI TRACKING SYSTEM AND METHOD

      
Application Number US2023018388
Publication Number 2023/239454
Status In Force
Filing Date 2023-04-12
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ye, Xiaoyong
  • Nakamura, Yuichiro

Abstract

A tracking system and method comprising a processor and a controller operably coupled to the processor. Two or more light sources are mounted in a known configuration with respect to each other and with respect to the controller body, and the two or more light sources are configured to flash in a predetermined time sequence. A dynamic vision sensor is configured to output signals corresponding to two or more events at two or more corresponding light-sensitive elements in an array in response to changes in light output from the two or more light sources, the signals indicating the times of the two or more events and the locations of the events among the light-sensitive elements in the array. The processor determines a position and orientation of the controller from the times and locations of the two or more events and the known configuration information.

IPC Classes

  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G02B 27/01 - Head-up displays
  • G06N 3/02 - Neural networks
  • G06T 7/20 - Analysis of motion
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

88.

ASYNCHRONOUS DYNAMIC VISION SENSOR LED AI TRACKING SYSTEM AND METHOD

      
Application Number US2023018389
Publication Number 2023/239455
Status In Force
Filing Date 2023-04-12
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ye, Xiaoyong
  • Nakamura, Yuichiro

Abstract

A tracking system uses a dynamic vision sensor (DVS) operably coupled to the processor. The DVS has an array of light-sensitive elements in a known configuration. The DVS outputs signals corresponding to two or more events at two or more corresponding light-sensitive elements in the array in response to changes in light output from two or more light sources mounted in a known configuration with respect to each other and with respect to a controller. The output signals indicate times of the events and locations of the corresponding light-sensitive elements. The processor determines an association between each event and two or more corresponding light sources and fits a position and orientation of the controller using the determined association, the known configuration of the light sources and the locations of the corresponding light-sensitive elements in the array.

IPC Classes

  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • G06T 7/50 - Depth or shape recovery
  • H04N 23/69 - Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
  • H04N 23/74 - Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

89.

DEPLOYMENT OF DYNAMIC VISION SENSOR HYBRID ELEMENT IN METHOD FOR TRACKING A CONTROLLER AND SIMULTANEOUS BODY TRACKING, SLAM OR SAFETY SHUTTER

      
Application Number US2023018390
Publication Number 2023/239456
Status In Force
Filing Date 2023-04-12
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ye, Xiaoyong
  • Nakamura, Yuichiro

Abstract

A position and orientation of a controller are determined from a known configuration of two or more light sources with respect to each other and with respect to a controller body and from output signals from a dynamic vision sensor (DVS) generated in response to changes in light output from the light sources. The output signals indicate the times of events at corresponding light-sensitive elements in an array in the DVS and the array locations of those light-sensitive elements. A position and orientation of one or more objects are determined from signals generated by two or more light-sensitive elements resulting from other light reaching the two or more light-sensitive elements.

IPC Classes

  • A63F 13/20 - Input arrangements for video game devices
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/14 - Digital output to display device

90.

DYNAMIC VISION SENSOR TRACKING BASED ON LIGHT SOURCE OCCLUSION

      
Application Number US2023018392
Publication Number 2023/239457
Status In Force
Filing Date 2023-04-12
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ye, Xiaoyong
  • Nakamura, Yuichiro

Abstract

A tracking system includes a processor, a controller, two or more light sources and a dynamic vision sensor (DVS). The light sources are of known configuration with respect to each other and to the controller and turn on and off in a predetermined sequence. The DVS includes an array of light-sensitive elements of known configuration. The DVS outputs signals corresponding to events at corresponding light-sensitive elements in the array in response to changes in light from the light sources. The signals indicate times of the events and locations of the corresponding light-sensitive elements. The processor determines an association between each event and one or more of the light sources and, from that association, determines an occlusion of one or more of the light sources. The processor estimates a location of an object using the determined occlusion, the known light source configuration, and the locations of the corresponding light-sensitive elements in the array.
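As a rough, purely illustrative sketch of the idea in this abstract (not the patented method; the scheduling format, tolerance, and centroid estimate below are all hypothetical simplifications), a source whose scheduled flash produces no DVS event can be flagged as occluded, and the occluded sources' known positions then drive a crude location estimate:

```python
# Hypothetical sketch: light sources flash on a known schedule, so a
# scheduled flash with no matching DVS event suggests that source is
# occluded by an object between the source and the sensor.

def occluded_sources(schedule, observed_events, tolerance=1):
    """schedule: {source_id: flash_time}; observed_events: list of
    (time, pixel) tuples. A source is occluded if no event falls within
    `tolerance` of its scheduled flash time."""
    occluded = []
    for source_id, t_flash in schedule.items():
        if not any(abs(t - t_flash) <= tolerance for (t, _pixel) in observed_events):
            occluded.append(source_id)
    return occluded

def estimate_object_x(occluded, source_positions):
    """Toy location estimate: centroid x of the occluded sources' known
    positions (a stand-in for the geometric fit described above)."""
    if not occluded:
        return None
    xs = [source_positions[s][0] for s in occluded]
    return sum(xs) / len(xs)
```

For instance, if sources A and C produce events at their scheduled times but B does not, B is reported occluded and the object estimate falls at B's known position.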

IPC Classes

  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

91.

VIRTUAL REALITY CONTENT DISPLAY SYSTEM AND VIRTUAL REALITY CONTENT DISPLAY METHOD

      
Application Number JP2022022808
Publication Number 2023/238197
Status In Force
Filing Date 2022-06-06
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Ohba Akio

Abstract

A VR content display system 10 comprises a head-mounted display (HMD 12), a storage device 14, a position determination unit, and a display control unit. In response to the HMD 12 being taken out of the storage device 14, the position determination unit starts determining whether the pupils of a user wearing the HMD 12 are at appropriate positions. In response to a determination that the pupils are at the appropriate positions, the display control unit causes the HMD 12 to start displaying virtual reality content.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

92.

APPLICATION PROCESS CONTEXT COMPRESSION AND REPLAY

      
Application Number US2023018385
Publication Number 2023/239452
Status In Force
Filing Date 2023-04-12
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Jason, Wang

Abstract

Application state data from a main memory may be compressed and the compressed data written to a first location in a mass storage. Updated application state data is generated, compressed from the main memory, and written to a second location in the mass storage. Processing may then be paused on the application state data and the updated application state data. The compressed application state data and the compressed updated application state data stored in the mass storage are scanned, and information corresponding to them is displayed using information from the scan.

IPC Classes

  • G06F 9/44 - Arrangements for executing specific programs
  • G06F 11/36 - Preventing errors by testing or debugging of software
  • G06F 9/00 - Arrangements for program control, e.g. control units

93.

DYNAMIC VISION SENSOR BASED EYE AND/OR FACIAL TRACKING

      
Application Number US2023018386
Publication Number 2023/239453
Status In Force
Filing Date 2023-04-12
Publication Date 2023-12-14
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Ye, Xiaoyong
  • Nakamura, Yuichiro

Abstract

A tracking system uses one or more light sources to direct light toward one or more of a user's eyes. A dynamic vision sensor (DVS) operably coupled to a processor is configured to view the user's eyes. The DVS has an array of light-sensitive elements in a known configuration. The DVS outputs signals corresponding to events at corresponding light-sensitive elements in the array in response to changes in light from the light sources reflecting from a portion of the user's eyes. The output signals include information corresponding to locations of the corresponding light-sensitive elements in the array. The processor determines an association between each event and a corresponding particular light source and fits an orientation of the user's eyes using the determined association, the relative location of the light sources with respect to the user's eyes and the locations of the corresponding light-sensitive elements in the array.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G07F 17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the other groups

94.

ELECTRONIC DEVICE, AND INFORMATION PROCESSING SYSTEM

      
Application Number JP2022022311
Publication Number 2023/233583
Status In Force
Filing Date 2022-06-01
Publication Date 2023-12-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Onishi Yusuke

Abstract

A power generation unit 30 generates power. A power storage unit 32 stores the power obtained by the power generation unit 30. An image sensor 16 captures an image of the surroundings of the electronic device. A display unit 12 displays an image corresponding to a subject captured by the image sensor 16. The image sensor 16 and the display unit 12 are driven by the power stored in the power storage unit 32.

IPC Classes

  • A63H 17/395 - Steering-mechanisms for toy vehicles steered by programme
  • A63H 17/40 - Toy vehicles automatically steering or reversing by collision with an obstacle
  • A63H 17/41 - Toy vehicles prevented from falling off the supporting surface by automatic steering or reversing

95.

INFORMATION PROCESSING APPARATUS FOR DRIVING OPERATION MEMBER

      
Application Number JP2022022493
Publication Number 2023/233624
Status In Force
Filing Date 2022-06-02
Publication Date 2023-12-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Yunoki, Hirotomo
  • Katsuma, Takayuki
  • Hamada, Koji
  • Togawa, Keiji
  • Miyazaki, Yoshio
  • Kuroda, Kouji
  • Mori, Hideki
  • Watanabe, Yusuke

Abstract

This information processing apparatus is connected to an operation device having an operation member on which a tilting operation can be performed and a driving unit that drives the operation member. The apparatus receives, from the operation device, operation information including the tilting direction and/or tilting angle of the operation member, sets a scene including at least one operation target object to be operated on the basis of the operation information, and issues a driving instruction for driving the operation member according to the state of the operation target object in the set scene.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

96.

HEAD-MOUNTED DISPLAY AND CONTENT DISPLAY SYSTEM

      
Application Number JP2023018929
Publication Number 2023/234097
Status In Force
Filing Date 2023-05-22
Publication Date 2023-12-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Nomura Masanori
  • Takahashi Yasuo
  • Yonetomi Shoi
  • Mulase Yurika

Abstract

A power supply button 110 and a function button 112 are arranged on the outer surface of the underside of a housing 108 of this head-mounted display (HMD) 100. The power supply button 110 switches on and off the supply of electric power from a prescribed power supply to each part of the HMD 100. Any of a plurality of functions relating to operation of the HMD 100 can be selectively assigned to the function button 112.

IPC Classes

  • H04N 5/64 - Constructional details of receivers, e.g. cabinets or dust covers

97.

METHOD FOR ADJUSTING NOISE CANCELLATION IN HEADPHONES BASED ON REAL-WORLD ACTIVITY OR GAME CONTEXT

      
Application Number US2023018839
Publication Number 2023/235063
Status In Force
Filing Date 2023-04-17
Publication Date 2023-12-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Manzon-Gutzman, Tatianna
  • Karp, Sarah

Abstract

A method is provided, including: executing an interactive application, wherein executing the interactive application includes rendering video and audio of a virtual environment, the video being presented on a display viewed by a user and the audio being presented through headphones worn by the user, and wherein executing the interactive application is responsive to user input generated from the user's interactivity with the presented video and audio; receiving environmental input from at least one sensor that senses a local environment in which the user is disposed; analyzing the environmental input to identify activity occurring in the local environment; and, responsive to identifying the activity, adjusting a level of active noise cancellation applied by the headphones.
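As a minimal, purely hypothetical sketch of the final step in this abstract (the activity labels and level values below are invented for illustration, not taken from the patent), detected local-environment activity can be mapped to a headphone noise-cancellation level:

```python
# Hypothetical sketch: map activity detected in the user's local
# environment to an active-noise-cancellation (ANC) level, lowering
# cancellation when a real-world event should be audible to the user.

ANC_LEVELS = {
    "none": 1.0,                      # nothing detected: full cancellation
    "doorbell": 0.3,                  # let the doorbell partially through
    "speech_directed_at_user": 0.0,   # disable ANC so the user can respond
}

def adjust_anc(detected_activity, current_level):
    """Return the ANC level for a detected activity; unknown activities
    leave the current level unchanged."""
    return ANC_LEVELS.get(detected_activity, current_level)
```

The mapping is the policy knob here; a real system would presumably derive it from game context as well as the sensed environment.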

IPC Classes

  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
  • A63F 13/655 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
  • G10K 11/178 - Methods or devices for protecting against, or for damping, noise or other acoustic waves in general using interference effects; Masking sound by electro-acoustically regenerating the original acoustic waves in anti-phase

98.

METHODS FOR EXAMINING GAME CONTEXT FOR DETERMINING A USER'S VOICE COMMANDS

      
Application Number US2023018840
Publication Number 2023/235064
Status In Force
Filing Date 2023-04-17
Publication Date 2023-12-07
Owner SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor Azmandian, Mahdi

Abstract

A method for executing a session of a video game is provided, including the following operations: recording speech of a player engaged in gameplay of the session of the video game; analyzing a game state generated by the execution of the session of the video game, wherein analyzing the game state identifies a context of the gameplay; analyzing the recorded speech using the identified context of the gameplay and a speech recognition model, to identify textual content of the recorded speech; applying the identified textual content as a gameplay input for the session of the video game.

IPC Classes

  • H04N 21/4363 - Adapting the video stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
  • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or inside the home
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
  • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
  • A63F 13/533 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

99.

SYSTEMS AND METHODS FOR AUTOMATED CUSTOMIZED VOICE FILTERING

      
Application Number US2023020519
Publication Number 2023/235084
Status In Force
Filing Date 2023-05-01
Publication Date 2023-12-07
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Zhang, Jin
  • Bean, Celeste
  • Karimi, Sepideh
  • Krishnamurthy, Sudha

Abstract

Systems and methods for audio processing are described. An audio processing system receives audio content that includes a voice sample. The audio processing system analyzes the voice sample to identify a sound type in the voice sample. The sound type corresponds to pronunciation of at least one specified character in the voice sample. The audio processing system generates a filtered voice sample at least in part by filtering the voice sample to modify the sound type. The audio processing system outputs the filtered voice sample.

IPC Classes

  • G06N 20/00 - Machine learning
  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit
  • G10L 15/14 - Speech classification or search using statistical models, e.g. Hidden Markov Models [HMM]
  • G10L 17/04 - Training, enrolment or model building
  • G10L 17/26 - Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
  • G10L 21/007 - Changing voice quality, e.g. pitch or formants characterised by the process used
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G10L 15/07 - Adaptation to the speaker
  • G10L 21/0216 - Noise filtering characterised by the method used for estimating noise
  • G10L 21/0264 - Noise filtering characterised by the type of parameter measurement, e.g. correlation techniques, zero crossing techniques or predictive techniques
  • G10L 25/27 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique

100.

COOPERATIVE AND COACHED GAMEPLAY

      
Application Number US2023020725
Publication Number 2023/235091
Status In Force
Filing Date 2023-05-02
Publication Date 2023-12-07
Owner
  • SONY INTERACTIVE ENTERTAINMENT LLC (USA)
  • SONY INTERACTIVE ENTERTAINMENT INC. (Japan)
Inventor
  • Rudi, Olga
  • Young, Kristine
  • Azmandian, Mahdi
  • Zhang, Jin
  • Karp, Sarah
  • Karimi, Sepideh
  • Juenger, Elizabeth

Abstract

Methods and systems for cooperative or coached gameplay in virtual environments are disclosed. Memory may store a content control profile regarding a set of control inputs associated with an action in a virtual environment of a digital content title. A request may be received from a set of one or more users associated with different source devices regarding cooperative gameplay of the digital content title. At least one virtual avatar may be generated for an interactive session of the digital content title in response to the request. A plurality of control inputs may be received from the plurality of different source devices and combined into a combination set of control inputs. Generating the combination set of control inputs may be based on the content control profile. Virtual actions associated with the virtual avatar may be controlled within the virtual environment in accordance with the combination set of control inputs.
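As one purely illustrative way to read the combination step (not the patented mechanism; the profile format and channel names below are hypothetical), a content control profile can assign input channels to different users, and each frame's inputs from their devices are merged into a single combined set driving the shared avatar:

```python
# Hypothetical sketch: a content control profile assigns input channels
# (e.g. movement vs. aiming) to different users; per-frame inputs from
# their devices are merged into one combination set for a shared avatar.

def combine_inputs(profile, inputs_by_user):
    """profile: {channel: user_id}; inputs_by_user: {user_id: {channel: value}}.
    Returns the combination set of control inputs for the shared avatar,
    taking each channel only from the user the profile assigns to it."""
    combined = {}
    for channel, user_id in profile.items():
        user_inputs = inputs_by_user.get(user_id, {})
        if channel in user_inputs:
            combined[channel] = user_inputs[channel]
    return combined
```

Note that inputs a user sends on a channel not assigned to them are simply dropped, which is one plausible reading of "generating the combination set ... based on the content control profile".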

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 15/02 - Digital computers in general; Data processing equipment in general manually operated with input through keyboard and computation using a built-in program, e.g. pocket calculators