Electronic Arts Inc.

United States of America

1-100 of 725 for Electronic Arts Inc.
Query
  • Patent
  • United States - USPTO
  • Excluding Subsidiaries
Date
  • New (last 4 weeks): 4
  • 2024 April (MTD): 3
  • 2024 March: 1
  • 2024 February: 2
  • 2024 January: 8
IPC Class
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions: 115
  • A63F 9/24 - Games using electronic circuits not otherwise provided for: 102
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers: 97
  • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers: 79
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use: 59
Status
  • Pending: 71
  • Registered / In Force: 654

1.

FRAUD DETECTION SYSTEM

      
Application Number 18487903
Status Pending
Filing Date 2023-10-16
First Publication Date 2024-04-04
Owner Electronic Arts Inc. (USA)
Inventor
  • Aghdaie, Navid
  • Kolen, John
  • Mattar, Mohamed Marwan
  • Sardari, Mohsen
  • Xue, Su
  • Zaman, Kazi Atif-Uz

Abstract

Embodiments of an automated fraud detection system are disclosed that can detect user accounts that are engaging in unauthorized activities within a game application. The fraud detection system can provide an automated system that identifies parasitic accounts. The fraud detection system may identify patterns using machine learning based on characteristics, such as gameplay and transaction characteristics, associated with the parasitic user accounts. The fraud detection system may generate a model that can be applied to existing accounts within the game in order to automatically identify users that are engaging in unauthorized activities. The fraud detection system may automatically identify these parasitic accounts and implement appropriate actions to prevent the accounts from impacting legitimate users within the game application.
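
As a rough illustration of the classification step this abstract describes, the sketch below trains a generic scikit-learn model on made-up gameplay and transaction features and flags high-scoring accounts for review; the feature names, threshold, and model choice are assumptions for illustration, not details from the filing.

```python
# Hypothetical sketch: flag "parasitic" accounts from gameplay/transaction features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy feature matrix: [sessions_per_day, avg_session_minutes, trades_per_day,
# currency_sent_ratio, account_age_days] -- invented feature names.
X = rng.normal(size=(1000, 5))
y = rng.integers(0, 2, size=1000)          # 1 = labelled parasitic, 0 = legitimate

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Score existing accounts; accounts above a review threshold would be queued
# for automated action (e.g. trade limits) or manual review.
suspicion = model.predict_proba(X_test)[:, 1]
flagged = suspicion > 0.9
print(f"flagged {flagged.sum()} of {len(flagged)} accounts for review")
```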

IPC Classes

  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • A63F 13/70 - Game security or game management aspects

2.

DYNAMIC STREAMING VIDEO GAME CLIENT

      
Application Number 18449626
Status Pending
Filing Date 2023-08-14
First Publication Date 2024-04-04
Owner Electronic Arts Inc. (USA)
Inventor Karlsson, Per Henrik Benny

Abstract

Embodiments of the present application provide a phased streaming system and process using a dynamic video game client. The dynamic video game client can utilize a state stream game engine in combination with a game application streaming service to provide users with the ability to begin playing games quickly on a huge range of devices.

IPC Classes

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world

3.

SEASONAL REWARD DISTRIBUTION SYSTEM

      
Application Number 18449628
Status Pending
Filing Date 2023-08-14
First Publication Date 2024-04-04
Owner Electronic Arts Inc. (USA)
Inventor Laker, Shaun Mackenzie

Abstract

The present disclosure provides a video game based seasonal reward distribution system. The seasonal reward system can provide users with a non-linear map that allows the users to choose how to progress through the reward map when advancing or leveling up a virtual character or user account within the video game. The virtual map can provide a visual representation of a non-linear pathway or tracks that a user can follow based on how the user would like to proceed and what types of rewards the user prefers to unlock. The reward map provides a series of reward nodes connected by links, resulting in a plurality of pathways or tracks that a user can select during advancement within the video game. The user can select individual reward nodes when the virtual character levels up and progress along a pathway on the virtual map.
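
A minimal sketch of the non-linear reward map as a data structure, assuming a simple directed graph of reward nodes and links; the node contents and the unlock rule below are invented for illustration.

```python
# Illustrative sketch (not from the patent): a non-linear reward map as a small
# directed graph; the player may advance to any node linked from an unlocked node.
from dataclasses import dataclass, field

@dataclass
class RewardNode:
    node_id: str
    reward: str
    links: list[str] = field(default_factory=list)   # ids of reachable nodes

reward_map = {
    "start": RewardNode("start", "starter pack", ["a1", "b1"]),
    "a1": RewardNode("a1", "emote", ["a2"]),
    "b1": RewardNode("b1", "weapon skin", ["a2", "b2"]),
    "a2": RewardNode("a2", "character outfit", []),
    "b2": RewardNode("b2", "currency bundle", []),
}

def selectable_nodes(unlocked: set[str]) -> set[str]:
    """Nodes the player may pick on the next level-up."""
    return {nxt for nid in unlocked for nxt in reward_map[nid].links} - unlocked

unlocked = {"start"}
print(selectable_nodes(unlocked))   # {'a1', 'b1'} -> the player chooses a track
```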

IPC Classes

  • A63F 13/69 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers

4.

Telemetric video processing

      
Application Number 17833484
Grant Number 11925873
Status In Force
Filing Date 2022-06-06
First Publication Date 2024-03-12
Grant Date 2024-03-12
Owner Electronic Arts Inc. (USA)
Inventor
  • Lucas, Alexander
  • Kriz, Mike
  • Frederick, Nathan

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for extracting information from videos. The method includes annotating portions of interest within screen captures. The method also includes receiving at least a first set of videos for a video game. The method also includes training a first machine-learning model to identify the portions of interest within the first set of videos. The method also includes generating validation data based on results of the first machine-learning model. The method also includes extracting information based on the portions of interest identified in the first set of videos.

IPC Classes

  • A63F 13/86 - Watching games played by other players
  • A63F 13/537 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
  • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
  • G06F 18/22 - Matching criteria, e.g. proximity measures
  • G06V 20/40 - Scenes; Scene-specific elements in video content
  • H04N 19/42 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals - characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation

5.

Terrain search

      
Application Number 17823588
Grant Number 11961183
Status In Force
Filing Date 2022-08-31
First Publication Date 2024-02-29
Grant Date 2024-04-16
Owner Electronic Arts Inc. (USA)
Inventor Canelhas, Daniel Ricão

Abstract

A system may provide for searching terrain data of real-world locations based on input representing a terrain for a game world. The system may receive terrain inquiry data including height data for terrain of a game world, generate an inquiry descriptor based on the terrain inquiry data at least in part by applying a plurality of filters to the terrain inquiry data, the inquiry descriptor including a plurality of inquiry descriptor values corresponding to the plurality of filters, and determine, based on the inquiry descriptor and respective sample descriptors of one or more terrain samples corresponding to terrain of real-world locations, one or more matching terrain samples.
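
The descriptor-matching idea lends itself to a short sketch: each height map is reduced to a descriptor by a bank of filters, and the query is matched to the nearest real-world sample. The filter bank below is a stand-in, not the filters claimed in the patent.

```python
# Rough sketch with made-up filters: reduce each terrain height map to a small
# descriptor vector, then match a query against real-world samples by distance.
import numpy as np

def descriptor(heights: np.ndarray) -> np.ndarray:
    """Toy filter bank: relief, mean slopes, and steepest gradient."""
    gy, gx = np.gradient(heights)
    return np.array([
        heights.std(),            # overall relief
        np.abs(gx).mean(),        # mean east-west slope
        np.abs(gy).mean(),        # mean north-south slope
        np.hypot(gx, gy).max(),   # steepest gradient
    ])

rng = np.random.default_rng(1)
samples = {f"tile_{i}": rng.random((64, 64)) for i in range(100)}   # real-world tiles
sample_desc = {k: descriptor(v) for k, v in samples.items()}

query = rng.random((64, 64))      # height data sketched for the game world
q = descriptor(query)
best = min(sample_desc, key=lambda k: np.linalg.norm(sample_desc[k] - q))
print("closest real-world terrain sample:", best)
```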

IPC Classes

  • G06T 17/05 - Geographic models
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

6.

Automated Validation of Video Game Environments

      
Application Number 17983046
Status Pending
Filing Date 2022-11-08
First Publication Date 2024-02-15
Owner Electronic Arts Inc. (USA)
Inventor
  • Sestini, Alessandro
  • Gisslén, Linus
  • Bergdahl, Joakim

Abstract

This specification provides a computer-implemented method comprising providing, by a user, one or more demonstrations of a video game entity interacting with a video game environment to achieve a goal. The method further comprises generating, from the one or more demonstrations, one or more training examples. The method further comprises training an agent to control the video game entity using a neural network. The training comprises generating one or more predicted actions for each training example by processing, using the neural network, input data derived from the training example, and updating parameters of the neural network based on a comparison between the one or more predicted actions of the training examples and the one or more corresponding target actions of the training examples. The method further comprises performing validation of the video game environment, comprising controlling the video game entity in the video game environment using the trained agent.

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/69 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 3/08 - Learning methods

7.

Emotion based music style change using deep learning

      
Application Number 18179725
Grant Number 11896902
Status In Force
Filing Date 2023-03-07
First Publication Date 2024-02-13
Grant Date 2024-02-13
Owner Electronic Arts Inc. (USA)
Inventor
  • Sheng, Jie
  • Yigit, Hulya Duygu
  • Zhao, Chong

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for changing music of a video game based on a player's emotion. The method includes receiving indicators of emotion comprising in-game attributes of a player in a video game. The method also includes predicting an emotion of the player based on the indicators of emotion from the video game. The method also includes receiving original music from the video game. The method also includes determining an original tone of the original music. The method also includes determining a transformed tone based at least in part on the emotion of the player that was predicted. The method also includes transforming the original tone of the original music to the transformed tone. The method also includes generating transformed music from the original music based on the transformed tone.

IPC Classes

  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment

8.

Live reverb metrics system

      
Application Number 17486629
Grant Number 11878246
Status In Force
Filing Date 2021-09-27
First Publication Date 2024-01-23
Grant Date 2024-01-23
Owner Electronic Arts Inc. (USA)
Inventor
  • Smith, Matthew David
  • Elmoznino, Hugo David
  • Martel, Giselle Olivia

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for rendering audio via a game engine for a game. Various aspects may include determining sound source reverb metrics and listener reverb metrics. Aspects may include determining reverbs within a reverb possibility space for all rooms or spaces of the game rendered by the game engine. Aspects may also include determining sound tuning parameters describing reverb attenuation over distance. Aspects may include calculating acoustic parameters based on the reverb metrics, relative positions, and sound tuning parameters. Aspects may include rendering audio according to a fit of determined reverbs to the acoustic parameters.

IPC Classes

  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

9.

Adversarial Reinforcement Learning for Procedural Content Generation and Improved Generalization

      
Application Number 18474863
Status Pending
Filing Date 2023-09-26
First Publication Date 2024-01-18
Owner Electronic Arts Inc. (USA)
Inventor
  • Gisslén, Linus Mathias
  • Eakins, Andrew John

Abstract

Methods, apparatus and systems are provided for training a first reinforcement-learning (RL) agent and a second RL agent coupled to a computer game environment using RL techniques. The first RL agent iteratively generates a sub-goal sequence in relation to an overall goal within the computer game environment, where the first RL agent generates a new sub-goal for the sub-goal sequence after a second RL agent, interacting with the computer game environment, successfully achieves a current sub-goal in the sub-goal sequence. The second RL agent iteratively interacts with the computer game environment to achieve the current sub-goal in which each iterative interaction includes an attempt by the second RL agent for interacting with the computer game environment to achieve the current sub-goal. The first RL agent is updated using a first reward issued when the second RL agent successfully achieves the current sub-goal. The second RL agent is updated when a second reward is issued by the computer game environment based on the performance of the second RL agent attempting to achieve said current sub-goal. Once validly trained, the first RL agent forms a final first RL agent for automatic procedural content generation (PCG) in the computer game environment and the second RL agent forms a final second RL agent for automatically interacting with a PCG computer game environment.
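
A structural sketch of the alternating generator/solver loop described above, with toy stand-in agents in place of real reinforcement-learning updates; the class names, rewards, and success model are invented for illustration.

```python
# Structural sketch only: stub agents stand in for trained RL policies.
import random

class GeneratorAgent:
    """Stands in for the first RL agent: proposes the next sub-goal."""
    def __init__(self):
        self.difficulty = 0.1
    def propose_subgoal(self, progress):
        return min(1.0, progress + self.difficulty)
    def update(self, reward):                 # placeholder for a real policy update
        self.difficulty = min(0.5, self.difficulty + 0.01 * reward)

class SolverAgent:
    """Stands in for the second RL agent: tries to reach the current sub-goal."""
    def __init__(self):
        self.skill = 0.5
    def attempt(self, progress, subgoal):
        return random.random() < self.skill / (0.1 + subgoal - progress)
    def update(self, reward):                 # placeholder for a real policy update
        self.skill = min(0.95, self.skill + 0.01 * max(reward, 0.0))

def train(episodes=200, max_subgoals=10):
    generator, solver = GeneratorAgent(), SolverAgent()
    for _ in range(episodes):
        progress = 0.0                                     # start of the overall goal
        for _ in range(max_subgoals):
            subgoal = generator.propose_subgoal(progress)  # new sub-goal in the sequence
            success = solver.attempt(progress, subgoal)
            solver.update(1.0 if success else -0.1)        # reward issued by the environment
            if not success:
                break                                      # generator only updates on success
            generator.update(1.0)                          # first reward: sub-goal achieved
            progress = subgoal
            if progress >= 1.0:
                break                                      # overall goal reached
    return generator, solver

train()
```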

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06N 3/08 - Learning methods
  • A63F 13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
  • G06N 3/045 - Combinations of networks

10.

DETECTING COLLUSION IN ONLINE GAMES

      
Application Number 18363399
Status Pending
Filing Date 2023-08-01
First Publication Date 2024-01-18
Owner Electronic Arts Inc. (USA)
Inventor
  • Greige, Laura
  • Trotter, Meredith
  • Narravula, Sundeep
  • Aghdaie, Navid
  • De Mesentier Silva, Fernando

Abstract

A collusion detection system may detect collusion between entities participating in online gaming. The collusion detection system may identify a plurality of entities associated with, and opponents within, an instance of an online game, determine social data associated with the plurality of entities, determine in-game behavior data associated with the plurality of entities, and determine, for one or more pairings of the plurality of entities, respective pairwise feature sets based at least in part on the social data and the in-game behavior data. The collusion detection system may then perform anomaly detection on the respective pairwise feature sets and, in response to the anomaly detection detecting one or more anomalous pairwise feature sets, output one or more suspect pairings of the plurality of entities corresponding to the one or more anomalous pairwise feature sets as suspected colluding pairings.
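
A hedged sketch of the pairwise anomaly-detection step, using scikit-learn's IsolationForest on invented pairwise features; the feature definitions are placeholders for the social and in-game behavior data the abstract refers to.

```python
# Sketch: anomaly detection over pairwise feature sets for one match instance.
import numpy as np
from itertools import combinations
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
entities = [f"player_{i}" for i in range(20)]     # opponents in one match instance

def pairwise_features(a: str, b: str) -> list[float]:
    # [are_friends, mutual_damage_ratio, proximity_seconds, kill_trades] -- invented
    return list(rng.random(4))                    # stand-in for social + behaviour data

pairs = list(combinations(entities, 2))
X = np.array([pairwise_features(a, b) for a, b in pairs])

detector = IsolationForest(random_state=0).fit(X)
labels = detector.predict(X)                      # -1 marks an anomalous pairing

suspects = [pair for pair, label in zip(pairs, labels) if label == -1]
print("suspected colluding pairings:", suspects[:5])
```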

IPC Classes

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers

11.

CONTEXTUALLY AWARE COMMUNICATIONS SYSTEM IN VIDEO GAMES

      
Application Number 18331065
Status Pending
Filing Date 2023-06-07
First Publication Date 2024-01-11
Owner Electronic Arts Inc. (USA)
Inventor
  • Alderman, Kevin Todd
  • Glenn, John Preston
  • Mcleod, Brent Travis
  • Pineda, Carlos Emmanuel Reyes
  • Vinson, Rayme Christopher

Abstract

In response to receiving a user input command for sending a contextually aware communication, a computer system is configured to use game state data to determine a target location that a player is focusing on in a virtual environment in a video game, identify a unit that the player likely wants to communicate about based on at least priorities of unit types and proximities of units to the target location, and select a communication action for performance. Different communication actions can be performed in response to the same user input command when the game state data indicates different game states.
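
One way to picture the target-selection step is a simple scoring function that weighs unit-type priority against proximity to the focus point; the priorities, weights, and data layout below are assumptions, not the patented method.

```python
# Illustrative scoring sketch: pick the unit the player most likely means to ping.
import math

TYPE_PRIORITY = {"enemy": 3.0, "item": 2.0, "door": 1.0}     # hypothetical priorities

def pick_ping_target(target_location, units, max_range=15.0):
    """units: list of dicts with 'type' and 'position' (x, y) keys."""
    best, best_score = None, 0.0
    for unit in units:
        dist = math.dist(target_location, unit["position"])
        if dist > max_range:
            continue
        score = TYPE_PRIORITY.get(unit["type"], 0.5) / (1.0 + dist)
        if score > best_score:
            best, best_score = unit, score
    return best    # None -> fall back to a plain location ping

units = [{"type": "enemy", "position": (4.0, 2.0)},
         {"type": "item", "position": (1.0, 1.0)}]
print(pick_ping_target((0.0, 0.0), units))
```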

IPC Classes

  • A63F 13/5372 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
  • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
  • A63F 13/5378 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps

12.

OPEN GAME ENGINE AND MARKETPLACE WITH ASSOCIATED GAME EDITING AND CREATION TOOLS

      
Application Number 18331091
Status Pending
Filing Date 2023-06-07
First Publication Date 2024-01-11
Owner Electronic Arts Inc. (USA)
Inventor
  • Bererton, Curt Alexander
  • Bererton, Thomas Andrew
  • Deleon, Christopher Lee
  • Hershberger, David Lee
  • Nesky, John Michael
  • Pignol, Mathilde Elodie
  • Wagner, Joshua Alan

Abstract

The present invention provides a video game development platform. More specifically, aspects of the invention relate to components of applications such as video games, including the source code, graphics, sounds, and animations, as well as a marketplace where any of the above can be traded for currency, tokens, or credits, or given to other people. These components can then be combined, using game editing and creation tools, to make video games. Users can create and edit games, either of their own or based on other users' preexisting games, and can share their games with others. Game components may be bought, sold, traded, or otherwise distributed through an online marketplace.

IPC Classes

  • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
  • A63F 13/71 - Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players

13.

AUTOMATIC HEAD POSE NEUTRALIZATION AND BLEND SHAPE GENERATION

      
Application Number 18472581
Status Pending
Filing Date 2023-09-22
First Publication Date 2024-01-11
Owner Electronic Arts Inc. (USA)
Inventor
  • Phan, Hau Nghiep
  • Bolduc, Mathieu Marquis

Abstract

A system may perform head pose neutralization on an input mesh to produce a neutral mesh and/or determine blend shapes for the neutral mesh. The system may generate a neutral mesh based on an input mesh and a reference mesh and then generate a blend shape associated with the neutral mesh based at least in part on one or more reference neutral meshes and one or more corresponding reference blend shapes.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation

14.

LIVE GAMEPLAY UPDATES

      
Application Number 18363337
Status Pending
Filing Date 2023-08-01
First Publication Date 2024-01-04
Owner Electronic Arts Inc. (USA)
Inventor
  • Victor, Nitish
  • Aghdaie, Navid
  • Chaput, Harold Henry
  • Narravula, Sundeep
  • Zaman, Kazi Atif-Uz

Abstract

A system and method for providing live gameplay updates receives a modification for a videogame, the modification affecting a gameplay aspect of the videogame during execution of the videogame. The system and method determine a target group for deploying the modification. The target group includes first one or more live instances of the videogame. The system and method deploy the modification to the target group and receive gameplay data associated with the gameplay aspect from the target group. The system and method deploy the modification to second one or more live instances of the videogame based at least in part on an analysis of the received gameplay data.

IPC Classes

  • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers

15.

Automated controller configuration recommendation system

      
Application Number 17894082
Grant Number 11857868
Status In Force
Filing Date 2022-08-23
First Publication Date 2024-01-02
Grant Date 2024-01-02
Owner Electronic Arts Inc. (USA)
Inventor Kestell, Stephen Roger

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for adjusting controller settings. The method includes receiving, through a controller associated with a user, controller input for software. The method also includes determining, based on the controller input, a user profile for the user comprising at least a skill level and an input tendency of the user. The method also includes providing suggested adjustments to the controller settings intended to improve performance of the user in relation to the software, the controller settings comprising at least one of controller sensitivity or controller assignments. The method also includes receiving approval of the user to implement the suggested adjustments to the controller settings. The method also includes adjusting the controller settings based on the approval of the user.

IPC Classes

  • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment
  • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
  • G06N 20/00 - Machine learning

16.

CREATING AND EXPORTING GRAPHICAL USER INTERFACES FOR VIDEO GAME RUNTIME ENVIRONMENTS

      
Application Number 18341683
Status Pending
Filing Date 2023-06-26
First Publication Date 2023-12-28
Owner Electronic Arts Inc. (USA)
Inventor
  • Popa, Adrian-Ciprian
  • Cowan, Timothy J.
  • Hayes, Jonathan Douglas

Abstract

Systems and methods for creating graphical user interfaces (GUIs) for runtime execution in virtual environments of software, such as video games. The system utilizes mock GUIs, which can be images illustrating or displaying mocked graphical user interfaces, to create GUIs that can be exported into runtime environments of software. The system creates GUIs by analyzing the graphical elements and attributes of mock GUIs, and assigning functionality to those graphical elements, enabling the operating of the GUIs within executable runtime environments.

IPC Classes

  • G06F 9/451 - Execution arrangements for user interfaces
  • A63F 13/533 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu

17.

UI state identification, navigation and crawling

      
Application Number 18192009
Grant Number 11846969
Status In Force
Filing Date 2023-03-29
First Publication Date 2023-12-19
Grant Date 2023-12-19
Owner Electronic Arts Inc. (USA)
Inventor Ayucar, Iñaki

Abstract

A UI state crawler system may allow for the crawling of a video game UI that may identify and map UI states of the video game. The UI state crawler system may determine a user interface (UI) state identifier (ID) for a UI state of a UI of a video game based at least in part on a plurality of node IDs corresponding to a plurality of nodes of a hierarchical structure of the UI and determine a UI state map does not include a UI state map node corresponding to the UI state ID. In response to the determining the UI state map does not include the UI state map node corresponding to the UI state ID, the UI state crawler system may generate an updated UI state map including the UI state map node corresponding to the UI state ID.
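
A minimal sketch of the state-ID and state-map bookkeeping, assuming the state ID is derived by hashing the UI hierarchy's node IDs and the map is a dictionary of states and transitions; all names and details below are illustrative.

```python
# Sketch: derive a UI state ID from node IDs and extend the state map when unseen.
import hashlib

def ui_state_id(node_ids: list[str]) -> str:
    """Derive a stable state ID from the hierarchy's node IDs."""
    digest = hashlib.sha1("/".join(sorted(node_ids)).encode()).hexdigest()
    return digest[:12]

state_map: dict[str, dict] = {}      # state ID -> state-map node

def record_state(node_ids: list[str], reached_from: str | None = None) -> str:
    sid = ui_state_id(node_ids)
    if sid not in state_map:                         # unseen state: extend the map
        state_map[sid] = {"nodes": node_ids, "edges": set()}
    if reached_from is not None:
        state_map[reached_from]["edges"].add(sid)    # remember the transition
    return sid

main = record_state(["root", "main_menu", "play_button", "settings_button"])
record_state(["root", "settings", "audio_tab", "back_button"], reached_from=main)
print(len(state_map), "UI states mapped")
```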

IPC Classes

  • G06F 8/38 - Creation or generation of source code for implementing user interfaces
  • G06F 11/36 - Preventing errors by testing or debugging of software

18.

ENHANCED POSE GENERATION BASED ON GENERATIVE MODELING

      
Application Number 18316128
Status Pending
Filing Date 2023-05-11
First Publication Date 2023-12-14
Owner Electronic Arts Inc. (USA)
Inventor Akhoundi, Elaheh

Abstract

Systems and methods are provided for enhanced pose generation based on generative modeling. An example method includes accessing an autoencoder trained based on poses of real-world persons, each pose being defined based on location information associated with joints, with the autoencoder being trained to map an input pose to a feature encoding associated with a latent feature space. Information identifying, at least, a first pose and a second pose associated with a character configured for inclusion in an in-game world is obtained via user input, with each of the poses being defined based on location information associated with the joints and with the joints being included on a skeleton associated with the character. Feature encodings associated with the first pose and the second pose are generated based on the autoencoder. Output poses are generated based on transition information associated with the first pose and the second pose.
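
A hedged PyTorch sketch of the latent-space blending idea: encode two poses with a small autoencoder and decode interpolated feature encodings as transition poses. The architecture, joint count, and absence of training are placeholders, not the system described in the filing.

```python
# Sketch: pose autoencoder + latent interpolation between two input poses.
import torch
import torch.nn as nn

NUM_JOINTS = 22                      # assumed skeleton size
POSE_DIM = NUM_JOINTS * 3            # xyz location per joint
LATENT_DIM = 16

class PoseAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(POSE_DIM, 128), nn.ReLU(),
                                     nn.Linear(128, LATENT_DIM))
        self.decoder = nn.Sequential(nn.Linear(LATENT_DIM, 128), nn.ReLU(),
                                     nn.Linear(128, POSE_DIM))

    def forward(self, pose):
        return self.decoder(self.encoder(pose))

model = PoseAutoencoder()            # in practice, trained on real-world poses

pose_a = torch.randn(POSE_DIM)       # stand-ins for the two user-provided poses
pose_b = torch.randn(POSE_DIM)

with torch.no_grad():
    z_a, z_b = model.encoder(pose_a), model.encoder(pose_b)
    # Output poses along the transition: decode interpolated feature encodings.
    transition = [model.decoder(torch.lerp(z_a, z_b, t))
                  for t in torch.linspace(0.0, 1.0, steps=8)]
print(len(transition), "in-between poses generated")
```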

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/655 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
  • G06N 3/088 - Non-supervised learning, e.g. competitive learning
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06F 18/40 - Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06N 3/045 - Combinations of networks
  • G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting

19.

Neural Synthesis of Sound Effects Using Deep Generative Models

      
Application Number 18106335
Status Pending
Filing Date 2023-02-06
First Publication Date 2023-12-07
Owner Electronic Arts Inc. (USA)
Inventor
  • Villanueva Aylagas, Monica
  • Jansson, Albin
  • Andreu, Sergi

Abstract

This specification relates to generating variations of in-game sound effects using machine-learned models. According to a first aspect of this specification, there is described a computer-implemented method of training a machine-learned generative model to generate sound effect variations. The method comprises, for each of a plurality of training examples in a set of training examples, each training example comprising a waveform of a sound effect: generating a low-dimensional representation of the waveform of the sound effect; inputting the waveform of the sound effect and the low-dimensional representation of the waveform of the sound effect into the generative model; processing, by the generative model, the input waveform of the sound effect and the low-dimensional representation of the waveform of the sound effect to generate a sample from an output distribution; and updating parameters of the generative model using an objective function based on a distribution of the input samples.

IPC Classes

  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • G06N 3/0475 - Generative networks
  • G06N 3/08 - Learning methods

20.

Training Action Prediction Machine-Learning Models for Video Games with Healed Data

      
Application Number 18451621
Status Pending
Filing Date 2023-08-17
First Publication Date 2023-12-07
Owner Electronic Arts Inc. (USA)
Inventor
  • Gordon, William
  • Keltner, Kasey
  • Leaf, Shawn

Abstract

This specification provides a computer-implemented method, the method comprising obtaining a machine-learning model. The machine-learning model is trained with expert data comprising a plurality of training examples. Each training example comprises: (i) game state data representing a state of a video game environment, and (ii) scored action data representing an action and a score for that action if performed by a video game entity of the video game environment subsequent to the state of the video game environment. An action is performed by the video game entity based on a prediction for the action generated by the machine-learning model. The method further comprises determining whether the action performed by the video game entity was optimal. In response to determining that the action performed by the video game entity was suboptimal, a healed training example is generated. The healed training example comprises: (i) the state of the instance of the video game environment, and (ii) healed scored action data indicating that the action performed by the video game entity was suboptimal. The machine-learning model is updated based on the healed training example.
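
The healing step can be pictured with a short sketch: when an action the model chose turns out to be suboptimal, the example is re-labeled with a low score and returned for retraining. The data shapes and penalty value are assumptions for illustration.

```python
# Sketch: re-label ("heal") a training example whose action proved suboptimal.
from dataclasses import dataclass

@dataclass
class TrainingExample:
    game_state: dict          # state of the video game environment
    action: str               # action taken by the game entity
    score: float              # scored action data (higher = better)

def heal(example: TrainingExample, was_optimal: bool,
         penalty_score: float = 0.0) -> TrainingExample:
    """Return the example unchanged if optimal, otherwise a healed copy."""
    if was_optimal:
        return example
    return TrainingExample(example.game_state, example.action, penalty_score)

predicted = TrainingExample({"ball_distance": 4.2}, "pass", score=0.9)
healed = heal(predicted, was_optimal=False)     # e.g. the pass was intercepted
print(healed)                                   # same state/action, low score
# Healed examples would then be appended to the expert data and the model updated.
```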

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game

21.

Generating Machine-Learned Inverse Rig Models

      
Application Number 18060133
Status Pending
Filing Date 2022-11-30
First Publication Date 2023-12-07
Owner Electronic Arts Inc. (USA)
Inventor
  • Marquis Bolduc, Mathieu
  • Phan, Hau

Abstract

A computer-implemented method for generating a machine-learned inverse rig model. The machine-learned inverse rig model outputs rig parameters for a facial animation rig used to generate animations of facial expressions in video games. The method comprises receiving one or more training examples. Each training example comprises target mesh data for a face portraying a particular facial expression. For each training example, predicted rig parameters are generated using an inverse rig model, comprising processing the target mesh data of the training example. Predicted mesh data is generated, comprising processing the predicted rig parameters using a forward rig model. A loss is determined, comprising performing a comparison between the predicted mesh data and the target mesh data. The inverse rig model is trained, comprising updating parameters of the inverse rig model based on the losses of one or more training examples.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 13/80 - 2D animation, e.g. using sprites
  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game

22.

ENHANCED ANIMATION GENERATION BASED ON VIDEO WITH LOCAL PHASE

      
Application Number 18329339
Status Pending
Filing Date 2023-06-05
First Publication Date 2023-12-07
Owner Electronic Arts Inc. (USA)
Inventor
  • Shi, Mingyi
  • Zhao, Yiwei
  • Starke, Wolfram Sebastian
  • Sardari, Mohsen
  • Aghdaie, Navid

Abstract

Embodiments of the systems and methods described herein provide a dynamic animation generation system that can apply a real-life video clip with a character in motion to a first neural network to receive rough motion data, such as pose information, for each of the frames of the video clip, and overlay the pose information on top of the video clip to generate a modified video clip. The system can identify a sliding window that includes a current frame, past frames, and future frames of the modified video clip, and apply the modified video clip to a second neural network to predict a next frame. The dynamic animation generation system can then move the sliding window to the next frame while including the predicted next frame, and apply the new sliding window to the second neural network to predict the following frame to the next frame.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/20 - Analysis of motion
  • H04N 5/272 - Means for inserting a foreground image in a background image, i.e. inlay, outlay

23.

Neural animation layering for synthesizing martial arts movements

      
Application Number 17305214
Grant Number 11830121
Status In Force
Filing Date 2021-07-01
First Publication Date 2023-11-28
Grant Date 2023-11-28
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Zhao, Yiwei
  • Sardari, Mohsen
  • Chaput, Harold Henry
  • Aghdaie, Navid

Abstract

In some embodiments, the dynamic animation generation system can provide a deep learning framework to produce a large variety of martial arts movements in a controllable manner from unstructured motion capture data. The system can imitate animation layering using neural networks with the aim to overcome challenges when mixing, blending and editing movements from unaligned motion sources. The system can synthesize movements from given reference motions and simple user controls, generate unseen sequences of locomotion, and also reconstruct signature motions of different fighters. For achieving this task, the dynamic animation generation system can adopt a modular framework composed of a motion generator, which maps the trajectories of a number of key joints and the root trajectory to the full-body motion, and a set of different control modules that map the user inputs to such trajectories.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 15/50 - Lighting effects
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06N 3/02 - Neural networks
  • A63F 13/573 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact

24.

SYSTEMS AND METHODS FOR A NETWORK-BASED VIDEO GAME APPLICATION

      
Application Number 18225540
Status Pending
Filing Date 2023-07-24
First Publication Date 2023-11-16
Owner Electronic Arts Inc. (USA)
Inventor Yoo, Youngseok

Abstract

Embodiments of the systems and methods disclosed herein provide a game that includes at least two users that each have a preconfigured set of playable virtual entities. The users draw playable virtual entities from each of their respective sets of playable virtual entities, and can play the playable virtual entities in one or more battles to facilitate moving a virtual positional marker towards a goal or shooting a goal, wherein the battle procedures are based on a turn priority. Certain attributes are associated with the playable virtual entities so that playable virtual entity effects, OVR values, and salary values can affect which playable virtual entities are in a deck and how the playable virtual entities are affected by the location of the virtual positional marker and/or other playable virtual entities.

IPC Classes

  • A63F 13/795 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for providing a buddy list
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers

25.

MOTION CAPTURE USING LIGHT POLARIZATION

      
Application Number 18128292
Status Pending
Filing Date 2023-03-30
First Publication Date 2023-11-02
Owner Electronic Arts Inc. (USA)
Inventor Hejl, Jr., James Nunn

Abstract

A system may perform motion capture using polarized light. For example, the system may determine one or more filtered pixels having an associated degree of linear polarization above a threshold from a plurality of pixels of an image captured by a polarization camera and determine a set of pixels of the filtered pixels are associated with a polarized tag, the polarized tag being on a subject of motion capture. The system may then determine an orientation of the polarized tag based on one or more angles of polarization of the pixels associated with the polarized tag and generate a motion capture pose for a model based on a location of the set of pixels of the filtered pixels in the image and the orientation of the polarized tag.
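
A numerical sketch using standard polarization-imaging formulas (Stokes parameters from a four-angle polarization camera) rather than anything specific to the filing: compute degree and angle of linear polarization per pixel, then keep pixels whose degree of linear polarization clears a threshold as candidate tag pixels.

```python
# Sketch: filter pixels by degree of linear polarization (DoLP) and estimate
# tag orientation from the angle of linear polarization (AoLP).
import numpy as np

def stokes_from_polarizer_images(i0, i45, i90, i135):
    """Stokes parameters from intensities behind 0/45/90/135 degree filters."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def dolp_and_aolp(i0, i45, i90, i135):
    s0, s1, s2 = stokes_from_polarizer_images(i0, i45, i90, i135)
    dolp = np.hypot(s1, s2) / np.maximum(s0, 1e-6)     # degree of linear polarization
    aolp = 0.5 * np.arctan2(s2, s1)                    # angle of linear polarization
    return dolp, aolp

rng = np.random.default_rng(0)
frames = [rng.random((480, 640)) for _ in range(4)]    # stand-in camera channels
dolp, aolp = dolp_and_aolp(*frames)

tag_mask = dolp > 0.4                    # pixels likely belonging to a polarized tag
tag_orientation = aolp[tag_mask].mean()  # crude estimate of the tag's orientation
print(tag_mask.sum(), "tag pixels, mean AoLP:", tag_orientation)
```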

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

26.

Generating speech in the voice of a player of a video game

      
Application Number 17082266
Grant Number 11790884
Status In Force
Filing Date 2020-10-28
First Publication Date 2023-10-17
Grant Date 2023-10-17
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Shakeri, Zahra
  • Pinto, Jervis
  • Gupta, Kilol
  • Sardari, Mohsen
  • Chaput, Harold
  • Aghdaie, Navid
  • Moss, Kenneth

Abstract

A computer-implemented method of generating speech audio in a video game is provided. The method includes inputting, into a synthesizer module, input data that represents speech content. Source acoustic features for the speech content in the voice of a source speaker are generated and are input, along with a speaker embedding associated with a player of the video game into an acoustic feature encoder of a voice convertor. One or more acoustic feature encodings are generated as output of the acoustic feature encoder, which are inputted into an acoustic feature decoder of the voice convertor to generate target acoustic features. The target acoustic features are processed with one or more modules, to generate speech audio in the voice of the player.

IPC Classes

  • G10L 13/047 - Architecture of speech synthesisers
  • G10L 13/033 - Voice editing, e.g. manipulating the voice of the synthesiser
  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • G10L 21/007 - Changing voice quality, e.g. pitch or formants characterised by the process used
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone

27.

In-game Physics with Affine Bodies

      
Application Number 17710355
Status Pending
Filing Date 2022-03-31
First Publication Date 2023-10-05
Owner Electronic Arts Inc. (USA)
Inventor Lewin, Christopher Charles

Abstract

This specification relates to in-game physics simulations, and in particular to the simulation of in-game objects as affine bodies. According to a first aspect of this disclosure, there is described a computer-implemented method comprising: representing a first in-game object in a plurality of in-game objects using an affine body representation, wherein the affine body representation comprises a translation vector and a deformation matrix; and updating an in-game physics state of the plurality of in-game objects, the in-game physics state comprising a current translation vector and a current deformation matrix of the first in-game object. The updating comprises: determining one or more constraints to the plurality of in-game objects; determining one or more relaxed constraints by relaxing the one or more constraints based on a stiffness of the first in-game object; and determining an updated in-game physics state of the first in-game object, comprising solving the one or more relaxed constraints and equations of motion for the plurality of in-game objects.
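
A data-layout sketch of the affine body representation, a translation vector plus a 3x3 deformation matrix packed into a 12-degree-of-freedom state; the constraint relaxation and solve are omitted, and the field names are assumptions.

```python
# Sketch: an affine body as translation + deformation, with a 12-DoF state vector.
import numpy as np
from dataclasses import dataclass, field

@dataclass
class AffineBody:
    translation: np.ndarray = field(default_factory=lambda: np.zeros(3))
    deformation: np.ndarray = field(default_factory=lambda: np.eye(3))
    stiffness: float = 1e6                     # used when relaxing its constraints

    def world_point(self, rest_point: np.ndarray) -> np.ndarray:
        """Map a rest-pose point into the world: x = A @ p + t."""
        return self.deformation @ rest_point + self.translation

    def state(self) -> np.ndarray:
        """12-DoF state used when solving the equations of motion."""
        return np.concatenate([self.translation, self.deformation.ravel()])

crate = AffineBody(translation=np.array([0.0, 1.0, 0.0]))
print(crate.world_point(np.array([0.5, 0.5, 0.5])))
print(crate.state().shape)        # (12,)
```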

IPC Classes

  • A63F 13/577 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 11/00 - 2D [Two Dimensional] image generation

28.

SYSTEMS AND METHODS FOR RAY TRACED OCCLUSION AND REFLECTIONS

      
Application Number 17710416
Status Pending
Filing Date 2022-03-31
First Publication Date 2023-10-05
Owner Electronic Arts Inc. (USA)
Inventor Hulm, Dustin

Abstract

A method, device, and computer-readable storage medium for generating an occlusion value for a pixel. The method includes: selecting a pixel in an image of a scene; identifying at least one direction to cast a fixed-distance ray from a location on an object corresponding to the pixel based on an orientation of a surface of the object at the location; in response to determining that screen space occlusion data is available in the at least one direction, obtaining an occlusion value for the pixel from at least one lighting probe based on the at least one direction; and in response to determining that screen space occlusion data is not available in the at least one direction, obtaining the occlusion value for the pixel based on performing ray tracing from the location on the object corresponding to the pixel.
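
A control-flow sketch of the hybrid occlusion lookup, with stubbed engine hooks standing in for the real renderer; none of the method names below come from an actual engine API.

```python
# Sketch: probe-based occlusion where screen-space data is available, fixed-distance
# ray tracing otherwise. StubScene is a placeholder for engine queries.
from dataclasses import dataclass
import random

@dataclass
class Surface:
    position: tuple
    normal: tuple

class StubScene:
    """Stand-in for engine queries; every method here is a placeholder."""
    def surface_at(self, pixel): return Surface((0, 0, 0), (0, 1, 0))
    def screen_space_occlusion_available(self, pos, direction): return random.random() < 0.8
    def probe_occlusion(self, pos, direction): return 0.3
    def trace_occlusion_ray(self, pos, direction, dist): return random.random() < 0.5

def occlusion_for_pixel(pixel, scene, ray_distance=0.5):
    surface = scene.surface_at(pixel)
    direction = surface.normal                      # cast direction from surface orientation
    if scene.screen_space_occlusion_available(surface.position, direction):
        return scene.probe_occlusion(surface.position, direction)   # cheap probe path
    hit = scene.trace_occlusion_ray(surface.position, direction, ray_distance)
    return 1.0 if hit else 0.0                      # ray-traced fallback

print(occlusion_for_pixel((120, 80), StubScene()))
```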

IPC Classes

29.

LEARNING CHARACTER MODEL ANIMATIONS WITH A LAYER-WISE MIXTURE-OF-EXPERTS NETWORK

      
Application Number 17657469
Status Pending
Filing Date 2022-03-31
First Publication Date 2023-10-05
Owner Electronic Arts Inc. (USA)
Inventor
  • Xie, Zhaoming
  • Starke, Wolfram Sebastian
  • Chaput, Harold Henry

Abstract

A computing system may provide functionality for controlling an animated model to perform actions and to perform transitions therebetween. The system may determine, from among a plurality of edges from a first node of a control graph to respective other nodes of the control graph, a selected edge from the first control node to a selected node. The system may then determine controls for an animated model in a simulation based at least in part on the selected edge, control data associated with the selected node, a current simulation state of the simulation, and a machine learned algorithm, determine an updated simulation state of the simulation based at least in part on the controls for the animated model, and adapt one or more parameters of the machine learned algorithm based at least in part on the updated simulation state and a desired simulation state.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 13/80 - 2D animation, e.g. using sprites
  • G06N 20/00 - Machine learning
  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game

30.

LEARNING CHARACTER MOTION ALIGNMENT WITH PERIODIC AUTOENCODERS

      
Application Number 17657591
Status Pending
Filing Date 2022-03-31
First Publication Date 2023-10-05
Owner Electronic Arts Inc. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Chaput, Harold Henry

Abstract

The present disclosure provides a periodic autoencoder that can be used to generate a general motion manifold structure using local periodicity of the movement whose parameters are composed of phase, frequency, and amplitude. The periodic autoencoder is a novel neural network architecture that can learn periodic features from large unstructured motion datasets in an unsupervised manner. The character movements can be decomposed into multiple latent channels that can capture the non-linear periodicity of different body segments during synchronous, asynchronous, and transition movements while progressing forward in time, such that it captures spatial data and temporal data associated with the movements.

IPC Classes

  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
  • G06T 13/80 - 2D animation, e.g. using sprites
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

31.

Animation Generation and Interpolation with RNN-Based Variational Autoencoders

      
Application Number 17710253
Status Pending
Filing Date 2022-03-31
First Publication Date 2023-10-05
Owner Electronic Arts Inc. (USA)
Inventor Akhoundi, Elaheh

Abstract

This specification relates to the generation of animation data using recurrent neural networks. According to a first aspect of this specification, there is described a computer implemented method comprising: sampling an initial hidden state of a recurrent neural network (RNN) from a distribution; generating, using the RNN, a sequence of frames of animation from the initial state of the RNN and an initial set of animation data comprising a known initial frame of animation, the generating comprising, for each generated frame of animation in the sequence of frames of animation: inputting, into the RNN, a respective set of animation data comprising the previous frame of animation data in the sequence of frames of animation; generating, using the RNN and based on a current hidden state of the RNN, the frame of animation data; and updating the hidden state of the RNN based on the input respective set of animation data.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06N 3/04 - Architecture, e.g. interconnection topology

32.

Training action prediction machine-learning models for video games with healed data

      
Application Number 17710260
Grant Number 11786822
Status In Force
Filing Date 2022-03-31
First Publication Date 2023-10-05
Grant Date 2023-10-17
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Gordon, William
  • Keltner, Kasey
  • Leaf, Shawn

Abstract

This specification provides a computer-implemented method, the method comprising obtaining a machine-learning model. The machine-learning model is trained with expert data comprising a plurality of training examples. Each training example comprises: (i) game state data representing a state of a video game environment, and (ii) scored action data representing an action and a score for that action if performed by a video game entity of the video game environment subsequent to the state of the video game environment. An action is performed by the video game entity based on a prediction for the action generated by the machine-learning model. The method further comprises determining whether the action performed by the video game entity was optimal. In response to determining that the action performed by the video game entity was suboptimal, a healed training example is generated. The healed training example comprises: (i) the state of the instance of the video game environment, and (ii) healed scored action data indicating that the action performed by the video game entity was suboptimal. The machine-learning model is updated based on the healed training example.

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game

33.

Systems and methods for ray traced contact shadows

      
Application Number 17710419
Grant Number 11810241
Status In Force
Filing Date 2022-03-31
First Publication Date 2023-10-05
Grant Date 2023-11-07
Owner Electronic Arts Inc. (USA)
Inventor Hulm, Dustin

Abstract

A method, device, and computer-readable storage medium for generating shadows in an image of a scene. The method includes: generating a shadow map to determine pixels in shadow; for each pixel in shadow, performing a sampling of neighboring pixels; for each pixel in shadow, computing an average brightness of the sampling to generate a shadow mapped shadow value; determining a set of close shadow edge pixels, which are shadow edge pixels that correspond to locations on an object that are below a threshold distance to an occluding object; for each pixel in the set of close shadow edge pixels, performing ray tracing to determine a ray traced shadow value; and outputting shadow values for pixels in shadow, wherein the output is based on the ray traced shadow value for the close shadow edge pixels, and the output is based on the shadow mapped shadow value for other shadow pixels.
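
A sketch of the pixel-selection logic: keep shadow-mapped values everywhere, and ray trace only shadow-edge pixels whose occluder is closer than a threshold. Array shapes, the edge test, and the threshold are illustrative, not the claimed method.

```python
# Sketch: hybrid contact shadows -- shadow-mapped values by default, ray-traced
# values for close shadow-edge pixels.
import numpy as np

def contact_shadows(shadow_mask, mapped_shadow, occluder_distance,
                    close_threshold=0.2, trace_fn=None):
    """shadow_mask: bool HxW; mapped_shadow: float HxW from shadow-map sampling;
    occluder_distance: float HxW distance to the occluding object."""
    # Shadow-edge pixels: shadowed pixels with at least one lit neighbour.
    lit = ~shadow_mask
    neighbour_lit = np.zeros_like(shadow_mask)
    neighbour_lit[1:, :] |= lit[:-1, :]
    neighbour_lit[:-1, :] |= lit[1:, :]
    neighbour_lit[:, 1:] |= lit[:, :-1]
    neighbour_lit[:, :-1] |= lit[:, 1:]
    edge = shadow_mask & neighbour_lit

    close_edge = edge & (occluder_distance < close_threshold)

    out = mapped_shadow.copy()
    if trace_fn is not None:
        ys, xs = np.nonzero(close_edge)
        for y, x in zip(ys, xs):
            out[y, x] = trace_fn(y, x)      # ray-traced value for close edge pixels
    return out

rng = np.random.default_rng(0)
mask = rng.random((8, 8)) < 0.5
result = contact_shadows(mask, rng.random((8, 8)), rng.random((8, 8)),
                         trace_fn=lambda y, x: 0.0)
print(result.shape)
```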

IPC Classes

34.

ANTI-PEEK SYSTEM FOR VIDEO GAMES

      
Application Number 18174308
Status Pending
Filing Date 2023-02-24
First Publication Date 2023-09-07
Owner Electronic Arts Inc. (USA)
Inventor Pineda, Carlos Emmanuel Reyes

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for preventing rendering of a character in a video game. The method includes receiving an action regarding a first character rendered in a first-person point of view (POV), the action causing the POV of the first character to change from the first-person POV to a third-person POV. The method includes detecting the POV of the first character is to be changed. The method includes determining that characters are outside of a field of view (FOV) of the first character in the first-person POV and would be within the FOV of the first character in the third-person POV. The method includes changing the POV of the first character from the first-person POV to a third-person POV. The method includes causing rendering of the video game in a third-person POV of the first character, the rendering preventing rendering of other characters.
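
A 2D geometry sketch of the anti-peek test, assuming the third-person view is simply a wider field of view from the same position (a simplification of a real third-person camera): find characters hidden in the first-person FOV that would become visible after the switch and should therefore not be rendered.

```python
# Sketch: characters outside the first-person FOV but inside the third-person FOV.
import math

def in_fov(viewer_pos, view_dir_deg, fov_deg, target_pos):
    angle = math.degrees(math.atan2(target_pos[1] - viewer_pos[1],
                                    target_pos[0] - viewer_pos[0]))
    delta = (angle - view_dir_deg + 180.0) % 360.0 - 180.0
    return abs(delta) <= fov_deg / 2.0

def characters_to_suppress(viewer_pos, view_dir_deg, others,
                           fp_fov=90.0, tp_fov=120.0):     # assumed FOV widths
    """Characters hidden in first person that the third-person view would reveal."""
    return [c for c in others
            if not in_fov(viewer_pos, view_dir_deg, fp_fov, c)
            and in_fov(viewer_pos, view_dir_deg, tp_fov, c)]

others = [(5.0, 3.5), (-2.0, 0.5), (4.0, -4.5)]
print(characters_to_suppress((0.0, 0.0), 0.0, others))     # [(4.0, -4.5)]
```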

IPC Classes

  • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players

35.

MULTI-CAMERA IMAGE CAPTURE SYSTEM

      
Application Number 18143422
Status Pending
Filing Date 2023-05-04
First Publication Date 2023-08-31
Owner Electronic Arts Inc. (USA)
Inventor
  • Hejl, Jim
  • Phaneuf, Jerry
  • Jeromin, Aaron

Abstract

An image capture system includes: a first mobile unit configured to move around the target area; a second mobile unit adjustably coupled to the first mobile unit; a dual-camera unit, operatively coupled to the second mobile unit, including: a first camera to capture structural data; and a second camera to capture color data, wherein the first mobile unit and the second mobile unit are configured to move the first camera and the second camera.

IPC Classes

  • H04N 13/296 - Synchronisation thereof; Control thereof
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor
  • H04N 13/254 - Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
  • H04N 13/257 - Colour aspects
  • H04N 13/271 - Image signal generators wherein the generated image signals comprise depth maps or disparity maps
  • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
  • H04N 23/60 - Control of cameras or camera modules
  • H04N 23/69 - Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

36.

Text to performance pipeline system

      
Application Number 17701060
Grant Number 11738266
Status In Force
Filing Date 2022-03-22
First Publication Date 2023-08-29
Grant Date 2023-08-29
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Braun, Eric
  • Sung, Peter
  • Vaught, Clayton
  • Waters, Cullen

Abstract

Methods, apparatus and systems are provided for generating an interactive non-player character (NPC) scene for a computer game environment of a video game. Changes are detected in relation to a script associated with the interactive NPC scene. For each NPC, a set of NPC data associated with the interactions that NPC has within the script is generated corresponding to the changes. The generated set of NPC data is processed with an NPC rig associated with that NPC to generate an NPC asset. A camera solver is applied to a region of the computer game environment associated with the script for determining locations of NPC assets and one or more cameras within that region in relation to the interactive NPC scene. Data representative of each NPC asset and the determined NPC asset and camera locations is output for use by a game development engine in generating the interactive NPC scene.

IPC Classes  ?

  • A63F 13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
  • A63F 13/5258 - Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
  • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
  • G10L 13/027 - Concept to speech synthesisers; Generation of natural phrases from machine-based concepts
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

37.

JOINT TWIST GENERATION FOR ANIMATION

      
Application Number 17678947
Status Pending
Filing Date 2022-02-23
First Publication Date 2023-08-24
Owner Electronic Arts Inc. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Chaput, Harold Henry
  • Zhao, Yiwei

Abstract

The present disclosure provides embodiments for joint twist generation for animation. The system can utilize a neural network, also referred to as a deep neural network, which utilizes machine learning processes in order to create animation data that are more life-like and realistic. The system can obtain a set of axis vectors for a rig of a virtual character model; obtain a twist model for the rig; input the set of axis vectors to the twist model to obtain a set of twist vectors; and determine animation data based on the set of axis vectors and the set of twist vectors.
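
A sketch of the data flow described above, assuming the learned twist model is replaced by a fixed stand-in; the placeholder model, the twist angle, and the axis-angle output format are illustrative assumptions.

```python
import math


def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)


def placeholder_twist_model(axis_vectors, twist_angle_deg=15.0):
    """Stand-in for the learned twist model: one twist vector per input axis."""
    return [(normalize(a), math.radians(twist_angle_deg)) for a in axis_vectors]


def animation_data(axis_vectors):
    """Pair each joint's axis with its predicted twist (axis-angle form)."""
    twists = placeholder_twist_model(axis_vectors)
    return [
        {"axis": normalize(a), "twist_axis": t_axis, "twist_angle": t_angle}
        for a, (t_axis, t_angle) in zip(axis_vectors, twists)
    ]


# Two toy joint axes for a character rig.
print(animation_data([(0.0, 1.0, 0.0), (1.0, 0.0, 0.0)]))
```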

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

38.

Voice aging using machine learning

      
Application Number 17399592
Grant Number 11735158
Status In Force
Filing Date 2021-08-11
First Publication Date 2023-08-22
Grant Date 2023-08-22
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Gupta, Kilol
  • Shakeri, Zahra
  • Zhong, Ping
  • Gururani, Siddharth
  • Sardari, Mohsen

Abstract

This specification describes systems and methods for aging voice audio, in particular voice audio in computer games. According to one aspect of this specification, there is described a method for aging speech audio data. The method comprises: inputting an initial audio signal and an age embedding into a machine-learned age convertor model, wherein: the initial audio signal comprises speech audio; and the age embedding is based on an age classification of a plurality of speech audio samples of subjects in a target age category; processing, by the machine-learned age convertor model, the initial audio signal and the age embedding to generate an age-altered audio signal, wherein the age-altered audio signal corresponds to a version of the initial audio signal in the target age category; and outputting, from the machine-learned age convertor model, the age-altered audio signal.
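
A structural sketch of the pipeline in this abstract: an age embedding is derived from reference samples in the target age category, then a convertor maps the input audio features toward that category. The "convertor" below is a toy interpolation standing in for the machine-learned model, and all feature values are illustrative.

```python
def age_embedding(reference_feature_sets):
    """Average feature vectors of samples classified into the target age category."""
    n = len(reference_feature_sets)
    dims = len(reference_feature_sets[0])
    return [sum(s[d] for s in reference_feature_sets) / n for d in range(dims)]


def placeholder_age_convertor(audio_features, embedding, strength=0.5):
    """Toy stand-in: blend input features toward the target-age embedding."""
    return [
        (1 - strength) * f + strength * e
        for f, e in zip(audio_features, embedding)
    ]


elderly_refs = [[0.7, 0.3, 0.6], [0.8, 0.2, 0.7]]  # illustrative reference features

emb = age_embedding(elderly_refs)
print(placeholder_age_convertor([0.25, 0.85, 0.15], emb))
```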

IPC Classes  ?

  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/00 - Speech recognition
  • G10L 21/00 - Processing of the speech or voice signal to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
  • G10L 25/00 - Speech or voice analysis techniques not restricted to a single one of groups
  • G10L 13/027 - Concept to speech synthesisers; Generation of natural phrases from machine-based concepts
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks
  • G10L 13/047 - Architecture of speech synthesisers

39.

Animation Evaluation

      
Application Number 17669930
Status Pending
Filing Date 2022-02-11
First Publication Date 2023-08-17
Owner Electronic Arts Inc. (USA)
Inventor
  • Nishimura, Hitoshi
  • Cardinal, Ryan

Abstract

The specification relates to the generation of in-game animation data and the evaluation of in-game animations. According to a first aspect of this specification, there is described a computer implemented method comprising: inputting, into an encoder neural network, input data comprising a plurality of input pose parameters indicative of one or more poses of an in-game object in an animation; generating, by the encoder neural network, one or more encoded representations of the one or more poses of the in-game object from the input data; and determining a quality score for a pose of the one or more poses of an in-game object based on the one or more encoded representations.
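
A sketch of scoring a pose from its encoded representation, assuming a fixed linear "encoder" and a centroid-distance scoring rule; both are illustrative stand-ins, not the trained network from the patent.

```python
def toy_encoder(pose_params):
    """Stand-in encoder: a fixed linear projection to a 2-D code."""
    w = [[0.5, -0.2, 0.1], [0.3, 0.4, -0.1]]
    return [sum(wi * p for wi, p in zip(row, pose_params)) for row in w]


def quality_score(pose_params, reference_codes):
    """Higher score means the encoding sits closer to the reference centroid."""
    code = toy_encoder(pose_params)
    centroid = [sum(c[d] for c in reference_codes) / len(reference_codes)
                for d in range(len(code))]
    dist = sum((a - b) ** 2 for a, b in zip(code, centroid)) ** 0.5
    return 1.0 / (1.0 + dist)


refs = [toy_encoder(p) for p in ([0.1, 0.2, 0.3], [0.15, 0.25, 0.35])]
print(quality_score([0.12, 0.22, 0.32], refs))  # near 1.0 for an in-family pose
```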

IPC Classes  ?

  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
  • G06N 3/04 - Architecture, e.g. interconnection topology

40.

Automatic head pose neutralization and blend shape generation

      
Application Number 17651196
Grant Number 11803997
Status In Force
Filing Date 2022-02-15
First Publication Date 2023-08-17
Grant Date 2023-10-31
Owner Electronic Arts Inc. (USA)
Inventor
  • Phan, Hau Nghiep
  • Bolduc, Mathieu Marquis

Abstract

A system may perform head pose neutralization on an input mesh to produce a neutral mesh and/or determine blend shapes for the neutral mesh. The system may generate a neutral mesh based on an input mesh and a reference mesh and then generate a blend shape associated with the neutral mesh based at least in part on one or more reference neutral meshes and one or more corresponding reference blend shapes.

IPC Classes  ?

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

41.

FEEDBACK ORIENTED GAMEPLAY SESSIONS IN VIDEO GAMES

      
Application Number 18154759
Status Pending
Filing Date 2023-01-13
First Publication Date 2023-08-17
Owner Electronic Arts Inc. (USA)
Inventor
  • Caballero, Oswaldo
  • Main, Paul
  • Lam, Samuel
  • Velez, Santiago

Abstract

Systems and methods are described herein for monitoring a gameplay session for violations of a policy and creating a remediation gameplay session through which remediation can be provided to players or player accounts that violate gameplay policies. The systems and methods can create a remediation gameplay session based in part on the game state data of the gameplay session during which the violation occurs.

IPC Classes  ?

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

42.

Goal Driven Animation

      
Application Number 17669927
Status Pending
Filing Date 2022-02-11
First Publication Date 2023-08-17
Owner Electronic Arts Inc. (USA)
Inventor
  • Wong, David
  • Nishimura, Hitoshi

Abstract

The specification relates to the generation of in-game animation data and the evaluation of in-game animations. According to a first aspect of the present disclosure, there is described a computer implemented method comprising: inputting, into one or more neural network models, input data comprising one or more current pose markers indicative of a current pose of an in-game object, one or more target markers indicative of a target pose of an in-game object and an object trajectory of the in-game object; processing, using the one or more neural networks, the input data to generate one or more intermediate pose markers indicative of an intermediate pose of the in-game object positioned between the current pose and the target pose; outputting, from the one or more neural networks, the one or more intermediate pose markers; and generating, using the one or more intermediate pose markers, an intermediate pose of the in-game object, wherein the intermediate pose of the in-game object corresponds to a pose of the in-game object at an intermediate frame of in-game animation between a current frame of in-game animation in which the in-game object is in the current pose and a target frame of in-game animation in which the in-game object is in the target pose.
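
A sketch of the in-betweening step described above: given current-pose markers, target-pose markers, and an object trajectory, produce markers for an intermediate frame. The neural network is replaced here by plain interpolation nudged toward the trajectory midpoint, purely to illustrate the data flow; all names and weights are assumptions.

```python
def intermediate_markers(current, target, trajectory, t=0.5, trajectory_weight=0.25):
    """Blend current and target markers, biased toward the trajectory midpoint."""
    mid = trajectory[len(trajectory) // 2]
    blended = []
    for c, g in zip(current, target):
        lerped = tuple((1 - t) * ci + t * gi for ci, gi in zip(c, g))
        blended.append(tuple(
            (1 - trajectory_weight) * li + trajectory_weight * mi
            for li, mi in zip(lerped, mid)
        ))
    return blended


current_pose = [(0.0, 0.0), (0.1, 0.9)]        # e.g. hip and head markers
target_pose = [(2.0, 0.0), (2.1, 0.9)]
trajectory = [(0.0, 0.0), (1.0, 0.2), (2.0, 0.0)]
print(intermediate_markers(current_pose, target_pose, trajectory))
```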

IPC Classes  ?

  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 13/00 - Animation

43.

Animated and personalized coach for video games

      
Application Number 17120011
Grant Number 11724201
Status In Force
Filing Date 2020-12-11
First Publication Date 2023-08-15
Grant Date 2023-08-15
Owner Electronic Arts Inc. (USA)
Inventor
  • Chaput, Harold Henry
  • Teye, Mattias
  • Chen, Zebin
  • Wang, Wei
  • Sjöö, Ulf Erik Kristoffer
  • Singh-Blom, Ulf Martin Lucas

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for generating insights for video games. The method includes gathering information regarding a player for a plurality of video games, the information comprising at least one of in-world state data, player action data, player progression data, and/or real-world events relevant to each video game. The method also includes tracking events in at least one video game of the plurality of video games, the events comprising an action event or a standby event. The method also includes determining that an event of the tracked events is an action event. The method also includes generating insights regarding the action event based on the information gathered regarding the player, the insights for improving the player's performance in the video game. The method also includes relaying the insights to the player to improve the player's performance in the video game.

IPC Classes  ?

  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
  • A63F 13/85 - Providing additional services to players
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers
  • A63F 13/795 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for providing a buddy list
  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06N 20/00 - Machine learning
  • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition

44.

VIDEOGRAPHER MODE IN ONLINE GAMES

      
Application Number 18296811
Status Pending
Filing Date 2023-04-06
First Publication Date 2023-08-03
Owner Electronic Arts Inc. (USA)
Inventor Knights, Garrett

Abstract

An online gaming system may provide for a videographer mode in online gaming. The online gaming system may initiate an instance of an online game for players playing the online game in a player mode, establish connections to respective game clients of the players and to a videographer client of a computing device of a videographer, the videographer being a user participating in the online game in a videographer mode differing from the player mode, the videographer mode including capturing gameplay of at least one of the players. Then, the online gaming system may receive player input data from at least one of the players, update a game state of the instance based on the player input data, and output respective game client data to the respective game clients and videographer client data to the videographer client.

IPC Classes  ?

  • A63F 13/525 - Changing parameters of virtual cameras
  • A63F 13/87 - Communicating with other players during game play, e.g. by e-mail or chat
  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame

45.

Replay editor in video games

      
Application Number 17649256
Grant Number 11878239
Status In Force
Filing Date 2022-01-28
First Publication Date 2023-08-03
Grant Date 2024-01-23
Owner Electronic Arts Inc. (USA)
Inventor
  • Vaught, Clayton W.
  • Waters, Jr., Cullen J.
  • Ricker, James Lawrence

Abstract

A gaming system may allow for a user to capture and/or edit simulation state data of gameplay in a video game such that a replay of the gameplay may be rendered and/or shared. The gaming system may receive simulation state data and a request. The simulation state data may include simulation state(s) which include a model and pose state of an avatar corresponding to a player in a game simulation of a video game previously rendered as rendered view(s). The request may request a replay of the simulation state data with modification(s). The gaming system may modify the simulation state data to generate modified simulation state data and render, based on the modified simulation state data, replay view(s) that differ from the previously rendered view(s). The gaming system may then output the replay view(s) to a display of a computing device.

IPC Classes  ?

  • A63F 13/497 - Partially or entirely replaying previous game actions
  • A63F 13/577 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
  • A63F 13/525 - Changing parameters of virtual cameras

46.

ONLINE GAMING FOR STREAMING PLAYERS

      
Application Number 18128134
Status Pending
Filing Date 2023-03-29
First Publication Date 2023-07-27
Owner Electronic Arts Inc. (USA)
Inventor Labate, Jesse Alan

Abstract

A streaming system may improve online gaming experiences for streaming players and/or provide device-independent input processing. The streaming system may receive, from a client device, a selection of a game to be played via the streaming system, determine network connection parameters based at least in part on the game, determine a current streaming quality of the network connection of the client device, and determine that the current streaming quality does not meet the network connection parameters. Based on the determination that the current streaming quality does not meet the network connection parameters, the streaming system may cause a prompt to be displayed to a player associated with the client device regarding alternative content for the player to play, or cause at least a portion of the gameplay of the game to be slowed based at least in part on the current streaming quality and the network connection parameters.
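
A sketch of the decision described above: compare the measured streaming quality against the game's required network parameters, then either prompt the player with alternative content or slow gameplay. The thresholds, field names, and slowdown rule are illustrative assumptions.

```python
def streaming_decision(required, measured):
    """Return the action to take when streaming quality does not meet requirements."""
    meets = (measured["bandwidth_mbps"] >= required["bandwidth_mbps"]
             and measured["latency_ms"] <= required["latency_ms"])
    if meets:
        return {"action": "stream_normally"}
    # If quality is far below requirements, suggest alternative content;
    # otherwise slow the gameplay to match what the connection can carry.
    if measured["bandwidth_mbps"] < 0.5 * required["bandwidth_mbps"]:
        return {"action": "prompt_alternative_content"}
    slowdown = measured["bandwidth_mbps"] / required["bandwidth_mbps"]
    return {"action": "slow_gameplay", "speed_factor": round(slowdown, 2)}


required = {"bandwidth_mbps": 20.0, "latency_ms": 40}
print(streaming_decision(required, {"bandwidth_mbps": 14.0, "latency_ms": 55}))
```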

IPC Classes  ?

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client

47.

PLAYER PROFILE MANAGEMENT SYSTEM

      
Application Number 18128300
Status Pending
Filing Date 2023-03-30
First Publication Date 2023-07-27
Owner Electronic Arts Inc. (USA)
Inventor
  • Zhang, Zhaosheng
  • Huang, Sijia
  • Li, Li
  • Zhou, Biao
  • Zhu, Shanzhong

Abstract

A player profile management system collects player data from various systems and generates and manages player profiles. A snapshot pipeline of the player profile management system generates a snapshot player profile associated with a player. The player profile management system receives, after generating the snapshot player profile associated with the player, player data associated with the player. An update pipeline of the player profile management system generates, based on the snapshot player profile and the player data associated with the player, an update player profile associated with the player.

IPC Classes  ?

  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client

48.

Intent-based Models for Use in Selecting Actions in Video Games

      
Application Number 18130359
Status Pending
Filing Date 2023-04-03
First Publication Date 2023-07-27
Owner Electronic Arts Inc. (USA)
Inventor Blok, Hendrik

Abstract

This specification describes a computer-implemented method of generating an intent-based model for use in selecting actions in a video game. The method comprises initializing a graph comprising a plurality of nodes. Each node of the plurality of nodes represents a state of an entity in the video game. The method further comprises adding one or more edges to the graph. Each edge of the one or more edges represents a transition from a first state to a second state. The method further comprises determining, for each node of the plurality of nodes, a distance to each other node, comprising performing a path-finding algorithm on the graph. The method further comprises determining one or more outcome nodes. Each outcome node represents an outcome state of the entity. The method further comprises scoring the one or more outcome nodes, comprising, for each outcome node, determining a score based on an outcome of the outcome node. The method further comprises scoring the plurality of nodes of the graph. Scoring the plurality of nodes of the graph comprises, for each node of the plurality of nodes, and for each outcome out of a set of outcomes, determining whether one or more outcome nodes for the outcome are immediately available from the node; and when one or more outcome nodes for the outcome are immediately available from the node, scoring the outcome for the node using the scores of the one or more outcome nodes. The method further comprises, for each node of the graph, and for each outcome out of the set of outcomes, determining a distance from the node to a highest scoring outcome node for the outcome.
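
A compact sketch of the graph workflow in this abstract: build a small state graph, compute node-to-node distances with breadth-first search (a simple stand-in for the path-finding step), score outcome nodes, and record for each node its distance to the best-scoring outcome node and which outcomes are immediately available. The graph, outcome names, and scores are toy data.

```python
from collections import deque


def bfs_distances(graph, start):
    """Unweighted shortest-path distances from start to every reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist


# Directed toy graph: entity state -> states reachable by one transition.
graph = {"idle": ["approach"], "approach": ["shoot", "pass"],
         "shoot": [], "pass": []}
outcome_scores = {"shoot": 0.9, "pass": 0.6}   # illustrative outcome scoring

distances = {n: bfs_distances(graph, n) for n in graph}
best_outcome = max(outcome_scores, key=outcome_scores.get)

for node in graph:
    print(node, {
        "dist_to_best_outcome": distances[node].get(best_outcome),
        "immediately_available": [o for o in outcome_scores if o in graph[node]],
    })
```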

IPC Classes  ?

  • A63F 13/47 - Controlling the progress of the video game involving branching, e.g. choosing one of several possible scenarios at a given point in time
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

49.

ENHANCED ANIMATION GENERATION BASED ON MOTION MATCHING USING LOCAL BONE PHASES

      
Application Number 18158422
Status Pending
Filing Date 2023-01-23
First Publication Date 2023-07-27
Owner Electronic Arts Inc. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Zhao, Yiwei
  • Sardari, Mohsen
  • Chaput, Harold Henry
  • Aghdaie, Navid

Abstract

Systems and methods are provided for enhanced animation generation based on motion matching with local bone phases. An example method includes accessing first animation control information generated for a first frame of an electronic game, including local bone phases representing phase information associated with contacts of a plurality of rigid bodies of an in-game character with an in-game environment. A local motion matching process is executed for each of the plurality of local bone phases, and a second pose of the character model is generated based on the plurality of matched local poses for a second frame of the electronic game.
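
A sketch of per-bone matching driven by local phases: each bone keeps its own phase value, a candidate database is searched independently per bone, and the matched local poses are assembled into the next frame's pose. The circular phase distance and the tiny database are illustrative assumptions.

```python
def phase_distance(a, b):
    """Distance between two phase values on a circle in [0, 1)."""
    d = abs(a - b) % 1.0
    return min(d, 1.0 - d)


def match_local_pose(bone, phase, database):
    """Pick the database entry for this bone whose phase is closest."""
    return min(database[bone], key=lambda entry: phase_distance(phase, entry["phase"]))


def next_frame_pose(local_phases, database):
    """Assemble the next pose from independently matched local poses."""
    return {bone: match_local_pose(bone, phase, database)["pose"]
            for bone, phase in local_phases.items()}


database = {
    "left_foot": [{"phase": 0.1, "pose": "contact"}, {"phase": 0.6, "pose": "swing"}],
    "right_foot": [{"phase": 0.1, "pose": "swing"}, {"phase": 0.6, "pose": "contact"}],
}
print(next_frame_pose({"left_foot": 0.55, "right_foot": 0.05}, database))
```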

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/215 - Motion-based segmentation
  • G06T 7/223 - Analysis of motion using block-matching
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

50.

COMPETITIVE EVENT BASED REWARD DISTRIBUTION SYSTEM

      
Application Number 17647590
Status Pending
Filing Date 2022-01-10
First Publication Date 2023-07-13
Owner Electronic Arts Inc. (USA)
Inventor
  • Dion, Matthew Aaron
  • Ahn, Chong Won

Abstract

The present disclosure provides a video game based reward distribution system. The reward system can provide users with a plurality of progression tracks that allow the users to choose how to progress through the reward map when advancing or leveling up a virtual character or user account within the video game.

IPC Classes  ?

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63F 13/69 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions

51.

PREDICTIVE EXECUTION OF DISTRIBUTED GAME ENGINES

      
Application Number 17937653
Status Pending
Filing Date 2022-10-03
First Publication Date 2023-07-06
Owner Electronic Arts Inc. (USA)
Inventor
  • Kolen, John
  • Chaput, Harold Henry
  • Aghdaie, Navid
  • Zaman, Kazi Atif-Uz
  • Moss, Kenneth Alan

Abstract

Systems described herein may automatically and dynamically adjust the amount and type of computing resources usable to execute, process, or perform various tasks associated with a video game. Using one or more machine learning algorithms, a prediction model can be generated that uses the historical and/or current user interaction data obtained by monitoring the users playing the video game. Based on the historical and/or current user interaction data, future user interactions likely to be performed in the future can be predicted. Using the predictions of the users' future interactions, the amount and type of computing resources maintained in the systems can be adjusted such that a proper balance between reducing the consumption of computing resources and reducing the latency experienced by the users of the video game is achieved and maintained.

IPC Classes  ?

  • A63F 13/358 - Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world

52.

Respawn systems and methods in video games

      
Application Number 18146111
Grant Number 11957980
Status In Force
Filing Date 2022-12-23
First Publication Date 2023-06-29
Grant Date 2024-04-16
Owner Electronic Arts Inc. (USA)
Inventor Pineda, Carlos Emmanuel Reyes

Abstract

In a video game, a player's character can start in a normal state, receive first damage, and change to an incapacitated state. The player's character can be revived from the incapacitated state back to the normal state. The player's character can be changed from the incapacitated state to a preliminarily defeated state, and in response, a player respawn activation item can be generated. The player respawn activation item can be used by the player's teammates to respawn the player's character at one or more respawn locations.
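
A sketch of the character-state flow in this abstract as a small state machine: normal to incapacitated on first damage, incapacitated back to normal when revived, incapacitated to preliminarily defeated otherwise, at which point a respawn activation item becomes available to teammates. The class and method names are illustrative assumptions.

```python
class CharacterState:
    def __init__(self):
        self.state = "normal"
        self.respawn_item = None

    def take_first_damage(self):
        if self.state == "normal":
            self.state = "incapacitated"

    def revive(self):
        if self.state == "incapacitated":
            self.state = "normal"

    def defeat(self):
        if self.state == "incapacitated":
            self.state = "preliminarily_defeated"
            self.respawn_item = "respawn_activation_item"

    def teammate_respawn(self, at_location):
        if self.state == "preliminarily_defeated" and self.respawn_item:
            self.respawn_item = None
            self.state = "normal"
            return f"respawned at {at_location}"
        return "respawn not available"


c = CharacterState()
c.take_first_damage()
c.defeat()
print(c.teammate_respawn("respawn_beacon_2"))  # -> respawned at respawn_beacon_2
```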

IPC Classes  ?

  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
  • A63F 13/837 - Shooting of targets
  • A63F 13/5378 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps

53.

INTERACTIVE GAMEPLAY PLAYBACK SYSTEM

      
Application Number 18051477
Status Pending
Filing Date 2022-10-31
First Publication Date 2023-06-22
Owner Electronic Arts Inc. (USA)
Inventor
  • Bruzzo, Christopher Loren
  • Rohart, Arthur Francois Marie
  • Bateren, Oghene Fejiro

Abstract

The disclosure provides a video playback system for use within a game application and/or other interactive computing environments. The video playback system can be used to capture gameplay during execution of a game application. The captured gameplay video can be processed and stored within the game application or in a network accessible location.

IPC Classes  ?

  • A63F 13/86 - Watching games played by other players
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers
  • A63F 13/48 - Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
  • A63F 13/20 - Input arrangements for video game devices
  • A63F 13/25 - Output arrangements for video game devices
  • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
  • A63F 13/32 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections
  • A63F 13/33 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
  • A63F 13/332 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using wireless networks, e.g. cellular phone networks
  • A63F 13/335 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet

54.

DYNAMIC LOCOMOTION ADAPTATION IN RUNTIME GENERATED ENVIRONMENTS

      
Application Number 17550973
Status Pending
Filing Date 2021-12-14
First Publication Date 2023-06-15
Owner Electronic Arts Inc. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Chaput, Harold Henry

Abstract

Use of pose prediction models enables runtime animation to be generated for an electronic game. The pose prediction model can predict a character pose of a character based on joint data for a pose of the character in a previous frame. Further, by using environment data, it is possible to modify the prediction of the character pose based on the particular environment in which the character is located. Advantageously, the use of machine learning enables prediction of character movement in environments in which it is difficult or impossible to obtain motion capture data.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/20 - Analysis of motion
  • A63F 13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding

55.

AUTOMATED PLAYER SPONSORSHIP SYSTEM

      
Application Number 18106353
Status Pending
Filing Date 2023-02-06
First Publication Date 2023-06-15
Owner Electronic Arts Inc. (USA)
Inventor
  • Patel, Jijnes Jashbhai
  • Gibson, Daniel Valentine
  • Moss, Kenneth Alan

Abstract

Embodiments of the systems and methods disclosed herein provide a sponsor matching system in which players and sponsors can be matched. Upon a match based at least in part on stored sponsorship criteria and/or player preferences, a first sponsor can select a set of players to receive permission to select an advertisement associated with the first sponsor. Once a first player of the selected players selects an advertisement and an advertisement placement location associated with the first sponsor, the sponsor matching system can generate game rendering instructions for a first player system associated with the first player.

IPC Classes  ?

  • A63F 13/795 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for providing a buddy list
  • A63F 13/61 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor using advertising information

56.

System for customizing in-game character animations by players

      
Application Number 17644000
Grant Number 11816772
Status In Force
Filing Date 2021-12-13
First Publication Date 2023-06-15
Grant Date 2023-11-14
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Chaput, Harold Henry

Abstract

System and methods for using a deep learning framework to customize animation of an in-game character of a video game. The system can be preconfigured with animation rule sets corresponding to various animations. Each animation can be comprised of a series of distinct poses that collectively form the particular animation. The system can provide an animation-editing interface that enables a user of the video game to make modifications to at least one pose or frame of the animation. The system can realistically extrapolate these modifications across some or all portions of the animation. In addition or alternatively, the system can realistically extrapolate the modifications across other types of animations.

IPC Classes  ?

  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
  • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
  • G06T 13/20 - 3D [Three Dimensional] animation
  • G06T 7/20 - Analysis of motion

57.

PREDICTING FACIAL EXPRESSIONS USING CHARACTER MOTION STATES

      
Application Number 17643065
Status Pending
Filing Date 2021-12-07
First Publication Date 2023-06-08
Owner Electronic Arts Inc. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Borovikov, Igor
  • Chaput, Harold Henry

Abstract

Systems and methods for identifying one or more facial expression parameters associated with a pose of a character are disclosed. A system may execute a game development application to identify facial expression parameters for a particular pose of a character. The system may receive an input identifying the pose of the character. Further, the system may provide the input to a machine learning model. The machine learning model may be trained based on a plurality of poses and expected facial expression parameters for each pose. Further, the machine learning model can identify a latent representation of the input. Based on the latent representation of the input, the machine learning model can generate one or more facial expression parameters of the character and output the one or more facial expression parameters. The system may also generate a facial expression of the character and output the facial expression.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06N 20/20 - Ensemble learning
  • G06T 13/20 - 3D [Three Dimensional] animation

58.

DETECTING HIGH-SKILLED ENTITIES IN LOW-LEVEL MATCHES IN ONLINE GAMES

      
Application Number 17457194
Status Pending
Filing Date 2021-12-01
First Publication Date 2023-06-01
Owner Electronic Arts Inc. (USA)
Inventor
  • Greige, Laura
  • De Mesentier Silva, Fernando
  • Sulimanov, Alexander

Abstract

A high-skilled-low-level detection system may detect high-skilled entities in low-level matches of an online game. The system may identify a plurality of entities that are within a first category of entities eligible to be matched by a matchmaking algorithm. The system may then determine respective feature sets based at least in part on gameplay data associated with the plurality of entities and perform anomaly detection on the respective feature sets. The system may then determine, based on the anomaly detection, an anomalous entity of the plurality of entities and cause the matchmaking algorithm to match the anomalous entity with other entities that are in a second category of entities.
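
A sketch of the detection flow: build a feature set per low-level entity from gameplay data, flag statistical outliers, and route flagged entities to a different matchmaking pool. The z-score rule, the accuracy feature, and the threshold are illustrative stand-ins for the anomaly detector.

```python
def zscores(values):
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5 or 1.0
    return [(v - mean) / std for v in values]


def anomalous_entities(feature_sets, threshold=1.5):
    """Entities whose accuracy z-score exceeds an illustrative threshold."""
    names = list(feature_sets)
    accuracy = [feature_sets[n]["accuracy"] for n in names]
    return [n for n, z in zip(names, zscores(accuracy)) if z > threshold]


low_level_players = {
    "p1": {"accuracy": 0.21}, "p2": {"accuracy": 0.19}, "p3": {"accuracy": 0.23},
    "p4": {"accuracy": 0.20}, "p5": {"accuracy": 0.62},  # suspiciously high
}
flagged = anomalous_entities(low_level_players)
print(flagged)  # -> ['p5']; these would be matched against the second category
```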

IPC Classes  ?

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame

59.

SYSTEMS AND METHODS FOR GENERATING A SIMPLIFIED POLYGONAL MESH

      
Application Number 17530952
Status Pending
Filing Date 2021-11-19
First Publication Date 2023-05-25
Owner Electronic Arts Inc. (USA)
Inventor Mason, Ashton

Abstract

A method, device, and computer-readable storage medium for generating a simplified mesh. The method includes: receiving an input mesh that is a polygonal mesh; identifying one or more submeshes of the input mesh; fitting a set of shapes to the one or more submeshes to determine which shapes approximate which submeshes within a threshold value; for each submesh that is associated with at least one shape that approximates the submesh within the threshold value, generating a set of proxy levels-of-detail (LODs) for the submesh, wherein each proxy LOD is a polygonal mesh corresponding to the shape that approximates the submesh; generating for each submesh, a set of traditionally simplified levels-of-detail (LODs) based on simplifying the submesh; and generating the simplified mesh based on selecting one proxy LOD or one traditionally simplified LOD for each submesh of the one or more submeshes.
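
A sketch of the per-submesh choice described above: if some primitive shape approximates a submesh within the error threshold, use the cheaper proxy LOD built from that shape; otherwise fall back to a traditionally simplified LOD. The threshold, fit errors, and submesh names are toy values.

```python
ERROR_THRESHOLD = 0.05  # illustrative approximation threshold


def choose_lod(submesh):
    """Pick the proxy LOD when a fitted shape is close enough, else fall back."""
    best_fit = min(submesh["shape_fits"], key=lambda s: s["error"], default=None)
    if best_fit and best_fit["error"] <= ERROR_THRESHOLD:
        return {"submesh": submesh["name"], "lod": "proxy:" + best_fit["shape"]}
    return {"submesh": submesh["name"], "lod": "simplified"}


submeshes = [
    {"name": "barrel", "shape_fits": [{"shape": "cylinder", "error": 0.02}]},
    {"name": "statue", "shape_fits": [{"shape": "box", "error": 0.30}]},
]
print([choose_lod(s) for s in submeshes])
# -> barrel uses the cylinder proxy, statue keeps a traditionally simplified LOD
```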

IPC Classes  ?

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation

60.

Color blindness diagnostic system

      
Application Number 18055792
Grant Number 11872492
Status In Force
Filing Date 2022-11-15
First Publication Date 2023-05-11
Grant Date 2024-01-16
Owner Electronic Arts Inc. (USA)
Inventor Stevens, Karen Elaine

Abstract

Systems and methods for determining whether to enable color blind accessibility settings within the course of a user interactive narrative are described herein. Virtual color blindness indication objects containing colors that are visibly distinguishable within a single dichromatic visual spectrum can be utilized in objectives to determine a user's dichromatic visual deficiency type.

IPC Classes  ?

  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
  • A63F 13/533 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
  • A61B 3/06 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing colour vision
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

61.

System for testing command execution latency within a video game

      
Application Number 18050949
Grant Number 11904238
Status In Force
Filing Date 2022-10-28
First Publication Date 2023-04-27
Grant Date 2024-02-20
Owner Electronic Arts Inc. (USA)
Inventor
  • Phaneuf, Gerald Richard
  • Hejl, Jr., James Nunn

Abstract

A video game test system can determine an objective measure of elapsed time between interaction with a video game controller and the occurrence of a particular event within the video game. This objective measure enables a tester to determine whether a video game is objectively operating slowly or just feels slow to the tester, and may indicate the existence of coding errors that may affect execution speed, but not cause visible errors. The system may obtain the objective measure of elapsed time by simulating a user's interaction with the video game. Further, the system may identify data embedded into a frame of an animation by the video game source code to identify the occurrence of a corresponding event. The system can then measure the elapsed time between the simulated user interaction and the occurrence or triggering of the corresponding event.
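
A sketch of the measurement loop implied here: record the time a simulated controller input is sent, poll rendered frames for the marker the game embeds when the corresponding event fires, and report the elapsed time. The frame source, marker format, and timeout are illustrative assumptions.

```python
import time


def measure_command_latency(send_input, next_frame, marker, timeout_s=5.0):
    """Elapsed seconds between simulated input and the marked frame, or None."""
    start = time.perf_counter()
    send_input()
    while time.perf_counter() - start < timeout_s:
        frame = next_frame()
        if frame is not None and frame.get("embedded_marker") == marker:
            return time.perf_counter() - start
    return None


# Toy harness: the "game" emits the marked frame on the third poll.
frames = iter([{"embedded_marker": None}, {"embedded_marker": None},
               {"embedded_marker": "jump_event"}])
latency = measure_command_latency(lambda: None, lambda: next(frames, None), "jump_event")
print("measured latency (s):", latency)
```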

IPC Classes  ?

  • A63F 13/31 - Communication aspects specific to video games, e.g. between several handheld game devices at close range
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers
  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory

62.

Videographer mode in online games

      
Application Number 17504009
Grant Number 11638872
Status In Force
Filing Date 2021-10-18
First Publication Date 2023-04-20
Grant Date 2023-05-02
Owner Electronic Arts Inc. (USA)
Inventor Knights, Garrett

Abstract

An online gaming system may provide for a videographer mode in online gaming. The online gaming system may initiate an instance of an online game for players playing the online game in a player mode, establish connections to respective game clients of the players and to a videographer client of a computing device of a videographer, the videographer being a user participating in the online game in a videographer mode differing from the player mode, the videographer mode including capturing gameplay of at least one of the players. Then, the online gaming system may receive player input data from at least one of the players, update a game state of the instance based on the player input data, and output respective game client data to the respective game clients and videographer client data to the videographer client.

IPC Classes  ?

  • A63F 13/525 - Changing parameters of virtual cameras
  • A63F 13/87 - Communicating with other players during game play, e.g. by e-mail or chat
  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame

63.

Generating facial position data based on audio data

      
Application Number 18086237
Grant Number 11847727
Status In Force
Filing Date 2022-12-21
First Publication Date 2023-04-20
Grant Date 2023-12-19
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Del Val Santos, Jorge
  • Gisslen, Linus
  • Singh-Blom, Martin
  • Sjöö, Kristoffer
  • Teye, Mattias

Abstract

A computer-implemented method for generating a machine-learned model to generate facial position data based on audio data comprising training a conditional variational autoencoder having an encoder and decoder. The training comprises receiving a set of training data items, each training data item comprising a facial position descriptor and an audio descriptor; processing one or more of the training data items using the encoder to obtain distribution parameters; sampling a latent vector from a latent space distribution based on the distribution parameters; processing the latent vector and the audio descriptor using the decoder to obtain a facial position output; calculating a loss value based at least in part on a comparison of the facial position output and the facial position descriptor of at least one of the one or more training data items; and updating parameters of the conditional variational autoencoder based at least in part on the calculated loss value.
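
A sketch of one forward pass of the training step described above: encode a (facial positions, audio) pair into distribution parameters, sample a latent vector with the reparameterization trick, decode it conditioned on the audio descriptor, and compute a reconstruction plus KL loss. The encoder and decoder are toy linear stand-ins, and the parameter update itself is omitted.

```python
import math
import random


def toy_encoder(facial, audio):
    """Stand-in encoder: returns (mu, log_var) for a 1-D latent."""
    mu = 0.1 * sum(facial) + 0.05 * sum(audio)
    log_var = -1.0
    return mu, log_var


def toy_decoder(z, audio):
    """Stand-in decoder: reconstructs facial positions from latent + audio."""
    return [0.2 * z + 0.1 * a for a in audio]


def training_step_loss(facial, audio):
    mu, log_var = toy_encoder(facial, audio)
    z = mu + math.exp(0.5 * log_var) * random.gauss(0.0, 1.0)  # reparameterization trick
    recon = toy_decoder(z, audio)
    recon_loss = sum((r - f) ** 2 for r, f in zip(recon, facial)) / len(facial)
    kl = -0.5 * (1.0 + log_var - mu ** 2 - math.exp(log_var))
    return recon_loss + kl


random.seed(0)
print(training_step_loss(facial=[0.3, 0.1, 0.4], audio=[0.2, 0.5, 0.1]))
```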

IPC Classes  ?

  • G06T 13/20 - 3D [Three Dimensional] animation
  • G06N 20/20 - Ensemble learning
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06N 3/088 - Non-supervised learning, e.g. competitive learning

64.

Seamless image processing of a tiled image region

      
Application Number 17450665
Grant Number 11875445
Status In Force
Filing Date 2021-10-12
First Publication Date 2023-04-13
Grant Date 2024-01-16
Owner Electronic Arts Inc. (USA)
Inventor Keable, Julien

Abstract

Systems and methods for performing a processing operation for a tiled image region are disclosed. The tiled image region may include a plurality of tiles or images. Further, the tiled image region may correspond to a plurality of image resolutions. A system may execute a game development application to perform the processing operation for the tiled image region. The system may identify the tiled image region corresponding to the processing operation. The system can utilize a texture array, a lookup texture, and a scaling factor to determine position data for the tiled image region. The system can then render a continuous image region that represents the tiled image region. The system can seamlessly process the continuous image region according to the processing operation and use the continuous image region to update the tiled image region.

IPC Classes  ?

  • G06T 15/04 - Texture mapping
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 3/40 - Scaling of a whole image or part thereof
  • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
  • G06T 1/60 - Memory management

65.

Anti-peek system for video games

      
Application Number 17549776
Grant Number 11623145
Status In Force
Filing Date 2021-12-13
First Publication Date 2023-04-11
Grant Date 2023-04-11
Owner Electronic Arts Inc. (USA)
Inventor Pineda, Carlos Emmanuel Reyes

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for preventing rendering of a character in a video game. The method includes receiving an action regarding a first character rendered in a first-person point of view (POV), the action causing the POV of the first character to change from the first-person POV to a third-person POV. The method includes detecting that the POV of the first character is to be changed. The method includes determining that characters are outside of a field of view (FOV) of the first character in the first-person POV and would be within the FOV of the first character in the third-person POV. The method includes changing the POV of the first character from the first-person POV to the third-person POV. The method includes causing rendering of the video game in the third-person POV of the first character, the rendering preventing rendering of the other characters.

IPC Classes  ?

  • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players

66.

Emotion based music style change using deep learning

      
Application Number 17229600
Grant Number 11617952
Status In Force
Filing Date 2021-04-13
First Publication Date 2023-04-04
Grant Date 2023-04-04
Owner Electronic Arts Inc. (USA)
Inventor
  • Sheng, Jie
  • Yigit, Hulya Duygu
  • Zhao, Chong

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for changing music of a video game based on a player's emotion. The method includes receiving indicators of emotion comprising in-game attributes of a player in a video game. The method also includes predicting an emotion of the player based on the indicators of emotion from the video game. The method also includes receiving original music from the video game. The method also includes determining an original tone of the original music. The method also includes determining a transformed tone based at least in part on the emotion of the player that was predicted. The method also includes transforming the original tone of the original music to the transformed tone. The method also includes generating transformed music from the original music based on the transformed tone.

IPC Classes  ?

  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment

67.

Speaker conversion for video games

      
Application Number 17093171
Grant Number 11605388
Status In Force
Filing Date 2020-11-09
First Publication Date 2023-03-14
Grant Date 2023-03-14
Owner Electronic Arts Inc. (USA)
Inventor
  • Gupta, Kilol
  • Shah, Dhaval
  • Shakeri, Zahra
  • Pinto, Jervis
  • Sardari, Mohsen
  • Chaput, Harold
  • Aghdaie, Navid
  • Zaman, Kazi

Abstract

This specification describes a computer-implemented method of generating speech audio for use in a video game, wherein the speech audio is generated using a voice convertor that has been trained to convert audio data for a source speaker into audio data for a target speaker. The method comprises receiving: (i) source speech audio, and (ii) a target speaker identifier. The source speech audio comprises speech content in the voice of a source speaker. Source acoustic features are determined for the source speech audio. A target speaker embedding associated with the target speaker identifier is generated as output of a speaker encoder of the voice convertor. The target speaker embedding and the source acoustic features are inputted into an acoustic feature encoder of the voice convertor. One or more acoustic feature encodings are generated as output of the acoustic feature encoder. The one or more acoustic feature encodings are derived from the target speaker embedding and the source acoustic features. Target speech audio is generated for the target speaker. The target speech audio comprises the speech content in the voice of the target speaker. The generating comprises decoding the one or more acoustic feature encodings using an acoustic feature decoder of the voice convertor.

IPC Classes  ?

  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit
  • G10L 17/04 - Training, enrolment or model building
  • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
  • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
  • G10L 15/00 - Speech recognition

68.

EXTENDING KNOWLEDGE DATA IN MACHINE VISION

      
Application Number 17942449
Status Pending
Filing Date 2022-09-12
First Publication Date 2023-03-09
Owner Electronic Arts Inc. (USA)
Inventor Skuin, Boris

Abstract

A machine-vision system is configured to detect a first feature associated with a virtual sporting event, detect a second feature associated with the virtual sporting event, provide the first detected feature and the second detected feature as input to the machine-vision system to identify at least a first portion of a representation of the virtual sporting event, and combine the first detected feature and the second detected feature to validate the first portion of the representation of the virtual sporting event.

IPC Classes  ?

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06N 5/04 - Inference or reasoning models
  • G06N 20/00 - Machine learning
  • G06V 10/22 - Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
  • G06V 20/40 - Scenes; Scene-specific elements in video content
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
  • G06T 7/292 - Multi-camera tracking
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

69.

Generative Interior Design in Video Games

      
Application Number 17463754
Status Pending
Filing Date 2021-09-01
First Publication Date 2023-03-02
Owner Electronic Arts Inc. (USA)
Inventor
  • Liu, Han
  • Liang, Jingwen
  • Mitchell, Schaefer
  • Sardari, Mohsen

Abstract

This specification describes a computer-implemented generative interior design method. The method comprises obtaining input data comprising boundary data. The boundary data defines a boundary of an interior region of a video game building. A floor plan for the interior region of the video game building is generated. This comprises processing the input data using a floor plan generator model. The floor plan divides the interior region into a plurality of interior spaces. A layout for at least one of the plurality of interior spaces defined by the floor plan is generated by a layout generator model comprising one or more graph neural networks. The layout represents a configuration of one or more objects to be placed in the interior region.

IPC Classes  ?

  • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
  • G06F 30/12 - Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
  • G06F 30/13 - Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
  • G06N 3/04 - Architecture, e.g. interconnection topology

70.

SPECTATOR SYSTEM IN ONLINE GAMES

      
Application Number 17446745
Status Pending
Filing Date 2021-09-02
First Publication Date 2023-03-02
Owner Electronic Arts Inc. (USA)
Inventor
  • Noimark, Yuval
  • Skelton, Jeffrey E.
  • Karlsson, Henrik
  • Bilbao, Eneko

Abstract

A spectator system may provide for spectating in online gaming. The spectator system may receive, at a spectator server, game state data from a game simulation server hosting an online game for one or more players, generate one or more spectator game state data corresponding to one or more spectator devices and output the one or more spectator game state data to the spectator devices. The spectator server may further output the game state data to another spectator server.

IPC Classes  ?

  • A63F 13/86 - Watching games played by other players
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers

71.

System for multiview games experience

      
Application Number 17446747
Grant Number 11925861
Status In Force
Filing Date 2021-09-02
First Publication Date 2023-03-02
Grant Date 2024-03-12
Owner Electronic Arts Inc. (USA)
Inventor
  • Noimark, Yuval
  • Skelton, Jeffrey E.
  • Karlsson, Henrik
  • Bilbao, Eneko

Abstract

A spectator system may provide for spectating in online gaming. The spectator system may receive, at a spectator server, game state data from a game simulation server hosting an online game for one or more players, generate one or more spectator game state data corresponding to one or more spectator devices and output the one or more spectator game state data to the spectator devices. The spectator server may further output the game state data to another spectator server.

IPC Classes  ?

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/335 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
  • A63F 13/86 - Watching games played by other players

72.

Generating Visual Assets for Video Games

      
Application Number 17463730
Status Pending
Filing Date 2021-09-01
First Publication Date 2023-03-02
Owner Electronic Arts Inc. (USA)
Inventor
  • Liu, Han
  • Liang, Jingwen
  • Harder, Jesse
  • Hing, Gary Ng Thow
  • Sardari, Mohsen

Abstract

This specification describes a computing system for generating visual assets for video games. The computing system comprises an image segmentation model, a first 3D generation model, and a second 3D generation model. At least one of the first 3D generation model and the second 3D generation model comprises a machine-learning model. The system is configured to obtain: (i) a plurality of images corresponding to the visual asset, each image showing a different view of an object to be generated in the visual asset, and (ii) orientation data for each image that specifies an orientation of the object in the image. A segmented image is generated for each image. This comprises processing the image using the image segmentation model to segment distinct portions of the image into one or more classes of a predefined set of classes. For each image, 3D shape data is generated for a portion of the object displayed in the image. This comprises processing the segmented image of the image, the orientation data of the image, and style data for the visual asset using the first 3D generation model. 3D shape data is generated for the visual asset. This comprises processing the generated 3D shape data of each image using the second 3D generation model.

IPC Classes  ?

  • G06T 17/10 - Volume description, e.g. cylinders, cubes or using CSG [Constructive Solid Geometry]
  • G06T 7/11 - Region-based segmentation
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 20/00 - Machine learning

73.

INTERACTION BASED SKILL MEASUREMENT FOR PLAYERS OF A VIDEO GAME

      
Application Number 17402421
Status Pending
Filing Date 2021-08-13
First Publication Date 2023-02-16
Owner Electronic Arts Inc. (USA)
Inventor
  • Pierse, Christopher
  • Zhang, Xiaozhe

Abstract

Skill measurement systems and methods use interaction pairs and an interaction uncertainty variable. The interaction pairs are pairwise matches corresponding to instances of interactions between players. The interaction uncertainty variable corresponds to an instance of interaction and is based in part on the uncertainties of a player, a player team, and/or gameplay aspects. The interaction pairs and interaction uncertainty are used to more accurately determine a skill rating of a player based in part on interaction data within the gameplay data from a gameplay session of a video game.
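A minimal sketch of the general idea: each interaction pair contributes a pairwise rating update, damped by the interaction uncertainty. The Elo-style update rule and the constants are illustrative assumptions, not the patented method.

```python
# Sketch: uncertainty-weighted pairwise skill updates from interaction pairs.
import math

def expected_score(rating_a: float, rating_b: float) -> float:
    return 1.0 / (1.0 + math.pow(10.0, (rating_b - rating_a) / 400.0))

def update_from_interaction(rating_a: float, rating_b: float,
                            outcome_a: float, uncertainty: float,
                            k: float = 32.0) -> tuple[float, float]:
    """outcome_a is 1.0 if player A won the interaction, 0.0 otherwise.
    uncertainty in [0, 1] shrinks the update when the interaction is noisy."""
    weight = k * (1.0 - uncertainty)
    delta = weight * (outcome_a - expected_score(rating_a, rating_b))
    return rating_a + delta, rating_b - delta

if __name__ == "__main__":
    a, b = 1500.0, 1520.0
    for outcome, unc in [(1.0, 0.2), (0.0, 0.7), (1.0, 0.1)]:
        a, b = update_from_interaction(a, b, outcome, unc)
    print(round(a, 1), round(b, 1))
```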

IPC Classes  ?

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63F 13/795 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for providing a buddy list
  • A63F 13/822 - Strategy games; Role-playing games 
  • G06N 7/00 - Computing arrangements based on specific mathematical models

74.

INTERACTIVE VIDEOGAME VERIFICATION USING CRYPTOGRAPHICALLY PROTECTED TRANSACTION RECORDS

      
Application Number 17970083
Status Pending
Filing Date 2022-10-20
First Publication Date 2023-02-09
Owner ELECTRONIC ARTS INC. (USA)
Inventor Maharshak, Erez

Abstract

An example method of performing interactive videogame verification using cryptographically protected transaction records includes: receiving, by a videogame server, from a first videogame client device, a first transaction record reflecting a first set of events associated with an interactive videogame session, wherein the first transaction record is cryptographically signed by a first private cryptographic key associated with the first videogame client device; receiving, from a second videogame client device, a second transaction record reflecting a second set of events associated with the interactive videogame session, wherein the second transaction record is cryptographically signed by a second private cryptographic key associated with the second videogame client device; and validating the first transaction record based on the second transaction record.
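A minimal sketch of the signing and cross-check flow using Ed25519 signatures from the `cryptography` package. The "validation" here is just an equality check of the two clients' signed event lists; the patented validation logic is more involved, and the record format is an assumption.

```python
# Sketch: each client signs its transaction record; the server verifies both
# signatures and validates the first record against the second.
import json
from typing import List, Tuple
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_record(private_key: Ed25519PrivateKey, events: List[dict]) -> Tuple[bytes, bytes]:
    payload = json.dumps(events, sort_keys=True).encode()
    return payload, private_key.sign(payload)

def verify_and_cross_check(pub_a, rec_a, sig_a, pub_b, rec_b, sig_b) -> bool:
    try:
        pub_a.verify(sig_a, rec_a)
        pub_b.verify(sig_b, rec_b)
    except InvalidSignature:
        return False
    # Simplified cross-validation: both clients must report the same events.
    return json.loads(rec_a) == json.loads(rec_b)

if __name__ == "__main__":
    key_a, key_b = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
    events = [{"tick": 10, "event": "goal", "player": "p1"}]
    rec_a, sig_a = sign_record(key_a, events)
    rec_b, sig_b = sign_record(key_b, events)
    print(verify_and_cross_check(key_a.public_key(), rec_a, sig_a,
                                 key_b.public_key(), rec_b, sig_b))
```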

IPC Classes  ?

  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • A63F 13/71 - Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players

75.

AUTOMATED PIPELINE SELECTION FOR SYNTHESIS OF AUDIO ASSETS

      
Application Number 17970169
Status Pending
Filing Date 2022-10-20
First Publication Date 2023-02-09
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Gupta, Kilol
  • Agarwal, Tushar
  • Shakeri, Zahra
  • Sardari, Mohsen
  • Chaput, Harold Henry
  • Aghdaie, Navid

Abstract

An example method of automated selection of audio asset synthesizing pipelines includes: receiving an audio stream comprising human speech; determining one or more features of the audio stream; selecting, based on the one or more features of the audio stream, an audio asset synthesizing pipeline; training, using the audio stream, one or more audio asset synthesizing models implementing respective stages of the selected audio asset synthesizing pipeline; and responsive to determining that a quality metric of the audio asset synthesizing pipeline satisfies a predetermined quality condition, synthesizing one or more audio assets by the selected audio asset synthesizing pipeline.
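A minimal sketch of the selection and quality-gating logic: simple features of the input audio pick a pipeline, and assets are only synthesized once a quality metric clears a threshold. The feature names, thresholds, and pipeline names are illustrative assumptions, not the patented criteria.

```python
# Sketch: feature-based selection of an audio asset synthesizing pipeline,
# gated on a quality metric.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AudioFeatures:
    duration_s: float
    snr_db: float          # estimated signal-to-noise ratio
    single_speaker: bool

def select_pipeline(features: AudioFeatures) -> str:
    if not features.single_speaker:
        return "multi_speaker_pipeline"
    if features.duration_s < 600 or features.snr_db < 20:
        return "few_shot_cloning_pipeline"
    return "full_tts_training_pipeline"

def synthesize_if_good_enough(features: AudioFeatures, quality_metric: float,
                              quality_threshold: float = 0.8) -> Optional[str]:
    pipeline = select_pipeline(features)
    # Training of the pipeline's models would happen here.
    if quality_metric >= quality_threshold:
        return f"assets synthesized by {pipeline}"
    return None

if __name__ == "__main__":
    feats = AudioFeatures(duration_s=1800, snr_db=32, single_speaker=True)
    print(synthesize_if_good_enough(feats, quality_metric=0.86))
```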

IPC Classes  ?

  • G10L 13/047 - Architecture of speech synthesisers
  • G10L 13/08 - Text analysis or generation of parameters for speech synthesis out of text, e.g. grapheme to phoneme translation, prosody generation or stress or intonation determination

76.

Edutainment overlay for learning foreign languages in video games

      
Application Number 17393327
Grant Number 11574557
Status In Force
Filing Date 2021-08-03
First Publication Date 2023-02-07
Grant Date 2023-02-07
Owner Electronic Arts Inc. (USA)
Inventor
  • Borovikov, Igor
  • Sardari, Mohsen

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for learning a foreign language. The method includes executing a video game in a first human language. The method includes pausing gameplay of the video game for a paused time instance. The method includes executing a digital mini-puzzle game during the paused time instance in the gameplay of the video game, the digital mini-puzzle game executed in a second human language, the digital mini-puzzle game executed utilizing assets of the video game. The method includes receiving a response to the digital mini-puzzle game from a player-computing device corresponding to a player, the response comprising at least one of the first human language and/or the second human language. The method includes determining a score of the response corresponding to the player based at least in part on a comparison of the response with translation pairs in a database.

IPC Classes  ?

  • G09B 19/06 - Foreign languages
  • G09B 7/06 - Electrically-operated teaching apparatus or devices working with questions and answers of the multiple-choice answer type, i.e. where a given question is provided with a series of answers and a choice has to be made from the answers

77.

ROUTE GENERATION SYSTEM WITHIN A VIRTUAL ENVIRONMENT OF A GAME APPLICATION

      
Application Number 17384224
Status Pending
Filing Date 2021-07-23
First Publication Date 2023-01-26
Owner Electronic Arts Inc. (USA)
Inventor
  • Carter, Jr., Ben Folsom
  • Rich, Jr., Benjamin Scott
  • Hayes, Jonathan Douglas

Abstract

The systems and processes described herein can provide dynamic and realistic route generation based on actual route data within the game environment. The system provides for generating a route database for use with a sports simulation game application. The present disclosure also provides for generation of routes during runtime of the game application. The route generation system can help address the problem of generating realistic and lifelike routes based on real life movements of athletes.

IPC Classes  ?

  • A63F 13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
  • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/812 - Ball games, e.g. soccer or baseball

78.

Feedback oriented gameplay sessions in video games

      
Application Number 17402424
Grant Number 11559746
Status In Force
Filing Date 2021-08-13
First Publication Date 2023-01-24
Grant Date 2023-01-24
Owner Electronic Arts Inc. (USA)
Inventor
  • Caballero, Oswaldo
  • Main, Paul
  • Lam, Samuel
  • Velez, Santiago

Abstract

Systems and methods are described herein for monitoring a gameplay session for violations of a policy and creating a remediation gameplay session through which remediation can be provided to players or player accounts that violate gameplay policies. The systems and methods can create a remediation gameplay session based in part on the game state data of the gameplay session during which the violation occurs.

IPC Classes  ?

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

79.

Enhanced animation generation based on motion matching using local bone phases

      
Application Number 17392123
Grant Number 11562523
Status In Force
Filing Date 2021-08-02
First Publication Date 2023-01-24
Grant Date 2023-01-24
Owner Electronic Arts Inc. (USA)
Inventor
  • Starke, Wolfram Sebastian
  • Zhao, Yiwei
  • Sardari, Mohsen
  • Chaput, Harold Henry
  • Aghdaie, Navid

Abstract

Systems and methods are provided for enhanced animation generation based on motion matching with local bone phases. An example method includes accessing first animation control information generated for a first frame of an electronic game, including local bone phases representing phase information associated with contacts of a plurality of rigid bodies of an in-game character with an in-game environment. A local motion matching process is executed for each of the plurality of local bone phases, and a second pose of the character model is generated for a second frame of the electronic game based on the plurality of matched local poses.
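A minimal sketch of per-bone matching: each bone group keeps its own phase, and the best database frame is chosen per group by nearest neighbour over a (phase, pose-feature) vector. The feature layout, distance metric, and database shape are assumptions for illustration.

```python
# Sketch: local motion matching driven by per-bone phases.
import numpy as np

def match_local_pose(db_features: np.ndarray, query: np.ndarray) -> int:
    """db_features: (num_frames, feature_dim); query: (feature_dim,)."""
    distances = np.linalg.norm(db_features - query, axis=1)
    return int(np.argmin(distances))

def next_pose(databases: dict, local_phases: dict, current_features: dict) -> dict:
    """Pick a matched frame index per bone group for the next animation frame."""
    matched = {}
    for bone, db in databases.items():
        query = np.concatenate(([local_phases[bone]], current_features[bone]))
        matched[bone] = match_local_pose(db, query)
    return matched

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dbs = {"left_foot": rng.normal(size=(100, 4)),
           "right_foot": rng.normal(size=(100, 4))}
    phases = {"left_foot": 0.25, "right_foot": 0.75}
    feats = {"left_foot": rng.normal(size=3), "right_foot": rng.normal(size=3)}
    print(next_pose(dbs, phases, feats))
```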

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/215 - Motion-based segmentation
  • G06T 7/223 - Analysis of motion using block-matching
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

80.

AUTOMATED TEST MULTIPLEXING SYSTEM

      
Application Number 17819593
Status Pending
Filing Date 2022-08-12
First Publication Date 2023-01-12
Owner Electronic Arts Inc. (USA)
Inventor
  • Borovikov, Igor
  • Harder, Jesse Hans Stokes
  • O'Neill, Thomas Patrick
  • Rein, Jonathan Albert
  • Lee, Avery H.
  • Wrotek, Pawel Piotr
  • Parker, Graham Michael
  • Vincent, David

Abstract

An imitation learning system may learn how to play a video game based on user interactions by a tester or other user of the video game. The imitation learning system may develop an imitation learning model based, at least in part, on the tester's interaction with the video game and the corresponding state of the video game to determine or predict actions that may be performed when interacting with the video game. The imitation learning system may use the imitation learning model to control automated agents that can play additional instances of the video game. Further, as the user continues to interact with the video game during testing, the imitation learning model may continue to be updated. Thus, the interactions by the automated agents with the video game may, over time, closely mimic the interactions of the user, enabling multiple tests of the video game to be performed simultaneously.
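A minimal sketch of the imitation-learning loop: record (state, action) pairs from the human tester, fit a policy model, and let automated agents query it. The tiny feature vector and the DecisionTreeClassifier stand-in are illustrative choices, not the patented model.

```python
# Sketch: learn a policy from tester demonstrations and reuse it for agents.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class ImitationPolicy:
    def __init__(self):
        self.states, self.actions = [], []
        self.model = DecisionTreeClassifier(max_depth=5)

    def record(self, state: np.ndarray, action: int) -> None:
        self.states.append(state)
        self.actions.append(action)

    def update(self) -> None:
        # Retrain as the tester keeps playing, so agents track recent behaviour.
        self.model.fit(np.vstack(self.states), np.array(self.actions))

    def act(self, state: np.ndarray) -> int:
        return int(self.model.predict(state.reshape(1, -1))[0])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    policy = ImitationPolicy()
    for _ in range(200):  # simulated tester data: action depends on first feature
        s = rng.normal(size=4)
        policy.record(s, int(s[0] > 0))
    policy.update()
    print(policy.act(np.array([1.0, 0.0, 0.0, 0.0])))
```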

IPC Classes  ?

  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers
  • G06N 3/08 - Learning methods
  • G06F 11/36 - Preventing errors by testing or debugging of software
  • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

81.

Enhanced animation generation based on video with local phase

      
Application Number 17305229
Grant Number 11670030
Status In Force
Filing Date 2021-07-01
First Publication Date 2023-01-05
Grant Date 2023-06-06
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Shi, Mingyi
  • Zhao, Yiwei
  • Starke, Wolfram Sebastian
  • Sardari, Mohsen
  • Aghdaie, Navid

Abstract

Embodiments of the systems and methods described herein provide a dynamic animation generation system that can apply a real-life video clip with a character in motion to a first neural network to receive rough motion data, such as pose information, for each of the frames of the video clip, and overlay the pose information on top of the video clip to generate a modified video clip. The system can identify a sliding window that includes a current frame, past frames, and future frames of the modified video clip, and apply the modified video clip to a second neural network to predict a next frame. The dynamic animation generation system can then move the sliding window to the next frame while including the predicted next frame, and apply the new sliding window to the second neural network to predict the following frame to the next frame.
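A minimal sketch of the autoregressive sliding-window loop: the window of recent pose frames is fed to a predictor, the predicted frame is appended, and the window slides forward to include it. The per-joint averaging "predictor" is a placeholder for the second neural network.

```python
# Sketch: sliding-window next-frame prediction for pose sequences.
from collections import deque
from typing import Deque, List, Sequence

def predict_next_frame(window: Sequence[List[float]]) -> List[float]:
    # Placeholder model: average the window per joint coordinate.
    return [sum(frame[i] for frame in window) / len(window)
            for i in range(len(window[0]))]

def extend_animation(pose_frames: List[List[float]], window_size: int,
                     extra_frames: int) -> List[List[float]]:
    frames = list(pose_frames)
    window: Deque[List[float]] = deque(frames[-window_size:], maxlen=window_size)
    for _ in range(extra_frames):
        nxt = predict_next_frame(window)
        frames.append(nxt)
        window.append(nxt)  # slide the window forward over the predicted frame
    return frames

if __name__ == "__main__":
    clip = [[0.0, 0.0], [0.1, 0.2], [0.2, 0.4], [0.3, 0.6]]
    print(extend_animation(clip, window_size=3, extra_frames=2))
```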

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/20 - Analysis of motion
  • H04N 5/272 - Means for inserting a foreground image in a background image, i.e. inlay, outlay

82.

Generating positions of map items for placement on a virtual map

      
Application Number 17902290
Grant Number 11668581
Status In Force
Filing Date 2022-09-02
First Publication Date 2022-12-29
Grant Date 2023-06-06
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Liu, Han
  • Zhao, Yiwei
  • Liang, Jingwen
  • Sardari, Mohsen
  • Chaput, Harold
  • Aghdaie, Navid
  • Zaman, Kazi

Abstract

This specification describes a system for generating positions of map items, such as buildings, for placement on a virtual map. The system comprises: at least one processor; and a non-transitory computer-readable medium including executable instructions that when executed by the at least one processor cause the at least one processor to perform at least the following operations: receiving an input at a generator neural network trained for generating map item positions; generating, with the generator neural network, a probability of placing a map item for each subregion of a plurality of subregions of the region of the virtual map; and generating position data of map items for placement on the virtual map using the probability for each subregion. The input to the generator neural network comprises: map data comprising one or more channels of position information for at least a region of the virtual map, said one or more channels including at least one channel comprising road position information for the region; and a latent vector encoding a selection of a placement configuration.
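A minimal sketch of the final step only: turning a per-subregion placement probability grid (standing in for the generator network's output) into concrete map item positions by independent sampling. The grid values and sampling scheme are illustrative assumptions.

```python
# Sketch: sample map item positions from a per-subregion probability grid.
import numpy as np

def positions_from_probabilities(prob_grid: np.ndarray, rng=None) -> list:
    """prob_grid[i, j] is the probability of placing a map item in subregion (i, j)."""
    rng = rng or np.random.default_rng()
    draws = rng.random(prob_grid.shape) < prob_grid
    return [tuple(idx) for idx in np.argwhere(draws)]

if __name__ == "__main__":
    # Higher placement probability near a "road" running along row 1.
    grid = np.array([[0.05, 0.05, 0.05],
                     [0.80, 0.90, 0.80],
                     [0.05, 0.05, 0.05]])
    print(positions_from_probabilities(grid, np.random.default_rng(0)))
```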

IPC Classes  ?

83.

REQUEST DISTRIBUTION SYSTEM

      
Application Number 17929293
Status Pending
Filing Date 2022-09-01
First Publication Date 2022-12-29
Owner Electronic Arts Inc. (USA)
Inventor
  • Kolen, John
  • Chaput, Harold Henry
  • Aghdaie, Navid
  • Zaman, Kazi Atif-Uz
  • Moss, Kenneth Alan

Abstract

Embodiments of the systems and methods disclosed herein provide a request distribution system in which a request for resources may be executed by a plurality of workers. Upon receiving a request for resources from a user computing system, the request distribution system may select a subset of workers from the plurality of workers to execute the request within a time limit. Once the workers generate a plurality of outputs, each output associated with a quality level, the request distribution system may transmit the output associated with the highest quality level to the user computing system.
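A minimal sketch of the dispatch pattern: fan a request out to a subset of workers, wait up to a time limit, and return the completed output with the highest quality score. The simulated worker behaviour and the in-process thread pool are illustrative assumptions.

```python
# Sketch: distribute a request to several workers and keep the best result
# produced within the time limit.
import concurrent.futures as cf
import random
import time
from typing import Optional, Tuple

def worker(request: str, worker_id: int) -> Tuple[float, str]:
    time.sleep(random.uniform(0.01, 0.2))            # simulated work
    quality = random.random()                        # simulated quality level
    return quality, f"worker-{worker_id} result for {request!r}"

def distribute(request: str, num_workers: int = 4,
               time_limit_s: float = 0.15) -> Optional[str]:
    with cf.ThreadPoolExecutor(max_workers=num_workers) as pool:
        futures = [pool.submit(worker, request, i) for i in range(num_workers)]
        done, _ = cf.wait(futures, timeout=time_limit_s)
    results = [f.result() for f in done]
    if not results:
        return None
    return max(results)[1]    # output with the highest quality level wins

if __name__ == "__main__":
    random.seed(3)
    print(distribute("generate commentary line"))
```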

IPC Classes  ?

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • G07F 17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
  • G06N 20/00 - Machine learning
  • A63F 13/71 - Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players

84.

ENHANCED SYSTEM FOR GENERATION OF FACIAL MODELS AND ANIMATION

      
Application Number 17344471
Status Pending
Filing Date 2021-06-10
First Publication Date 2022-12-15
Owner Electronic Arts Inc. (USA)
Inventor Phan, Hau Nghiep

Abstract

Systems and methods are provided for enhanced animation generation based on generative modeling. An example method includes training models based on faces and information associated with persons, each face being defined based on location information associated with facial features, and identity information for each person. The modeling system is trained to reconstruct expressions, textures, and models of persons.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06N 3/08 - Learning methods

85.

Enhanced system for generation of facial models and animation

      
Application Number 17344581
Grant Number 11887232
Status In Force
Filing Date 2021-06-10
First Publication Date 2022-12-15
Grant Date 2024-01-30
Owner Electronic Arts Inc. (USA)
Inventor Phan, Hau Nghiep

Abstract

Systems and methods are provided for enhanced animation generation based on generative modeling. An example method includes training models based on faces and information associated with persons, each face being defined based on location information associated with facial features, and identity information for each person. The modeling system is trained to reconstruct expressions, textures, and models of persons.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06N 3/08 - Learning methods
  • G06V 20/40 - Scenes; Scene-specific elements in video content
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions

86.

ENHANCED SYSTEM FOR GENERATION OF FACIAL MODELS AND ANIMATION

      
Application Number 17344618
Status Pending
Filing Date 2021-06-10
First Publication Date 2022-12-15
Owner Electronic Arts Inc. (USA)
Inventor Phan, Hau Nghiep

Abstract

Systems and methods are provided for enhanced animation generation based on generative modeling. An example method includes training models based on faces and information associated with persons, each face being defined based on location information associated with facial features, and identity information for each person. The modeling system is trained to reconstruct expressions, textures, and models of persons.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06N 3/08 - Learning methods

87.

Interactive videogame verification using cryptographically protected transaction records

      
Application Number 16786125
Grant Number 11528148
Status In Force
Filing Date 2020-02-10
First Publication Date 2022-12-13
Grant Date 2022-12-13
Owner Electronic Arts Inc. (USA)
Inventor Maharshak, Erez

Abstract

An example method of performing interactive videogame verification using cryptographically protected transaction records includes: receiving, by a videogame server, from a first videogame client device, a first transaction record reflecting a first set of events associated with an interactive videogame session, wherein the first transaction record is cryptographically signed by a first private cryptographic key associated with the first videogame client device; receiving, from a second videogame client device, a second transaction record reflecting a second set of events associated with the interactive videogame session, wherein the second transaction record is cryptographically signed by a second private cryptographic key associated with the second videogame client device; and validating the first transaction record based on the second transaction record.

IPC Classes  ?

  • H04L 9/40 - Network security protocols
  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • A63F 13/71 - Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players
  • H04L 9/00 - Arrangements for secret or secure communications; Network security protocols

88.

Systems and methods for transcribing user interface elements of a game application into haptic feedback

      
Application Number 17806655
Grant Number 11786812
Status In Force
Filing Date 2022-06-13
First Publication Date 2022-12-01
Grant Date 2023-10-17
Owner ELECTRONIC ARTS INC. (USA)
Inventor Stevens, Karen Elaine

Abstract

The present invention introduces an in-game API wrapper to perform identification and transcription of in-game visual and audio data by way of identifiable tags. Identified tags for visual and audio data are sent to an external audio API for transcription into Morse code. The Morse code transcription is sent back to the in-game API wrapper for transcription into haptic feedback. Identified tags for available on-screen button selections are transcribed by the in-game API wrapper into haptic feedback.
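A minimal sketch of the transcription chain: a tagged UI string is converted to Morse code and then to a list of (vibrate_ms, pause_ms) haptic pulses. The timing units are assumptions, and the real system routes tags through an in-game API wrapper and an external audio API rather than a single script.

```python
# Sketch: text -> Morse code -> haptic pulse schedule.
MORSE = {"A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".", "F": "..-.",
         "G": "--.", "H": "....", "I": "..", "J": ".---", "K": "-.-", "L": ".-..",
         "M": "--", "N": "-.", "O": "---", "P": ".--.", "Q": "--.-", "R": ".-.",
         "S": "...", "T": "-", "U": "..-", "V": "...-", "W": ".--", "X": "-..-",
         "Y": "-.--", "Z": "--..", " ": "/"}

UNIT_MS = 60  # one Morse time unit in milliseconds (illustrative)

def to_morse(text: str) -> str:
    return " ".join(MORSE.get(ch, "") for ch in text.upper())

def to_haptic_pulses(morse: str) -> list:
    pulses = []
    for symbol in morse:
        if symbol == ".":
            pulses.append((UNIT_MS, UNIT_MS))          # dot: 1 unit on, 1 off
        elif symbol == "-":
            pulses.append((3 * UNIT_MS, UNIT_MS))      # dash: 3 units on, 1 off
        elif symbol == " ":
            pulses.append((0, 2 * UNIT_MS))            # extra gap between letters
        elif symbol == "/":
            pulses.append((0, 6 * UNIT_MS))            # gap between words
    return pulses

if __name__ == "__main__":
    print(to_haptic_pulses(to_morse("OK")))
```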

IPC Classes  ?

  • A63F 13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 40/47 - Machine-assisted translation, e.g. using translation memory
  • H04L 67/131 - Protocols for games, networked simulations or virtual reality

89.

Player profile management system

      
Application Number 17303056
Grant Number 11660543
Status In Force
Filing Date 2021-05-19
First Publication Date 2022-11-24
Grant Date 2023-05-30
Owner Electronic Arts Inc. (USA)
Inventor
  • Zhang, Zhaosheng
  • Huang, Sijia
  • Li, Li
  • Zhou, Biao
  • Zhu, Shanzhong

Abstract

A player profile management system collects player data from various systems and generates and manages player profiles. A snapshot pipeline of the player profile management system generates a snapshot player profile associated with a player. The player profile management system receives, after generating the snapshot player profile associated with the player, player data associated with the player. An update pipeline of the player profile management system generates, based on the snapshot player profile and the player data associated with the player, an update player profile associated with the player.

IPC Classes  ?

  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client

90.

Interactive reenactment within a video game

      
Application Number 17410245
Grant Number 11504619
Status In Force
Filing Date 2021-08-24
First Publication Date 2022-11-22
Grant Date 2022-11-22
Owner Electronic Arts Inc. (USA)
Inventor
  • Borovikov, Igor
  • Chaput, Harold Henry
  • Victor, Nitish
  • Sardari, Mohsen

Abstract

A video reenactment system and method analyze a video clip that a video game player wishes to reenact and maps objects and actions within the video clip to virtual objects and virtual actions within the video game. A reenactment script indicating a sequence of virtual objects and virtual actions as mapped to objects and actions in the video clip is generated using a video translation model and stored for use in reenacting the video clip. The reenactment script can be used within the video game to reenact the objects and actions of the video clip. The reenactment of the video clip may be interactive, where a player may assume control within the reenactment and when the player relinquishes control, the reenactment will continue at an appropriate part of the sequence of actions by skipping actions corresponding to the ones played by the player.

IPC Classes  ?

  • A63F 13/497 - Partially or entirely replaying previous game actions

91.

Detecting collusion in online games

      
Application Number 17302837
Grant Number 11717757
Status In Force
Filing Date 2021-05-13
First Publication Date 2022-11-17
Grant Date 2023-08-08
Owner Electronic Arts Inc. (USA)
Inventor
  • Greige, Laura
  • Trotter, Meredith
  • Narravula, Sundeep
  • Aghdaie, Navid
  • De Mesentier Silva, Fernando

Abstract

A collusion detection system may detect collusion between entities participating in online gaming. The collusion detection system may identify a plurality of entities associated with and opponents within an instance of an online game, determine social data associated with the plurality of entities, determine in-game behavior data associated with the plurality of entities, and determine, for one or more pairings of the plurality of entities, respective pairwise feature sets based at least in part on the social data and the in-game behavior data. The collusion detection system may then perform anomaly detection on the respective pairwise feature sets and, in response to the anomaly detection detecting one or more anomalous pairwise feature sets, output one or more suspect pairings of the plurality of entities corresponding to the one or more anomalous pairwise feature sets as suspected colluding pairings.
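A minimal sketch of the last two steps: build a feature vector per opposing pair and flag anomalous pairs with scikit-learn's IsolationForest. The three features (friendship flag, recent trades between the pair, one-sided kill ratio) are illustrative assumptions, not the patented feature set.

```python
# Sketch: pairwise features plus anomaly detection to flag suspect pairings.
import numpy as np
from sklearn.ensemble import IsolationForest

def pairwise_features(pairs: list) -> np.ndarray:
    return np.array([[p["are_friends"], p["recent_trades"], p["kill_ratio"]]
                     for p in pairs], dtype=float)

def suspect_pairs(pairs: list, contamination: float = 0.05) -> list:
    X = pairwise_features(pairs)
    labels = IsolationForest(contamination=contamination,
                             random_state=0).fit_predict(X)
    return [(p["a"], p["b"]) for p, label in zip(pairs, labels) if label == -1]

if __name__ == "__main__":
    pairs = [{"a": f"p{i}", "b": f"q{i}", "are_friends": 0, "recent_trades": 0,
              "kill_ratio": 0.5} for i in range(20)]
    pairs.append({"a": "p_x", "b": "q_x", "are_friends": 1, "recent_trades": 12,
                  "kill_ratio": 1.0})
    print(suspect_pairs(pairs))
```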

IPC Classes  ?

  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • A63F 13/35 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers

92.

DYNAMIC CONTROL SURFACE

      
Application Number 17656826
Status Pending
Filing Date 2022-03-28
First Publication Date 2022-11-10
Owner Electronic Arts Inc. (USA)
Inventor
  • Kestell, Stephen Roger
  • Hejl, Jr., James Nunn

Abstract

The present disclosure provides for the dynamic mapping of functions within a content development application to a control surface including a plurality of distinct modular input consoles. The system includes a console controller that is configured to monitor usage of the content development application by the user and to dynamically control the mapping of functions to the control surface based on the contextual operation of the content development application. The console controller can determine the functions that are to be mapped to the control surface based on the context of the application and the functions that are prioritized for use by a user of the content development application.

IPC Classes  ?

  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment
  • G06N 20/00 - Machine learning
  • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
  • A63F 13/217 - Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity

93.

Spatial partitioning for graphics rendering

      
Application Number 17659064
Grant Number 11704868
Status In Force
Filing Date 2022-04-13
First Publication Date 2022-10-27
Grant Date 2023-07-18
Owner Electronic Arts Inc. (USA)
Inventor Loodin Ek, Alexander

Abstract

An improved virtual environment creation and testing process can be achieved by a combination of spatial partitioning and reverse tree generation. The reverse tree may be representative of the virtual environment and may be generated starting from a smallest portion or zone of the virtual environment (represented as a leaf node) and expanding up towards a root node representative of the entire virtual environment. Advantageously, the system can add new zones to the virtual environment and representative tree data structure that are external to the existing virtual environment without generating a new tree data structure. Thus, the computing resources utilized by the system disclosed herein may be significantly reduced compared to existing processes while improving the flexibility of the spatial partitioning and tree generation process, thereby enabling spatial partitioning to be performed in real or near real time as a developer authors the virtual environment.

IPC Classes  ?

  • G06T 17/00 - 3D modelling for computer graphics
  • G06T 15/20 - Perspective computation
  • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor

94.

Automated detection of emergent behaviors in interactive agents of an interactive environment

      
Application Number 17107163
Grant Number 11478713
Status In Force
Filing Date 2020-11-30
First Publication Date 2022-10-25
Grant Date 2022-10-25
Owner Electronic Arts Inc. (USA)
Inventor Dills, Thomas Bradley

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for automated detection of emergent behaviors in interactive agents of an interactive environment. The disclosed system represents an artificial intelligence based entity that utilizes a trained machine-learning-based clustering algorithm to group users together based on similarities in behavior. The clusters are processed based on a determination of the type of activity of the clustered users. In order to identify new categories of behavior and to label those new categories, a separate cluster analysis is performed using interaction data obtained at a subsequent time. The additional cluster analysis determines any change in behavior between the clustered sets and/or change in the number of users in a cluster. The identification of emergent user behavior enables the subject system to treat users differently based on their in-game behavior and to adapt in near real-time to changes in their behavior.

IPC Classes  ?

  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • A63F 13/75 - Enforcing rules, e.g. detecting foul play or generating lists of cheating players
  • G06N 20/00 - Machine learning

95.

User-controllable model-driven matchmaking

      
Application Number 16908421
Grant Number 11478715
Status In Force
Filing Date 2020-06-22
First Publication Date 2022-10-25
Grant Date 2022-10-25
Owner Electronic Arts Inc. (USA)
Inventor
  • Wu, Meng
  • Yu, Qilian
  • Chaput, Harold Henry
  • Aghdaie, Navid
  • Zaman, Kazi Atif-Uz
  • Moss, Kenneth Alan

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for user matchmaking. The method includes training a quality model and an embedding model based on historical data and user control options. The method also includes receiving user control options and matchmaking requests from users. The method also includes embedding, through the embedding model, user data regarding the users into an embedded space based on the received user control options and the matchmaking requests. The method also includes determining, based on the embedded user data, that a distance between two users satisfies a distance threshold. The method also includes matching the two users when the determined distance satisfies the distance threshold.
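A minimal sketch of the final matching step: users are embedded (here by a placeholder linear projection standing in for the trained embedding model), and two users are matched when the distance between their embeddings satisfies a threshold. The projection matrix and threshold are illustrative assumptions.

```python
# Sketch: match users whose embedded vectors are within a distance threshold.
import numpy as np

def embed(user_features: np.ndarray, projection: np.ndarray) -> np.ndarray:
    return user_features @ projection            # placeholder embedding model

def find_matches(features: dict, projection: np.ndarray,
                 distance_threshold: float) -> list:
    embedded = {uid: embed(f, projection) for uid, f in features.items()}
    uids = list(embedded)
    matches = []
    for i, a in enumerate(uids):
        for b in uids[i + 1:]:
            if np.linalg.norm(embedded[a] - embedded[b]) <= distance_threshold:
                matches.append((a, b))
    return matches

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    proj = rng.normal(size=(4, 2))
    feats = {"u1": np.array([0.9, 0.1, 0.5, 0.3]),
             "u2": np.array([0.88, 0.12, 0.52, 0.31]),
             "u3": np.array([0.1, 0.9, 0.2, 0.8])}
    print(find_matches(feats, proj, distance_threshold=0.2))
```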

IPC Classes  ?

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63F 13/795 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for providing a buddy list
  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06N 20/20 - Ensemble learning
  • G06N 3/08 - Learning methods

96.

Deep learning for data-driven skill estimation

      
Application Number 17090177
Grant Number 11478716
Status In Force
Filing Date 2020-11-05
First Publication Date 2022-10-25
Grant Date 2022-10-25
Owner Electronic Arts Inc. (USA)
Inventor Zhao, Chong

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for determining player skill for video games. The method includes aggregating a plurality of player statistics for match outcomes from a plurality of video games. The method also includes calculating, for each player in a pool of players, a matchmaking rating for each player based on the plurality of player statistics, the matchmaking rating for each player comprising a predicted number of points each player will contribute to a match. The method also includes selecting, based on the matchmaking rating for each player, players from the pool of players. The method also includes matching the players based on the matchmaking rating for each player, a sum of the matchmaking ratings comprising a total predicted team score for the match.
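A minimal sketch of the team-assembly idea: each player's matchmaking rating is a predicted point contribution, a team's predicted score is the sum of its players' ratings, and players are split so the two predicted totals stay close. The greedy balancing heuristic is an illustrative assumption.

```python
# Sketch: balance two teams by summed predicted point contributions.
from typing import Dict, List, Tuple

def balance_teams(ratings: Dict[str, float]) -> Tuple[List[str], List[str]]:
    team_a, team_b = [], []
    sum_a = sum_b = 0.0
    # Assign players in descending rating order to the team with the lower
    # predicted total score.
    for player, rating in sorted(ratings.items(), key=lambda kv: -kv[1]):
        if sum_a <= sum_b:
            team_a.append(player)
            sum_a += rating
        else:
            team_b.append(player)
            sum_b += rating
    return team_a, team_b

if __name__ == "__main__":
    ratings = {"p1": 12.4, "p2": 9.8, "p3": 8.1, "p4": 7.7, "p5": 5.0, "p6": 4.9}
    a, b = balance_teams(ratings)
    print(a, sum(ratings[p] for p in a))
    print(b, sum(ratings[p] for p in b))
```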

IPC Classes  ?

  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame

97.

PUZZLE VALIDATION USING HEURISTIC EVALUATION

      
Application Number 17643106
Status Pending
Filing Date 2021-12-07
First Publication Date 2022-10-13
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Harder, Jesse Hans Stokes
  • Levonyan, Karine Andranikovna
  • Sardari, Mohsen

Abstract

A puzzle validation system and method determine whether one or more solutions to a puzzle to be validated exist. If one or more solutions for the puzzle do exist, then the puzzle is valid. The puzzle validation system may use a path traversing algorithm that limits selections along the path to only valid selections in order to find valid solutions to the puzzle that do not violate any constraints of the puzzle. The puzzle validation mechanism may also apply heuristic optimization, using an initial set of valid solutions, to produce optimal or near-optimal solutions to the puzzle. The puzzle validation mechanism may further generate one or more statistics associated with the puzzle that may be used to evaluate solutions when the puzzle is deployed for gameplay. The mechanisms disclosed allow for deployment of confirmed valid puzzles, either as a standalone puzzle or as a puzzle incorporated in a video game.
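A minimal sketch of validation by constrained path traversal: extend the partial solution one selection at a time, allowing only selections that keep every constraint satisfied, and report the puzzle as valid if any complete path exists. The toy constraint (no two adjacent cells share a value) is an invented stand-in for a real puzzle definition.

```python
# Sketch: backtracking traversal that only follows valid selections.
from typing import List, Optional

def valid_selections(partial: List[int], choices: List[int]) -> List[int]:
    if not partial:
        return list(choices)
    return [c for c in choices if c != partial[-1]]   # toy adjacency constraint

def find_solution(length: int, choices: List[int],
                  partial: Optional[List[int]] = None) -> Optional[List[int]]:
    partial = partial or []
    if len(partial) == length:
        return partial
    for choice in valid_selections(partial, choices):
        solution = find_solution(length, choices, partial + [choice])
        if solution is not None:
            return solution
    return None

def is_valid_puzzle(length: int, choices: List[int]) -> bool:
    return find_solution(length, choices) is not None

if __name__ == "__main__":
    print(is_valid_puzzle(5, [1, 2, 3]))   # True: at least one solution exists
    print(is_valid_puzzle(2, [7]))         # False: adjacent cells would repeat
```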

IPC Classes  ?

  • G06N 3/12 - Computing arrangements based on biological models using genetic models
  • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

98.

SYSTEMS AND METHODS FOR GENERATING A PROXY MESH FOR A POLYGONAL MESH THAT INCLUDES SUB-MESHES

      
Application Number 17735702
Status Pending
Filing Date 2022-05-03
First Publication Date 2022-10-13
Owner Electronic Arts Inc. (USA)
Inventor Mason, Ashton

Abstract

A method, device, and computer-readable storage medium for generating a proxy mesh. The method includes: receiving an input polygonal mesh that includes multiple sub-meshes, each of which is a polygonal mesh, where the input polygonal mesh is a computer representation of a three-dimensional (3D) object; generating a voxel volume representing the input polygonal mesh, wherein the voxel volume comprises voxels that approximate a shape of the 3D object, wherein a first set of voxels of the voxel volume includes voxels that are identified as boundary voxels that correspond to positions of polygons of the multiple sub-meshes of the input polygonal mesh; determining a grouping of two or more sub-meshes that together enclose one or more voxels of the voxel volume other than the voxels in the first set of voxels; and generating a proxy mesh corresponding to the input polygonal mesh based on the grouping of two or more sub-meshes.

IPC Classes  ?

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation

99.

Creating and exporting graphical user interfaces for video game runtime environments

      
Application Number 17217722
Grant Number 11687351
Status In Force
Filing Date 2021-03-30
First Publication Date 2022-10-06
Grant Date 2023-06-27
Owner ELECTRONIC ARTS INC. (USA)
Inventor
  • Popa, Adrian-Ciprian
  • Cowan, Timothy J.
  • Hayes, Jonathan Douglas

Abstract

Systems and methods for creating graphical user interfaces (GUIs) for runtime execution in virtual environments of software, such as video games. The system utilizes mock GUIs, which can be images illustrating or displaying mocked graphical user interfaces, to create GUIs that can be exported into runtime environments of software. The system creates GUIs by analyzing the graphical elements and attributes of mock GUIs, and assigning functionality to those graphical elements, enabling the operating of the GUIs within executable runtime environments.

IPC Classes  ?

  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
  • G06F 9/451 - Execution arrangements for user interfaces
  • A63F 13/533 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu

100.

Automated controller configuration recommendation system

      
Application Number 16997798
Grant Number 11458388
Status In Force
Filing Date 2020-08-19
First Publication Date 2022-10-04
Grant Date 2022-10-04
Owner Electronic Arts Inc. (USA)
Inventor Kestell, Stephen Roger

Abstract

Various aspects of the subject technology relate to systems, methods, and machine-readable media for adjusting controller settings. The method includes receiving, through a controller associated with a user, controller input for software. The method also includes determining, based on the controller input, a user profile for the user comprising at least a skill level and an input tendency of the user. The method also includes providing suggested adjustments to the controller settings intended to improve performance of the user in relation to the software, the controller settings comprising at least one of controller sensitivity or controller assignments. The method also includes receiving approval of the user to implement the suggested adjustments to the controller settings. The method also includes adjusting the controller settings based on the approval of the user.

IPC Classes  ?

  • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
  • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
  • G06N 20/00 - Machine learning
  • A63F 13/22 - Setup operations, e.g. calibration, key configuration or button assignment