Warner Bros. Entertainment Inc.

United States of America


1-100 of 166 for Warner Bros. Entertainment Inc.
Patent
United States - USPTO
Excluding Subsidiaries
Date
New (last 4 weeks) 2
2024 April (MTD) 2
2024 March 1
2024 (YTD) 3
2023 12
IPC Class
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 16
H04N 21/81 - Monomedia components thereof 13
G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier 11
H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking 10
H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS] 10
Status
Pending 31
Registered / In Force 135

1.

GESTURE RECOGNITION DEVICE AND METHOD FOR SENSING MULTI-FACTOR ASSERTION

Application Number 18334045
Status Pending
Filing Date 2023-06-13
First Publication Date 2024-04-18
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Nocon, Nathan
  • McWilliams, Tom

Abstract

A gesture-recognition (GR) device is disclosed that includes a capacitive touch sensor panel and a controller. The capacitive touch sensor panel comprises a plurality of sensing pads arranged in a cylindrical pattern inside a handle of the GR device and detects a multi-factor touch assertion at a set of sensing pads of the plurality of sensing pads. The controller transmits a driving signal to each of the plurality of sensing pads for the detection of the multi-factor touch assertion, generates an assertion signal, determines a signal sequence based on the assertion signal, and converts a current inactive state of the GR device to an active state based on a validation of the determined signal sequence corresponding to the multi-factor touch assertion and an inferred user intent.
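The state conversion this abstract describes can be sketched in code. A minimal, hypothetical sketch (not the patented implementation): the pad IDs, the expected signal sequence, and the validation rule below are all illustrative assumptions.

```python
# Hypothetical sketch: the device stays inactive until the sequence of
# touched sensing pads matches an expected multi-factor pattern.
EXPECTED_SEQUENCE = [frozenset({0, 1}), frozenset({1, 2}), frozenset({2, 3})]

def validate_assertion(pad_events):
    """pad_events: sets of pad IDs touched at successive sample steps."""
    observed = [frozenset(e) for e in pad_events if e]  # skip empty samples
    return observed == EXPECTED_SEQUENCE

class GRDevice:
    def __init__(self):
        self.active = False

    def on_assertion(self, pad_events):
        # Convert the inactive state to active only on a validated
        # signal sequence.
        if not self.active and validate_assertion(pad_events):
            self.active = True
        return self.active
```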

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63H 33/22 - Optical, colour, or shadow toys
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
  • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

2.

NEMESIS CHARACTERS, NEMESIS FORTS, SOCIAL VENDETTAS AND FOLLOWERS IN COMPUTER GAMES

Application Number 18144798
Status Pending
Filing Date 2023-05-08
First Publication Date 2024-04-11
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • De Plater, Michael
  • Hoge, Christopher Herman
  • Roberts, Robert Kenyon Hull
  • Valerius, Daniel Paul
  • Newton, Rocky Albert
  • Stephens, Kevin Leslie

Abstract

Methods for managing non-player characters and power centers in a computer game are based on character hierarchies and individualized correspondences between each character's traits or rank and events that involve other non-player characters or objects. Players may share power centers, character hierarchies, non-player characters, and related quests involving the shared objects with other players playing separate and unrelated game instances over a computer network, with the outcome of the quests reflected in the different games. Various configurations of game machines are used to implement the methods.
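The individualized correspondences between a character and game events can be illustrated with a small sketch. This is a hypothetical illustration of the general idea only, not the patented system; the class, the event handler, and the trait strings are invented for the example.

```python
# Hypothetical sketch: each non-player character carries its own rank
# and traits, and an event involving it updates it individually.
class NemesisCharacter:
    def __init__(self, name, rank, traits=()):
        self.name, self.rank, self.traits = name, rank, set(traits)

def on_player_defeated_by(victor, hierarchy):
    # The victor of this event gains rank and a memory trait, and the
    # character hierarchy re-sorts around the new rank.
    victor.rank += 1
    victor.traits.add("defeated the player")
    hierarchy.sort(key=lambda c: c.rank, reverse=True)
    return hierarchy
```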

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level

3.

SYSTEM AND METHOD FOR GENERATING VIDEO CONTENT WITH HUE-PRESERVATION IN VIRTUAL PRODUCTION

Application Number 18451979
Status Pending
Filing Date 2023-08-18
First Publication Date 2024-03-07
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Zink, Michael
  • Smith, Michael

Abstract

A system is provided for generating video content with hue-preservation in virtual production. The system comprises a memory for storing instructions and a processor configured to execute the instructions. Based on the executed instructions, the processor is further configured to control a saturation of scene linear data based on mapping of a first color gamut corresponding to a first encoding format of raw data to a second color gamut corresponding to a defined color space. The processor is further configured to determine a standard dynamic range (SDR) video content in the defined color space based on the scene linear data. Based on a scaling factor that is applied to three primary color values that describe the first color gamut, hue of the SDR video content is preserved.
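The hue-preserving scaling step can be illustrated with a short sketch of the principle the abstract states: one common factor applied to all three primary values leaves their ratios, and hence the hue, unchanged. The 1.0 SDR ceiling and the function name are assumptions for illustration, not the patented method.

```python
# Illustrative sketch: bring an out-of-range RGB triple into SDR range
# with a single scaling factor so the R:G:B ratios are preserved.
def scale_to_sdr(rgb, ceiling=1.0):
    peak = max(rgb)
    if peak <= ceiling:
        return tuple(rgb)
    s = ceiling / peak          # one common factor for R, G and B
    return tuple(c * s for c in rgb)
```

Because every channel is multiplied by the same factor, the channel ratios are unchanged, which is what preserves hue; per-channel clipping, by contrast, would shift the hue.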

IPC Classes

  • H04N 9/64 - Circuits for processing colour signals
  • H04N 9/67 - Circuits for processing colour signals for matrixing
  • H04N 9/73 - Colour balance circuits, e.g. white balance circuits or colour temperature control
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component

4.

CONTROL OF SOCIAL ROBOT BASED ON PRIOR CHARACTER PORTRAYAL

Application Number 18130405
Status Pending
Filing Date 2023-04-03
First Publication Date 2023-11-23
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewecke, Thomas
  • Colf, Victoria Lynn
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.

Abstract

A method and apparatus for controlling a social robot includes providing a set of quantitative personality trait values, also called a “personality profile,” to a decision engine of the social robot. The personality profile is derived from a character portrayal in a fictional work or dramatic performance, or from a real-life person (any one of these sometimes referred to herein as a “source character”). The decision engine controls social responses of the social robot to environmental stimuli, based in part on the set of personality trait values. The social robot thereby behaves in a manner consistent with the personality profile for the profiled source character.
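A decision engine driven by quantitative trait values can be sketched as follows. This is a hypothetical illustration only: the trait names, the scoring rule, and the candidate-response structure are invented for the example and are not the patented decision engine.

```python
# Hypothetical sketch: score candidate social responses against a
# quantitative personality profile and pick the best-aligned one.
def score(response_traits, profile):
    # Higher when the response's trait weights align with the profile's
    # quantitative trait values.
    return sum(profile.get(trait, 0.0) * weight
               for trait, weight in response_traits.items())

def choose_response(candidates, profile):
    return max(candidates, key=lambda c: score(c["traits"], profile))
```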

IPC Classes

  • B25J 11/00 - Manipulators not otherwise provided for
  • B25J 9/16 - Programme controls
  • G06N 3/008 - Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour

5.

System and method for controlling digital cinematic content based on emotional state of characters

Application Number 17963624
Grant Number 11822719
Status In Force
Filing Date 2022-10-11
First Publication Date 2023-11-21
Grant Date 2023-11-21
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Chappell, III, Arvel
  • Zink, Michael
  • Nguyen, Ha
  • Mosher, Clayton
  • Veeder, Andrew
  • Martinez, Michael
  • Sun, Shanshan
  • Lake-Schaal, Gary

Abstract

Provided is a system for controlling digital cinematic content based on emotional state of characters. A focus on one or more computer-controlled characters appearing in digital cinematic content is determined based on emotion indicators of a first user actively interacting with at least the one or more computer-controlled characters. A set of emotion indicators is inferred for each of the one or more computer-controlled characters based on one or more criteria and multifactor feedback loops are created. A story line of the digital cinematic content and behavioural characteristics of the one or more computer-controlled characters are controlled to achieve a target emotional arc of the first user based on the multifactor feedback loops.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

6.

CINEMATIC MASTERING FOR VIRTUAL REALITY AND AUGMENTED REALITY

Application Number 17888853
Status Pending
Filing Date 2022-08-16
First Publication Date 2023-10-19
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Defaria, Christopher
  • Gewickey, Gregory I.
  • Smith, Michael
  • Ostrover, Lewis S.

Abstract

An entertainment system provides data to a common screen (e.g., cinema screen) and personal immersive reality devices. For example, a cinematic data distribution server communicates with multiple immersive output devices each configured for providing immersive output (e.g., a virtual reality output) based on a data signal. Each of the multiple immersive output devices is present within eyesight of a common display screen. The server configures the data signal based on digital cinematic master data that includes immersive reality data. The server transmits the data signal to the multiple immersive output devices contemporaneously with each other, and optionally contemporaneously with providing a coordinated audio-video signal for output via the common display screen and shared audio system.

IPC Classes

  • H04N 21/81 - Monomedia components thereof
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/214 - Specialised server platform, e.g. server located in an airplane, hotel or hospital
  • H04N 21/2225 - Local VOD servers
  • H04N 21/433 - Content storage operation, e.g. storage operation in response to a pause request or caching operations
  • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or inside the home

7.

METHODS FOR CONTROLLING SCENE, CAMERA AND VIEWING PARAMETERS FOR ALTERING PERCEPTION OF 3D IMAGERY

Application Number 17968722
Status Pending
Filing Date 2022-10-18
First Publication Date 2023-09-14
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nolan, Christopher E.
  • Collar, Bradley T.
  • Smith, Michael D.

Abstract

Mathematical relationships between the scene geometry, camera parameters, and viewing environment are used to control stereography to obtain various results influencing the viewer's perception of 3D imagery. The methods may include setting a horizontal shift, convergence distance, and camera interaxial parameter to achieve various effects. The methods may be implemented in a computer-implemented tool for interactively modifying scene parameters during a 2D-to-3D conversion process, which may then trigger the re-rendering of the 3D content on the fly.
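The kind of mathematical relationship the abstract refers to can be illustrated with the textbook disparity relation for parallel stereo cameras with horizontal image translation. The formula below is standard stereography, not necessarily the patented method, and the symbols are illustrative.

```python
# Illustrative sketch: net sensor disparity for parallel cameras after
# horizontal image translation to a chosen convergence distance.
def disparity(depth_z, interaxial_b, focal_f, convergence_zc):
    # Zero at the convergence distance, positive (in front of the
    # screen) for nearer objects, negative (behind it) for farther ones.
    return focal_f * interaxial_b * (1.0 / depth_z - 1.0 / convergence_zc)
```

Increasing the interaxial parameter scales all disparities (deepening the 3D effect), while moving the convergence distance shifts the whole scene forward or back relative to the screen plane.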

IPC Classes

  • H04N 13/189 - Recording image signals; Reproducing recorded image signals
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
  • H04N 13/204 - Image signal generators using stereoscopic image cameras
  • H04N 13/128 - Adjusting depth or disparity

8.

Portal device and cooperating video game machine

Application Number 17962237
Grant Number 11766607
Status In Force
Filing Date 2022-10-07
First Publication Date 2023-05-11
Grant Date 2023-09-26
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Burton, Jon

Abstract

A portal device for a video game includes a pad with different zones that can be illuminated by selectable colors, a toy sensor (e.g., an RFID tag sensor) associated with each zone, a controller and a communications port for communicating with a video game process executing on a game machine. The colors of each zone can be configured to one or a combination of three primary colors during game play, based on the game process. The portal device reacts to placement of tagged toys on zones and the color of the zones during play and provides sensor data to the game process. The game process controls the game state in part based on data from the portal device and in part on other user input.
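The portal behaviour the abstract describes can be sketched with a small class. This is a hypothetical illustration: the zone names, the default color, and the event tuple format are invented for the example, not the patented device.

```python
# Hypothetical sketch: illuminated zones with a tag sensor per zone,
# reporting (zone, tag, zone color) events to the game process.
class Portal:
    PRIMARIES = {"red", "green", "blue"}

    def __init__(self, zones):
        # Each zone starts on a default single primary color.
        self.zone_color = {z: {"red"} for z in zones}
        self.events = []

    def set_zone_color(self, zone, colors):
        colors = set(colors)
        if not colors or not colors <= self.PRIMARIES:
            raise ValueError("zone color is one primary or a combination")
        self.zone_color[zone] = colors

    def place_toy(self, zone, tag_id):
        # Sensor data reported to the game process: which tagged toy was
        # placed on which zone, and the zone's color at that moment.
        event = (zone, tag_id, frozenset(self.zone_color[zone]))
        self.events.append(event)
        return event
```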

IPC Classes

  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/822 - Strategy games; Role-playing games 
  • G07F 17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements

9.

MIXED REALITY SYSTEM FOR CONTEXT-AWARE VIRTUAL OBJECT RENDERING

Application Number 17979946
Status Pending
Filing Date 2022-11-03
First Publication Date 2023-04-20
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Smith, Michael
  • Ostrover, Lewis

Abstract

A computer-implemented method in conjunction with mixed reality gear (e.g., a headset) includes imaging a real scene encompassing a user wearing a mixed reality output apparatus. The method includes determining data describing a real context of the real scene, based on the imaging; for example, identifying or classifying objects, lighting, sound or persons in the scene. The method includes selecting a set of content including content enabling rendering of at least one virtual object from a content library, based on the data describing a real context, using various selection algorithms. The method includes rendering the virtual object in the mixed reality session by the mixed reality output apparatus, optionally based on the data describing a real context (“context parameters”). An apparatus is configured to perform the method using hardware, firmware, and/or software.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G06V 20/10 - Terrestrial scenes
  • A63F 13/217 - Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G06T 7/60 - Analysis of geometric attributes

10.

MATCHING MOUTH SHAPE AND MOVEMENT IN DIGITAL VIDEO TO ALTERNATIVE AUDIO

Application Number 17855801
Status Pending
Filing Date 2022-07-01
First Publication Date 2023-04-20
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Stratton, Tom David
  • Lile, Shaun

Abstract

A method for matching mouth shape and movement in digital video to alternative audio includes deriving a sequence of facial poses including mouth shapes for an actor from a source digital video. Each pose in the sequence of facial poses corresponds to a middle position of each audio sample. The method further includes generating an animated face mesh based on the sequence of facial poses and the source digital video, transferring tracked expressions from the animated face mesh or the target video to the source video, and generating a rough output video that includes transfers of the tracked expressions. The method further includes generating a finished video at least in part by refining the rough video using a parametric autoencoder trained on mouth shapes in the animated face mesh or the target video. One or more computers may perform the operations of the method.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 13/20 - 3D [Three Dimensional] animation
  • G10L 13/047 - Architecture of speech synthesisers
  • G10L 15/16 - Speech classification or search using artificial neural networks

11.

System and method for generating video content with hue-preservation in virtual production

Application Number 17902136
Grant Number 11736670
Status In Force
Filing Date 2022-09-02
First Publication Date 2023-02-16
Grant Date 2023-08-22
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Zink, Michael
  • Smith, Michael

Abstract

A system is provided for generating video content with hue-preservation in virtual production. The system comprises a memory for storing instructions and a processor configured to execute the instructions. Based on the executed instructions, the processor is further configured to control a saturation of scene linear data based on mapping of a first color gamut corresponding to a first encoding format of raw data to a second color gamut corresponding to a defined color space. The processor is further configured to determine a standard dynamic range (SDR) video content in the defined color space based on the scene linear data. Based on a scaling factor that is applied to three primary color values that describe the first color gamut, hue of the SDR video content is preserved.

IPC Classes

  • H04N 9/64 - Circuits for processing colour signals
  • H04N 9/73 - Colour balance circuits, e.g. white balance circuits or colour temperature control
  • H04N 9/67 - Circuits for processing colour signals for matrixing
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component

12.

CONTROLLING PROGRESS OF AUDIO-VIDEO CONTENT BASED ON SENSOR DATA OF MULTIPLE USERS, COMPOSITE NEURO-PHYSIOLOGICAL STATE AND/OR CONTENT ENGAGEMENT POWER

Application Number 17963741
Status Pending
Filing Date 2022-10-11
First Publication Date 2023-02-16
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Chappell, III, Arvel
  • Zink, Michael
  • Nguyen, Ha
  • Mosher, Clayton
  • Veeder, Andrew
  • Martinez, Michael
  • Sun, Shanshan
  • Lake-Schaal, Gary

Abstract

Provided is a system for controlling progress of audio-video content based on sensor data of multiple users, composite neuro-physiological state (CNS) and/or content engagement power (CEP). Sensor data is received from sensors positioned on an electronic device of a first user to sense neuro-physiological responses of the first user and of second users that are in the field-of-view (FOV) of the sensors. Based on the sensor data and at least one of a CNS value for a social interaction application and a CEP value for immersive content, recommendations of action items for the first user are predicted. Content of a feedback loop, created based on the sensor data, the CNS value, the CEP value, and the predicted recommendations, is rendered on an output unit of the electronic device during play of the at least one of the social interaction application and the immersive content experience. Progress of the social interaction and immersive content experience is controlled by the first user based on the predicted recommendations.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

13.

Geometry matching in virtual reality and augmented reality

Application Number 17839475
Grant Number 11863845
Status In Force
Filing Date 2022-06-13
First Publication Date 2023-02-02
Grant Date 2024-01-02
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Smith, Michael

Abstract

Methods, apparatus and systems for geometric matching of virtual reality (VR) or augmented reality (AR) output contemporaneously with video output formatted for display on a 2D screen include a determination of value sets that when used in image processing cause an off-screen angular field of view of the at least one of the AR output object or the VR output object to have a fixed relationship to at least one of the angular field of view of the onscreen object or of the 2D screen. The AR/VR output object is outputted to an AR/VR display device and the user experience is improved by the geometric matching between objects observed on the AR/VR display device and corresponding objects appearing on the 2D screen.
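The fixed angular relationship can be illustrated numerically. A minimal sketch under assumed pinhole geometry: `angular_size` is the angle subtended by a flat object seen head-on, and `match_scale` is a hypothetical helper (not the patented computation) that solves for the scale factor making an off-screen object subtend a chosen ratio of the 2D screen's angular field of view.

```python
import math

def angular_size(width, distance):
    # Angle subtended by an object of the given width viewed head-on.
    return 2.0 * math.atan(width / (2.0 * distance))

def match_scale(obj_width, obj_distance, screen_width, screen_distance,
                ratio=1.0):
    # Scale factor so the object subtends ratio * the screen's angle.
    target = ratio * angular_size(screen_width, screen_distance)
    matched_width = 2.0 * obj_distance * math.tan(target / 2.0)
    return matched_width / obj_width
```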

IPC Classes

  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 21/81 - Monomedia components thereof
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G06T 15/20 - Perspective computation
  • H04N 13/30 - Image reproducers

14.

GENERATION AND USE OF USER-SELECTED SCENES PLAYLIST FROM DISTRIBUTED DIGITAL CONTENT

Application Number 17856952
Status Pending
Filing Date 2022-07-02
First Publication Date 2023-01-26
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Lau, Kim
  • Frautschi, Jacob
  • Gasparri, Massimiliano
  • Lee, Randy
  • Harman, Patrick

Abstract

A digital content package includes first content comprising a video feature such as a motion picture or the like, and a user-selectable application configured to operate as follows. When activated using an icon off of a menu screen, the application records an identifier for scenes (discrete portions) of the first content that are selected by a user to generate a playlist. The user may select the scenes by indicating a start and end of each scene. The application saves the playlist locally, then uploads to a server. Via a user account at the server, a user may publish the playlist to a user-created distribution list, webpage, or other electronic publication, and modify the playlist by deleting or reordering scenes.
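The record/save/modify workflow can be sketched with a small class. The field and method names below are hypothetical, an illustration of the described workflow rather than the patented application.

```python
# Hypothetical sketch: scenes are user-marked (start, end) spans of the
# first content; the playlist can then be reordered or trimmed.
class ScenePlaylist:
    def __init__(self, content_id):
        self.content_id = content_id
        self.scenes = []          # list of (start_sec, end_sec)

    def mark_scene(self, start, end):
        if end <= start:
            raise ValueError("scene end must follow its start")
        self.scenes.append((start, end))

    def delete(self, index):
        del self.scenes[index]

    def reorder(self, order):
        self.scenes = [self.scenes[i] for i in order]
```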

IPC Classes

  • G11B 27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 21/2743 - Video hosting of uploaded data from client
  • H04N 21/426 - Internal components of the client
  • H04N 21/432 - Content retrieval operation from a local storage medium, e.g. hard-disk
  • H04N 21/482 - End-user interface for program selection
  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk

15.

Perfless and cadenceless scanning and digitization of motion picture film

Application Number 17856837
Grant Number 11902688
Status In Force
Filing Date 2022-07-01
First Publication Date 2023-01-19
Grant Date 2024-02-13
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Klevatt, Steven Craig
  • Borquez, Michael Charles
  • Gillaspie, Christopher Eugene

Abstract

Motion picture film is scanned by high-resolution, continuous sprocketless scanner, producing a first sequence of digital images each representing a plurality of motion picture frames and perforations (perfs) of the film input. The first sequence of images is processed using a processor running an analysis and extraction algorithm, producing a second sequence of images each including a single, edge-stabilized frame of the motion picture.

IPC Classes

  • H04N 3/38 - Scanning of motion picture films, e.g. for telecine with continuously moving film
  • G06F 9/448 - Execution paradigms, e.g. implementations of programming paradigms

16.

SOCIAL ROBOT WITH ENVIRONMENTAL CONTROL FEATURE

Application Number 17842730
Status Pending
Filing Date 2022-06-16
First Publication Date 2022-12-15
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.

Abstract

A method and apparatus for controlling a social robot includes operating an electronic output device based on social interactions between the social robot and a user. The social robot utilizes an algorithm or other logical solution process to infer a user mental state, for example a mood or desire, based on observation of the social interaction. Based on the inferred mental state, the social robot causes an action of the electronic output device to be selected. Actions may include, for example, playing a selected video clip, brewing a cup of coffee, or adjusting window blinds.
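The state-to-action mapping can be sketched trivially. The inferred-state labels below are hypothetical; the three actions come from the abstract's own examples.

```python
# Hypothetical sketch: an inferred user mental state selects an action
# on a connected electronic output device.
ACTIONS = {
    "bored": "play selected video clip",
    "sleepy": "brew a cup of coffee",
    "glare-annoyed": "adjust window blinds",
}

def select_action(inferred_state, default="do nothing"):
    return ACTIONS.get(inferred_state, default)
```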

IPC Classes

  • B25J 11/00 - Manipulators not otherwise provided for
  • B25J 13/00 - Controls for manipulators
  • G06N 3/00 - Computing arrangements based on biological models

17.

CONTROLLING CHARACTERISTICS OF LIGHT OUTPUT FROM LED WALLS

Application Number 17685314
Status Pending
Filing Date 2022-03-02
First Publication Date 2022-12-01
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Zink, Michael
  • Smith, Michael

Abstract

A computer-generated scene is generated as background for a live action set, for display on a panel of light emitting diodes (LEDs). Characteristics of light output by the LED panel are controlled such that the computer-generated scene rendered on the LED panel, when captured by a motion picture camera, has high fidelity to the original computer-generated scene. Consequently, the scene displayed on the screen more closely simulates the rendered scene from the viewpoint of the camera. Thus, a viewpoint captured by the camera appears more realistic and/or truer to the creative intent.

IPC Classes

  • G06F 3/147 - Digital output to display device using display panels
  • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
  • G06T 7/60 - Analysis of geometric attributes
  • G06F 3/14 - Digital output to display device

18.

Inter-vehicle electronic games

Application Number 17432799
Grant Number 11911693
Status In Force
Filing Date 2020-02-21
First Publication Date 2022-11-03
Grant Date 2024-02-27
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Lake-Schaal, Gary
  • Allison, Pamela J.
  • Ostrover, Lewis S.
  • Gewickey, Gregory I.
  • Nguyen, Ha
  • Abhyankar, Prashant

Abstract

A computer-implemented method provides electronic games for play by a group of users in two or more moving vehicles. The method includes maintaining data structures of media program data, user profile data and vehicle profile data, receiving user and vehicle state information, identifying a group of users based on contemporaneous presence in two or more vehicles or common participation in a game or other group experience for related trips at different times, and selecting, configuring or creating a media program for play at media players. An apparatus or system is configured to perform the method and related operations.

IPC Classes

  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
  • H04W 4/46 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
  • H04W 4/21 - Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications
  • H04W 4/08 - User group management

19.

SCANNER NOISE ELIMINATION FOR SCANNED FILMS

Application Number 17723231
Status Pending
Filing Date 2022-04-18
First Publication Date 2022-11-03
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Smith, Michael
  • Zink, Michael
  • Nolan, Christopher

Abstract

A method for preparing digital image data from an analog image input by scanning, and reducing visibility of the scanning noise, may include estimating a visibility of scanning noise, and a number of scanning samples needed to reduce scanning noise to below a visible threshold. Related methods include scanning, by an analog-to-digital image scanner, an analog image for multiple iterations, resulting in digital image data for each of the iterations; calculating a noise statistic for individual pixels of digital image data across the iterations; determining true values of individual pixels of the digital image data based on the noise statistic for each of the individual pixels and generating scanner noise reduced digital image data wherein pixels are assigned their respective ones of the true values; and saving the scanner noise reduced digital image data in a computer memory.
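Both steps the abstract names, estimating the number of scanning samples and combining repeated scans per pixel, can be sketched. Assumptions for illustration: noise is independent across scans (so averaging N scans reduces its standard deviation by a factor of sqrt(N)), and the per-pixel statistic is the median; the actual pipeline may use a different statistic.

```python
import math
from statistics import median

def samples_needed(noise_sigma, visible_threshold):
    # Averaging N independent scans divides noise by sqrt(N); solve for
    # the smallest N that pushes noise below the visible threshold.
    return math.ceil((noise_sigma / visible_threshold) ** 2)

def denoise_scans(scans):
    # scans: list of equally sized 2D pixel grids from repeated passes
    # over the same analog frame. Returns the per-pixel median, taken
    # as the "true value" of each pixel.
    rows, cols = len(scans[0]), len(scans[0][0])
    return [[median(scan[r][c] for scan in scans) for c in range(cols)]
            for r in range(rows)]
```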

IPC Classes

  • G06T 5/00 - Image enhancement or restoration
  • G06T 5/20 - Image enhancement or restoration by the use of local operators

20.

System and method for generating video content with hue-preservation in virtual production

Application Number 17393858
Grant Number 11457187
Status In Force
Filing Date 2021-08-04
First Publication Date 2022-09-27
Grant Date 2022-09-27
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Zink, Michael
  • Smith, Michael

Abstract

A system is provided for generating video content with hue-preservation in virtual production. A processor determines data in a scene-based encoding format based on raw data received in a pre-defined format. The raw data includes a computer-generated background rendered on a rendering panel and a foreground object. Based on the data in the scene-based encoding format, scene linear data is determined. A saturation of the scene linear data is controlled when a first color gamut corresponding to the pre-defined format is mapped to a second color gamut corresponding to a display-based encoding color space. Based on the scene linear data, a standard dynamic range (SDR) video content in the display-based encoding color space is determined. Hue of the SDR video content is preserved, when the rendering panel is in-focus or out-of-focus, based on a scaling factor that is applied to three primary color values that describe the first color gamut.

IPC Classes

  • H04N 9/64 - Circuits for processing colour signals
  • H04N 9/73 - Colour balance circuits, e.g. white balance circuits or colour temperature control
  • H04N 9/67 - Circuits for processing colour signals for matrixing
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component

21.

HARDWARE FOR ENTERTAINMENT CONTENT IN VEHICLES

      
Application Number 17713185
Status Pending
Filing Date 2022-04-04
First Publication Date 2022-09-22
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Nguyen, Ha
  • Brahma, Rana
  • Abhyankar, Prashant
  • Schoepf, Nicole
  • Morris, Genevieve
  • Zink, Michael
  • Smith, Michael
  • Lake-Schaal, Gary

Abstract

Systems and computer-implemented methods are disclosed for providing social entertainment experiences in a moving vehicle via an apparatus that simulates human social behavior relevant to a journey undertaken by the vehicle, for displaying human-perceivable exterior communication on the moving vehicle to neighboring vehicles and/or pedestrians, and for providing a modular travel experience.

IPC Classes

  • H04N 21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/4788 - Supplemental services, e.g. displaying phone caller identification or shopping application communicating with other users, e.g. chatting

22.

HETEROGENOUS GEOMETRY CACHING FOR REAL-TIME SIMULATED FLUIDS

      
Application Number 17630440
Status Pending
Filing Date 2020-07-27
First Publication Date 2022-09-15
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Nadro, Jason

Abstract

A method for simulating fluid surfaces in real-time in response to user input includes detecting interactive conditions triggering insertion of a heterogeneous mesh sequence in a 3D model sequence for rendering, fetching successive members of the heterogeneous mesh sequence from a computer memory, inserting the successive members in corresponding representations of the 3D model sequence in a computer memory, and rendering successive video frames from the representations of the 3D model sequence, each including a successive member of the heterogeneous mesh sequence. A related method for generating a compact heterogeneous mesh sequence for use in rendering corresponding frames of video includes generating a heterogeneous mesh sequence modeling the response of a fluid surface to physical forces, the heterogeneous mesh sequence characterized by position values represented in computer memory by not less than 12 bytes for each vertex thereof, transforming the heterogeneous mesh sequence into the compact heterogeneous mesh sequence, at least in part by quantizing the position values to not greater than four bytes, and storing the compact heterogeneous mesh sequence in a computer memory for use in real-time rendering.
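The quantization described above (position values of not less than 12 bytes, i.e. three 32-bit floats per vertex, compacted to not greater than four bytes) can be illustrated with a hypothetical 11/11/10-bit fixed-point packing over a bounding box. The bit split and function names are assumptions for illustration only:

```python
def quantize_vertex(pos, box_min, box_max, bits=(11, 11, 10)):
    """Pack a 3-float vertex position (12 bytes) into one 32-bit integer.

    Each coordinate is normalized against its bounding-box extent and
    stored as fixed point; 11 + 11 + 10 bits fit exactly in four bytes.
    """
    packed, shift = 0, 0
    for p, lo, hi, b in zip(pos, box_min, box_max, bits):
        levels = (1 << b) - 1
        t = 0.0 if hi == lo else (p - lo) / (hi - lo)
        q = round(min(max(t, 0.0), 1.0) * levels)
        packed |= q << shift
        shift += b
    return packed

def dequantize_vertex(packed, box_min, box_max, bits=(11, 11, 10)):
    """Recover an approximate position from the packed 32-bit value."""
    out, shift = [], 0
    for lo, hi, b in zip(box_min, box_max, bits):
        levels = (1 << b) - 1
        q = (packed >> shift) & levels
        out.append(lo + (q / levels) * (hi - lo))
        shift += b
    return tuple(out)
```

The round trip is lossy, which is the point of the compaction: reconstruction error is bounded by half a quantization step per axis.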

IPC Classes

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 30/23 - Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]

23.

Efficient real-time shadow rendering

      
Application Number 17630145
Grant Number 11908062
Status In Force
Filing Date 2020-07-27
First Publication Date 2022-08-18
Grant Date 2024-02-20
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Li, Bo

Abstract

A method for real-time shadow rendering using cached shadow maps and deferred shading by a video processor of a game console or the like includes, for at least each key frame of video output, determining a viewpoint for a current key frame based on user input, filtering a texel of a frame-specific shadow map based on a dynamic mask wherein the texel is filtered, for a shadowed light, from a static shadow map and a dynamic shadow map or from the static shadow map only, based on the dynamic mask value for the texel, and rendering the current key frame based on the frame-specific shadow map and a deferred-shadow rendering algorithm. The method enables efficient rendering of thousands of shadowed lights in large environments by consumer-grade game consoles.
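The per-texel selection driven by the dynamic mask can be sketched as follows, assuming a conventional depth-map scheme where the smaller depth is the nearer occluder. The names and the min-combination rule are illustrative assumptions, not the patent's specified filter:

```python
def filter_shadow_texel(static_depth, dynamic_depth, mask):
    """Select the effective shadow-map depth for one texel.

    A texel covered only by static geometry (mask == 0) reuses the
    cached static shadow map, avoiding per-frame recomputation. Where
    dynamic objects may cast shadows (mask == 1), the nearer of the
    static and per-frame dynamic depths wins, so either occluder can
    shadow the texel.
    """
    if not mask:
        return static_depth
    return min(static_depth, dynamic_depth)
```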

IPC Classes

24.

SENSITIVITY ASSESSMENT FOR MEDIA PRODUCTION USING ARTIFICIAL INTELLIGENCE

      
Application Number 17494582
Status Pending
Filing Date 2021-10-05
First Publication Date 2022-08-04
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Pau, Hitesh
  • Murillo, Geoffrey P.
  • Lund, Rajiv R.
  • Kumar, Anshul
  • Mehta, Tasha T.
  • Bringas, Alejandro
  • Kaur, Tarundeep
  • Tanita, Ty Y.

Abstract

An automatic flagging of sensitive portions of a digital dataset for media production includes receiving the digital dataset comprising at least one of audio data, video data, or audio-video data for producing at least one media program. A processor identifies sensitive portions of the digital dataset likely to be in one or more defined content classifications, based at least in part on comparing unclassified portions of the digital dataset with classified portions of the prior media production using an algorithm, and generates a plurality of sensitivity tags each signifying a sensitivity assessment for a corresponding one of the sensitive portions. The processor may save the plurality of sensitivity tags each correlated to its corresponding one of the sensitive portions in a computer memory for use by a media production or localization team.
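The comparison of unclassified portions against previously classified portions can be sketched with a hypothetical similarity-threshold tagger. The default token-set Jaccard similarity here is a stand-in for whatever algorithm the patent contemplates, and all names are illustrative:

```python
def tag_sensitive_portions(portions, classified, threshold=0.8, sim=None):
    """Flag portions of a dataset that resemble known sensitive material.

    Each unclassified portion is compared against previously classified
    samples; a similarity at or above the threshold yields a sensitivity
    tag naming the matched content classification.
    """
    # Default similarity: Jaccard index over token sets (illustrative).
    sim = sim or (lambda a, b: len(set(a) & set(b)) / max(len(set(a) | set(b)), 1))
    tags = []
    for idx, portion in enumerate(portions):
        for label, sample in classified:
            if sim(portion, sample) >= threshold:
                tags.append((idx, label))
                break  # first matching classification wins
    return tags
```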

IPC Classes

  • H04N 21/4545 - Input to filtering algorithms, e.g. filtering a region of the image
  • G06N 20/00 - Machine learning
  • H04N 21/431 - Generation of visual interfaces; Content or additional data rendering
  • H04N 21/466 - Learning process for intelligent management, e.g. learning user preferences for recommending movies

25.

TRIP-CONFIGURABLE CONTENT

      
Application Number 17311682
Status Pending
Filing Date 2019-12-06
First Publication Date 2022-07-14
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Herz, Justin
  • Zajac, William
  • Nguyen, Ha
  • Chappell, III, Arvell A.
  • Gewickey, Gregory
  • Allison, Pamela J.
  • Brahma, Rana
  • Abhyankar, Prashant
  • Zink, Michael
  • Novak, Shaun

Abstract

Methods and apparatus for personalizing a vehicle with a sensory output device include receiving, by one or more processors, a signal indicating an identity or passenger profile of a detected passenger in or boarding the vehicle, accessing preference data and geographic location data for the passenger, and selecting sensory content for delivery to the passenger in the vehicle based on the preference data and geographic location data. Methods and apparatus for producing video customized for a preference profile of a person or cohort include associating each of stored video clips with a set of characteristic parameters relating to user-perceivable characteristics, receiving user profile data relating to the person or cohort, selecting video clips from the data structure based at least partly on the user profile data, and automatically producing a video including the preferred video clips.

IPC Classes

  • H04N 21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/458 - Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules
  • H04N 21/8549 - Creating video summaries, e.g. movie trailer

26.

GESTURE-CENTRIC USER INTERFACE

      
Application Number 17687480
Status Pending
Filing Date 2022-03-04
First Publication Date 2022-06-23
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Gewickey, Gregory I
  • Zajac, William
  • Nguyen, Ha
  • Maliszewski, Sam

Abstract

A gesture-recognition (GR) device made to be held or worn by a user includes an electronic processor configured by program instructions in memory to recognize a gesture. The device or a cooperating system may match a gesture identifier to an action identifier for one or more target devices in a user's environment, enabling control of the target devices by user movement of the GR device in three-dimensional space.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

27.

SCALABLE SIMULATION AND AUTOMATED TESTING OF MOBILE VIDEOGAMES

      
Application Number 17602725
Status Pending
Filing Date 2020-04-10
First Publication Date 2022-06-02
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Azapian, Alejandro S.S.

Abstract

A method for evaluating performance of a video game by a computing device. The method includes a harness application independent of the device's execution context, and an agent application to simulate the player's actions.

IPC Classes

  • G06F 11/36 - Preventing errors by testing or debugging of software

28.

Perfless and cadenceless scanning and digitization of motion picture film

      
Application Number 17094807
Grant Number 11381713
Status In Force
Filing Date 2020-11-10
First Publication Date 2022-05-12
Grant Date 2022-07-05
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Klevatt, Steven Craig
  • Borquez, Michael Charles
  • Gillaspie, Christopher Eugene

Abstract

Motion picture film is scanned by a high-resolution, continuous sprocketless scanner, producing a first sequence of digital images each representing a plurality of motion picture frames and perforations (perfs) of the film input. The first sequence of images is processed using a processor running an analysis and extraction algorithm, producing a second sequence of images each including a single, edge-stabilized frame of the motion picture.

IPC Classes

  • H04N 3/38 - Scanning of motion picture films, e.g. for telecine with continuously moving film
  • G06F 9/448 - Execution paradigms, e.g. implementations of programming paradigms

29.

Gesture recognition device and method for sensing multi-factor assertion

      
Application Number 17547351
Grant Number 11698684
Status In Force
Filing Date 2021-12-10
First Publication Date 2022-04-21
Grant Date 2023-07-11
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nocon, Nathan
  • Mcwilliams, Tom

Abstract

A gesture-recognition (GR) device is disclosed that includes a capacitive touch sensor panel and a controller. The capacitive touch sensor panel comprises a plurality of sensing pads arranged in a cylindrical pattern inside a handle of the GR device and detects a multi-factor touch assertion at a set of sensing pads of the plurality of sensing pads. The controller transmits a driving signal to each of the plurality of sensing pads for the detection of the multi-factor touch assertion, generates an assertion signal, determines a signal sequence based on the assertion signal, and converts a current inactive state of the GR device to an active state based on a validation of the determined signal sequence corresponding to the multi-factor touch assertion and an inferred user intent.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
  • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63H 33/22 - Optical, colour, or shadow toys
  • A63H 30/04 - Electrical arrangements using wireless transmission

30.

Gesture recognition device with minimal wand form factor

      
Application Number 17547395
Grant Number 11907431
Status In Force
Filing Date 2021-12-10
First Publication Date 2022-03-31
Grant Date 2024-02-20
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nocon, Nathan
  • Hsu, Jonathan
  • Goslin, Michael

Abstract

Provided is a gesture-recognition (GR) device that includes a printed circuit board (PCB) and a controller. The circuit board has an aspect ratio exceeding a threshold value that corresponds to at least a 70 percent difference between length and width of the PCB. The PCB includes a first unit and a second unit. The first unit corresponds to a base unit to be grasped by hand of a user. The second unit corresponds to an elongate unit that extends outward from the first unit. The second unit is characterized by a minimal wand form factor. A rigid strength of the second unit is based on at least a shape of an outer shell and a structural attribute of the second unit. The controller controls illumination of a plurality of light sources mounted on the second unit of the circuit board based on assertion signals.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63H 33/22 - Optical, colour, or shadow toys
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
  • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
  • A63H 30/04 - Electrical arrangements using wireless transmission

31.

MANAGING STATES OF A GESTURE RECOGNITION DEVICE AND AN INTERACTIVE CASING

      
Application Number 17547443
Status Pending
Filing Date 2021-12-10
First Publication Date 2022-03-31
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Nocon, Nathan
  • Goslin, Michael
  • Hsu, Jonathan

Abstract

Provided is a system that includes an interactive casing and a GR device. The interactive casing receives a first signal based on activation of a masked electrical switch by release of a magnetic assertion when a lid member of an interactive casing is disengaged from a base member of the interactive casing. A first system state of the interactive casing is converted to a second system state. Audio-visual feedback is generated and a second signal is communicated to the GR device based on the conversion. Based on the received second signal, a first device state of the GR device is converted to a second device state. Power levels of a first power storage device of the interactive casing and a second power storage device of the GR device are maintained during the first system state and the first device state, respectively.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63H 33/26 - Magnetic or electric toys
  • A63H 33/22 - Optical, colour, or shadow toys
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

32.

Gesture recognition (GR) device with multiple light sources generating multiple lighting effects

      
Application Number 17547483
Grant Number 11914786
Status In Force
Filing Date 2021-12-10
First Publication Date 2022-03-31
Grant Date 2024-02-27
Owner Warner Bros. Entertainment Inc. (USA)
Inventor Nocon, Nathan

Abstract

Provided is a gesture recognition (GR) device that includes a circuit board on which a plurality of light sources are mounted. A first light source is side-mounted at a tip of a second unit of the circuit board, and the set of second light sources is mounted at right angles at top and bottom surfaces of the second unit. A first pair from the set of second light sources is positioned adjacent to the side-mounted first light source. The plurality of light sources are controlled to generate multiple lighting effects for the tip based on assertion signals generated at a first unit of the circuit board. A first lighting effect corresponds to a directional beam generated by the first light source. A set of second lighting effects, which remains unblocked by the side-mounted first light source, corresponds to a multi-color illumination generated by the set of second light sources.

IPC Classes

  • H05B 47/13 - Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using passive infrared detectors
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H05B 45/20 - Controlling the colour of the light
  • H05B 47/195 - Controlling the light source by remote control via wireless transmission the transmission using visible or infrared light
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63H 33/22 - Optical, colour, or shadow toys
  • A63H 33/26 - Magnetic or electric toys
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

33.

REAL-TIME ROUTE CONFIGURING OF ENTERTAINMENT CONTENT

      
Application Number 17417106
Status Pending
Filing Date 2019-12-18
First Publication Date 2022-03-10
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Zink, Michael
  • Nguyen, Ha
  • Allison, Pamela J.
  • Brahma, Rana
  • Abhyankar, Prashant
  • Morris, Genevieve

Abstract

A computer-implemented method for producing real-time media content customized for a travel event in a dynamic travel route between an origin and a destination of a connected vehicle. The method includes maintaining media production data, receiving information on a travel event including route information, producing media content based on the travel event and travel route information, and streaming the media content to connected cars and servers along the travel route.

IPC Classes

  • G01C 21/36 - Input/output arrangements for on-board computers
  • G06Q 50/14 - Travel agencies
  • G06Q 30/02 - Marketing; Price estimation or determination; Fundraising
  • G01C 21/34 - Route searching; Route guidance

34.

PROFILE-BASED STANDARD DYNAMIC RANGE AND HIGH DYNAMIC RANGE CONTENT GENERATION

      
Application Number 17417110
Status Pending
Filing Date 2019-12-19
First Publication Date 2022-03-10
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Zink, Michael
  • Nguyen, Ha

Abstract

A method for converting a source video content constrained to a first color space to a video content constrained to a second color space using an artificial intelligence machine-learning algorithm based on a creative profile.

IPC Classes

35.

SOCIAL VIDEO PLATFORM FOR GENERATING AND EXPERIENCING CONTENT

      
Application Number 17460159
Status Pending
Filing Date 2021-08-27
First Publication Date 2022-03-03
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Davis, Matthew Gordon
  • Redniss, Jesse I.

Abstract

Systems and methods described herein are configured to enhance the understanding and experience of news and live events in real-time. The systems and methods leverage a distributed network of professional and amateur journalists/correspondents using technology to create unique experiences and/or provide views and perspectives different from experiences, views and/or perspectives provided by existing newscasts and/or sportscasts.

IPC Classes

  • H04N 21/2187 - Live feed
  • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs
  • H04N 21/854 - Content authoring

36.

Partial frame replacement enabling multiple versions in an integrated video package

      
Application Number 17298562
Grant Number 11863818
Status In Force
Filing Date 2019-12-02
First Publication Date 2022-01-20
Grant Date 2024-01-02
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Lile, Shaun
  • Stratton, Tom David

Abstract

Multiple different versions of media content are contained in a single package of audio-video media content, using compression algorithms that reduce storage and bandwidth required for storing multiple full-resolution versions of the media. Portions of individual frames are replaced during playback so that only the pixels that differ between versions need to be stored.
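The store-only-the-differing-pixels idea can be illustrated with a minimal patch scheme over flat pixel arrays. The function names and the dict-based patch format are assumptions for illustration, not the patent's compression algorithm:

```python
def make_patch(base_frame, alt_frame):
    """Record only the pixels that differ between two frame versions."""
    return {i: px for i, (bp, px) in enumerate(zip(base_frame, alt_frame))
            if bp != px}

def apply_patch(base_frame, patch):
    """Reconstruct the alternate version from the base frame plus patch.

    During playback, only the differing pixels are replaced, so the
    package need not store a second full-resolution copy of the frame.
    """
    frame = list(base_frame)
    for i, px in patch.items():
        frame[i] = px
    return frame
```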

IPC Classes

  • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
  • H04N 21/235 - Processing of additional data, e.g. scrambling of additional data or processing content descriptors
  • H04N 21/262 - Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission or generating play-lists
  • H04N 21/8543 - Content authoring using a description language, e.g. MHEG [Multimedia and Hypermedia information coding Expert Group] or XML [eXtensible Markup Language]

37.

Rendering extended video in virtual reality

      
Application Number 17379480
Grant Number 11503265
Status In Force
Filing Date 2021-07-19
First Publication Date 2022-01-13
Grant Date 2022-11-15
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Zink, Michael
  • Cardenas, Mercedes Christine

Abstract

A method for transforming extended video data for display in virtual reality processes digital extended video data for display on a center screen and two auxiliary screens of a real extended video cinema. The method includes accessing, by a computer executing a rendering application, data that defines virtual screens including a center screen and auxiliary screens, wherein tangent lines to each of the auxiliary screens at their respective centers of area intersect with a tangent line to the center screen at its center of area at equal angles in a range of 75 to 105 degrees. The method includes preparing virtual extended video data at least in part by rendering the digital extended video on corresponding ones of the virtual screens; and saving the virtual extended video data in a computer memory. A corresponding playback method and apparatus display the processed data in virtual reality.
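The geometric constraint above, tangent lines of the auxiliary screens meeting the center screen's tangent at angles between 75 and 105 degrees, can be checked with a small 2D sketch. Direction vectors stand in for the tangent lines, and all names are illustrative:

```python
import math

def screens_valid(center_dir, aux_dirs, lo=75.0, hi=105.0):
    """Check each auxiliary screen's tangent-line angle to the center
    screen's tangent line, requiring it to fall within [lo, hi] degrees."""
    cx, cy = center_dir
    for ax, ay in aux_dirs:
        dot = cx * ax + cy * ay
        norm = math.hypot(cx, cy) * math.hypot(ax, ay)
        # Clamp guards against rounding just outside acos's domain.
        ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        if not (lo <= ang <= hi):
            return False
    return True
```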

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • H04N 13/282 - Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
  • H04N 13/366 - Image reproducers using viewer tracking

38.

CHARACTERIZING CONTENT FOR AUDIO-VIDEO DUBBING AND OTHER TRANSFORMATIONS

      
Application Number 17233443
Status Pending
Filing Date 2021-04-17
First Publication Date 2021-11-11
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Duncan, Grant L.
  • Mckenzie, Elizabeth
  • Grethmann, Iris
  • Whiles, Jonathan

Abstract

A computer-implemented method for transforming audio-video data includes automatically detecting substantially all discrete human-perceivable messages encoded in the audio-video data, determining a semantic encoding for each of the detected messages, assigning a time code to each of the encodings correlated to specific frames of the audio-video data, and recording a data structure relating each time code to a corresponding one of the semantic encodings in a recording medium. The method may further include converting extracted recorded vocal instances from the audio-video data into a text data, generating a dubbing list comprising the text data and the time code, assigning a set of annotations corresponding to the one or more vocal instances specifying one or more creative intents, generating the scripting data comprising the dubbing list and the set of annotations, and other optional operations. An apparatus may be programmed to perform the method by executable instructions for the foregoing operations.

IPC Classes

  • H04N 21/81 - Monomedia components thereof
  • H04N 21/439 - Processing of audio elementary streams
  • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs
  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments
  • H04N 21/488 - Data services, e.g. news ticker

39.

ULTRASONIC MESSAGING IN MIXED REALITY

      
Application Number 17233437
Status Pending
Filing Date 2021-04-17
First Publication Date 2021-10-07
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nguyen, Ha
  • Chang-Ogimoto, Masatoshi

Abstract

A method for receiving and processing sonic messaging by a mixed reality (xR) device from acoustic output of a media player playing content for a two-dimensional (2D) screen or projection surface, to coordinate the rendering and playing of an xR output in the xR device with the playing of the 2D content at the media player.

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • H04B 11/00 - Transmission systems employing ultrasonic, sonic or infrasonic waves

40.

Automated videogame testing

      
Application Number 17169470
Grant Number 11561890
Status In Force
Filing Date 2021-02-07
First Publication Date 2021-09-09
Grant Date 2023-01-24
Owner Warner Bros. Entertainment Inc. (USA)
Inventor Jose, Eldo

Abstract

An automated testing framework and associated tools for executable applications such as games focus on integration testing, wherein users create data-driven tests by using test modules and configuration data as building blocks. The tools facilitate cooperation between coders and non-technical Quality Assurance (QA) staff in creating automated tests by simplifying the user interface for configuring tests. Components of the tools simulate user interactions with the application under test, for example, gamepad button presses. The tools are also capable of skipping portions of gameplay or other interactive activity and jumping directly into a desired game mode during automated testing, among other functions.

IPC Classes

41.

SYSTEMS AND METHODS FOR GENERATING MULTI-LANGUAGE MEDIA CONTENT WITH AUTOMATIC SELECTION OF MATCHING VOICES

      
Application Number 17196285
Status Pending
Filing Date 2021-03-09
First Publication Date 2021-09-09
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Malik, Aansh
  • Nguyen, Ha Thanh

Abstract

A method and system for automated voice casting compares candidate voice samples from candidate speakers in a target language with a primary voice sample from a primary speaker in a primary language. Utterances in the audio samples of the candidate speakers and the primary speaker are identified and typed, and voice samples are generated that meet applicable utterance type criteria. A neural network is used to generate an embedding for the voice samples. A voice sample can include groups of different utterance types; embeddings are generated for each utterance group in the voice sample and then combined in a weighted form, so that the resulting embedding emphasizes selected utterance types. Similarities between the embeddings for the candidate voice samples and the embedding for the primary voice sample are evaluated and used to select a candidate speaker who is a vocal match.
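The weighted combination of per-utterance-group embeddings and the similarity-based selection can be sketched as follows, using cosine similarity as an assumed metric; all function names and the weighting scheme are illustrative:

```python
import math

def combine_embeddings(group_embeddings, weights):
    """Weighted sum of per-utterance-group embeddings into one vector,
    so selected utterance types contribute more to the final embedding."""
    dim = len(next(iter(group_embeddings.values())))
    out = [0.0] * dim
    for group, emb in group_embeddings.items():
        w = weights.get(group, 1.0)  # unlisted groups get unit weight
        for i, v in enumerate(emb):
            out[i] += w * v
    return out

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def best_match(primary, candidates):
    """Pick the candidate whose embedding is most similar to the primary."""
    return max(candidates, key=lambda name: cosine(primary, candidates[name]))
```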

IPC Classes

  • G06F 40/47 - Machine-assisted translation, e.g. using translation memory
  • G06F 40/58 - Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/00 - Speech recognition

42.

SOCIAL AND PROCEDURAL EFFECTS FOR COMPUTER-GENERATED ENVIRONMENTS

      
Application Number 17189257
Status Pending
Filing Date 2021-03-01
First Publication Date 2021-08-19
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Gewickey, Gregory I.
  • Lake-Schaal, Gary
  • Mintus, Piotr
  • Ostrover, Lewis S.
  • Smith, Michael

Abstract

A processor provides a simulated three-dimensional (3D) environment for a game or virtual reality (VR) experience, including controlling a characteristic parameter of a 3D object or character based on at least one of: an asynchronous event in a second game, feedback from multiple synchronous users of the VR experience, or a function driven by one or more variables reflecting a current state of at least one of the 3D environment, the game, or the VR experience.

IPC Classes

  • A63F 13/525 - Changing parameters of virtual cameras
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04L 29/06 - Communication control; Communication processing characterised by a protocol
  • A63F 13/28 - Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
  • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level

43.

Nemesis characters, nemesis forts, social vendettas and followers in computer games

      
Application Number 17182239
Grant Number 11660540
Status In Force
Filing Date 2021-02-23
First Publication Date 2021-08-12
Grant Date 2023-05-30
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • De Plater, Michael
  • Hoge, Christopher Herman
  • Roberts, Robert Kenyon Hull
  • Valerius, Daniel Paul
  • Newton, Rocky Albert
  • Stephens, Kevin Leslie

Abstract

Methods for managing non-player characters and power centers in a computer game are based on character hierarchies and individualized correspondences between each character's traits or rank and events that involve other non-player characters or objects. Players may share power centers, character hierarchies, non-player characters, and related quests involving the shared objects with other players playing separate and unrelated game instances over a computer network, with the outcome of the quests reflected in the different games. Various configurations of game machines are used to implement the methods.

IPC Classes

  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/69 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame

44.

Cinematic mastering for virtual reality and augmented reality

      
Application Number 17176936
Grant Number 11451882
Status In Force
Filing Date 2021-02-16
First Publication Date 2021-08-05
Grant Date 2022-09-20
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Gewickey, Gregory I.
  • Smith, Michael
  • Ostrover, Lewis S.

Abstract

An entertainment system provides data to a common screen (e.g., cinema screen) and personal immersive reality devices. For example, a cinematic data distribution server communicates with multiple immersive output devices each configured for providing immersive output (e.g., a virtual reality output) based on a data signal. Each of the multiple immersive output devices is present within eyesight of a common display screen. The server configures the data signal based on digital cinematic master data that includes immersive reality data. The server transmits the data signal to the multiple immersive output devices contemporaneously with each other, and optionally contemporaneously with providing a coordinated audio-video signal for output via the common display screen and shared audio system.

IPC Classes

  • H04N 21/81 - Monomedia components thereof
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/214 - Specialised server platform, e.g. server located in an airplane, hotel or hospital
  • H04N 21/2225 - Local VOD servers
  • H04N 21/433 - Content storage operation, e.g. storage operation in response to a pause request or caching operations
  • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or inside the home

45.

Matching mouth shape and movement in digital video to alternative audio

      
Application Number 17102399
Grant Number 11436780
Status In Force
Filing Date 2020-11-23
First Publication Date 2021-05-20
Grant Date 2022-09-06
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Stratton, Tom David
  • Lile, Shaun

Abstract

A method for matching mouth shape and movement in digital video to alternative audio includes deriving a sequence of facial poses including mouth shapes for an actor from a source digital video. Each pose in the sequence of facial poses corresponds to a middle position of each audio sample. The method further includes generating an animated face mesh based on the sequence of facial poses and the source digital video, transferring tracked expressions from the animated face mesh or the target video to the source video, and generating a rough output video that includes transfers of the tracked expressions. The method further includes generating a finished video at least in part by refining the rough video using a parametric autoencoder trained on mouth shapes in the animated face mesh or the target video. One or more computers may perform the operations of the method.

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 13/20 - 3D [Three Dimensional] animation
  • G10L 13/047 - Architecture of speech synthesisers
  • G10L 15/16 - Speech classification or search using artificial neural networks

46.

Geometry matching in virtual reality and augmented reality

      
Application Number 17084509
Grant Number 11363349
Status In Force
Filing Date 2020-10-29
First Publication Date 2021-04-22
Grant Date 2022-06-14
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Smith, Michael

Abstract

Methods, apparatus and systems for geometric matching of virtual reality (VR) or augmented reality (AR) output contemporaneously with video output formatted for display on a 2D screen include determining value sets that, when used in image processing, cause the off-screen angular field of view of at least one of the AR output object or the VR output object to have a fixed relationship to at least one of the angular field of view of the onscreen object or of the 2D screen. The AR/VR output object is output to an AR/VR display device, and the user experience is improved by the geometric matching between objects observed on the AR/VR display device and corresponding objects appearing on the 2D screen.
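The geometric relationship the abstract describes can be illustrated with basic trigonometry. This is a minimal sketch under my own assumptions (function names and the midpoint-of-object geometry are hypothetical), not the patent's actual value-set computation:

```python
import math

def angular_size(extent, distance):
    """Angular field of view subtended by an object of linear `extent`
    viewed from `distance` away."""
    return 2.0 * math.atan2(extent / 2.0, distance)

def matched_ar_extent(screen_extent, screen_dist, ar_dist, ratio=1.0):
    """Size an off-screen AR/VR object so its angular FOV keeps a fixed
    `ratio` to the on-screen object's angular FOV."""
    target = ratio * angular_size(screen_extent, screen_dist)
    return 2.0 * ar_dist * math.tan(target / 2.0)
```

For example, a 1 m on-screen object seen from 2 m subtends the same angle as a 2 m AR object placed 4 m away.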

IPC Classes

  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 21/81 - Monomedia components thereof
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G06T 15/20 - Perspective computation
  • H04N 13/30 - Image reproducers

47.

Technical solutions for customized tours

      
Application Number 16589735
Grant Number 11532245
Status In Force
Filing Date 2019-10-01
First Publication Date 2021-04-01
Grant Date 2022-12-20
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Klappert, Walter
  • Ferguson, Daniel

Abstract

An apparatus and method for configuring a customized tour includes providing a list of tour subject indicators with a relevance value for locations to be toured, receiving user selection data regarding the subject indicators, and configuring a tour route based on an aggregate relevance score calculated for the tour subject indicators and locations indicated by one or more users. The method and apparatus may further include defining at least one tour route record comprising an ordered list of locations that satisfies at least a minimum aggregate relevance score constraint and a maximum tour duration constraint and saving information defining the at least one tour route record in a computer memory for use in delivering a corresponding tour.
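The route-configuration logic above (aggregate relevance score, minimum-score and maximum-duration constraints) can be sketched with a brute-force search. All names and the data layout are hypothetical illustrations, not the patented implementation:

```python
from itertools import permutations

def best_route(locations, selected_subjects, max_duration, min_score):
    """Pick the ordered list of locations maximizing aggregate relevance.

    locations: dict name -> (duration_minutes, {subject: relevance})
    selected_subjects: subject indicators chosen by the user
    Returns the best route (tuple of names), or None when no route meets
    both the minimum aggregate score and the maximum tour duration.
    """
    best, best_score = None, min_score - 1
    names = list(locations)
    for r in range(1, len(names) + 1):
        for route in permutations(names, r):
            duration = sum(locations[n][0] for n in route)
            if duration > max_duration:
                continue  # violates the maximum tour duration constraint
            score = sum(locations[n][1].get(s, 0)
                        for n in route for s in selected_subjects)
            if score > best_score:
                best, best_score = route, score
    return best
```

A real system would use heuristics rather than exhaustive permutations, but the constraint structure is the same.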

IPC Classes

  • G09B 29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagrams
  • G06Q 50/14 - Travel agencies
  • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
  • G06F 16/901 - Indexing; Data structures therefor; Storage structures
  • G06N 20/00 - Machine learning

48.

Generation and use of user-selected scenes playlist from distributed digital content

      
Application Number 16994133
Grant Number 11410704
Status In Force
Filing Date 2020-08-14
First Publication Date 2021-03-11
Grant Date 2022-08-09
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Lau, Kim
  • Frautschi, Jacob
  • Gasparri, Massimiliano
  • Lee, Randy
  • Harman, Patrick

Abstract

A digital content package includes first content comprising a video feature such as a motion picture or the like, and a user-selectable application configured to operate as follows. When activated using an icon on a menu screen, the application records an identifier for scenes (discrete portions) of the first content that are selected by a user to generate a playlist. The user may select the scenes by indicating a start and end of each scene. The application saves the playlist locally, then uploads to a server. Via a user account at the server, a user may publish the playlist to a user-created distribution list, webpage, or other electronic publication, and modify the playlist by deleting or reordering scenes.
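The playlist record described above (scenes marked by start/end, reorderable and deletable, serializable for upload) can be sketched as a small data structure. Class and method names are my own illustrative choices:

```python
class ScenePlaylist:
    """Minimal sketch of a user-selected scenes playlist."""

    def __init__(self, content_id):
        self.content_id = content_id
        self.scenes = []                      # list of (start_sec, end_sec)

    def add_scene(self, start, end):
        """User marks a scene by indicating its start and end."""
        if end <= start:
            raise ValueError("scene end must follow its start")
        self.scenes.append((start, end))

    def delete_scene(self, index):
        del self.scenes[index]

    def reorder(self, new_order):
        """Rearrange scenes according to a list of current indices."""
        self.scenes = [self.scenes[i] for i in new_order]

    def to_record(self):
        """Serializable record suitable for upload to a server."""
        return {"content": self.content_id, "scenes": list(self.scenes)}
```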

IPC Classes

  • G11B 27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 21/2743 - Video hosting of uploaded data from client
  • H04N 21/426 - Internal components of the client
  • H04N 21/432 - Content retrieval operation from a local storage medium, e.g. hard-disk
  • H04N 21/482 - End-user interface for program selection
  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk

49.

Methods for controlling scene, camera and viewing parameters for altering perception of 3D imagery

      
Application Number 17021499
Grant Number 11477430
Status In Force
Filing Date 2020-09-15
First Publication Date 2021-03-11
Grant Date 2022-10-18
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nolan, Christopher E.
  • Collar, Bradley T.
  • Smith, Michael D.

Abstract

Mathematical relationships between the scene geometry, camera parameters, and viewing environment are used to control stereography to obtain various results influencing the viewer's perception of 3D imagery. The methods may include setting a horizontal shift, convergence distance, and camera interaxial parameter to achieve various effects. The methods may be implemented in a computer-implemented tool for interactively modifying scene parameters during a 2D-to-3D conversion process, which may then trigger the re-rendering of the 3D content on the fly.
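The interaction of interaxial separation, convergence distance, and horizontal shift can be illustrated with the standard parallel-camera parallax relation. This is textbook stereography geometry, not the patent's specific mathematics:

```python
def screen_parallax(interaxial, focal_mm, conv_dist, depth, sensor_mm, screen_width):
    """Illustrative parallel-camera stereo parallax.

    A horizontal image shift converges the cameras at `conv_dist`;
    points at that distance land at zero parallax (on the screen plane),
    nearer points at negative parallax (in front of the screen), and
    farther points at positive parallax (behind it).
    """
    # parallax on the camera sensor, in mm
    sensor_parallax = focal_mm * interaxial * (1.0 / conv_dist - 1.0 / depth)
    # scale from sensor width to screen width (same units as screen_width)
    return sensor_parallax * (screen_width / sensor_mm)
```

Varying `conv_dist` or `interaxial` in such a relation is how a stereography tool shifts the perceived depth of the whole scene.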

IPC Classes

  • H04N 19/70 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
  • H04N 19/119 - Adaptive subdivision aspects e.g. subdivision of a picture into rectangular or non-rectangular coding blocks
  • H04N 19/124 - Quantisation
  • H04N 19/52 - Processing of motion vectors by encoding by predictive encoding
  • H04N 13/189 - Recording image signals; Reproducing recorded image signals
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
  • H04N 13/204 - Image signal generators using stereoscopic image cameras
  • H04N 13/128 - Adjusting depth or disparity

50.

Systems and methods to maintain user privacy while providing recommendations

      
Application Number 16919056
Grant Number 11907401
Status In Force
Filing Date 2020-07-01
First Publication Date 2021-01-14
Grant Date 2024-02-20
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Rich, Sarah J.
  • Recht, Benjamin

Abstract

A systematic method of introducing obfuscating “organic” noise to a user's content engagement history leverages a recommender system by creating a public history on a client device which is a superset of the user's true engagement history. The method builds up the superset history over time through a client's interaction with the recommender system by simulating organic growth in a user's actual engagement history. The organic superset prevents an adversary with access to the underlying recommendation model from readily distinguishing between signal and noise in a user's query and obfuscates the user's engagement history with the recommender system.
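The superset-history idea above can be sketched as follows. The `recommend` callable is a hypothetical stand-in for the recommender system, and the growth policy is my own simplification of "simulating organic growth":

```python
import random

def grow_public_history(true_history, public_history, recommend,
                        noise_per_item=1, rng=None):
    """Grow an obfuscating public history that is a superset of the true one.

    Each new true engagement is recorded verbatim; alongside it,
    `noise_per_item` decoy items drawn from the recommender's own
    suggestions are mixed in, so the noise looks organic and an
    adversary cannot readily separate signal from noise.
    """
    rng = rng or random.Random()
    public = list(public_history)
    for item in true_history:
        if item not in public:
            public.append(item)              # superset property preserved
        candidates = [c for c in recommend(public) if c not in public]
        for decoy in rng.sample(candidates, min(len(candidates), noise_per_item)):
            public.append(decoy)             # organic-looking noise
    return public
```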

IPC Classes

  • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
  • G06F 16/9535 - Search customisation based on user profiles and personalisation

51.

CONTENT GENERATION AND CONTROL USING SENSOR DATA FOR DETECTION OF NEUROLOGICAL STATE

      
Application Number 16923053
Status Pending
Filing Date 2020-07-07
First Publication Date 2020-12-31
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Chappell, III, Arvel A.
  • Ostrover, Lewis S.
  • Mack, Harold C.

Abstract

Applications for a Content Engagement Power (CEP) value include generating a script, text, or other communication content, controlling playout of communication content based on neuro-physiological data, gathering neuro-physiological data correlated to consumption of communication content, and rating the effectiveness of personal communications. The CEP is computed based on neuro-physiological sensor data processed to express engagement with content along multiple dimensions such as valence, arousal, and dominance. An apparatus is configured to perform the method using hardware, firmware, and/or software.

IPC Classes

  • A61B 5/16 - Devices for psychotechnics; Testing reaction times
  • A61B 5/04 - Measuring bioelectric signals of the body or parts thereof
  • A63F 13/352 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers - Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
  • G10L 15/04 - Segmentation; Word boundary detection

52.

SOCIAL INTERACTIVE APPLICATIONS USING BIOMETRIC SENSOR DATA FOR DETECTION OF NEURO-PHYSIOLOGICAL STATE

      
Application Number 16923033
Status Pending
Filing Date 2020-07-07
First Publication Date 2020-12-31
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Chappell, III, Arvel A.
  • Ostrover, Lewis S.
  • Lake-Schaal, Gary

Abstract

Applications for a Composite Neuro-physiological State (CNS) value include using the value as an indicator of participant emotional state in computer games and other social interaction applications. The CNS is computed based on biometric sensor data processed to express player engagement with content, game play, and other participants along multiple dimensions such as valence, arousal, and dominance. An apparatus is configured to perform the method using hardware, firmware, and/or software.

IPC Classes

  • A61B 5/16 - Devices for psychotechnics; Testing reaction times
  • G06Q 50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
  • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for individual health risk assessment
  • A61B 5/0476 - Electroencephalography
  • A61B 5/0488 - Electromyography
  • A61B 5/00 - Measuring for diagnostic purposes ; Identification of persons
  • A61B 5/0402 - Electrocardiography, i.e. ECG
  • A61B 5/053 - Measuring electrical impedance or conductance of a portion of the body
  • A61B 5/055 - Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
  • A61B 5/04 - Measuring bioelectric signals of the body or parts thereof
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

53.

Transforming audio content for subjective fidelity

      
Application Number 16916105
Grant Number 11075609
Status In Force
Filing Date 2020-06-29
First Publication Date 2020-12-24
Grant Date 2021-07-27
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Antonellis, Darcy
  • Ostrover, Lewis

Abstract

A method or apparatus for delivering audio programming such as music to listeners may include identifying, capturing and applying a listener's audiometric profile to transform audio content so that the listener hears the content similarly to how the content was originally heard by a creative producer of the content. An audio testing tool may be implemented as a software application to identify and capture the listener's audiometric profile. A signal processor may operate an algorithm used for processing source audio content, obtaining an identity and an audiometric reference profile of the creative producer from metadata associated with the content. The signal processor may then provide audio output based on a difference between the listener's and creative producer's audiometric profiles.
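The profile-difference idea above can be sketched as per-band gain compensation. The profile format (band center in Hz mapped to hearing threshold in dB) and function names are assumptions for illustration:

```python
def compensation_gains(listener_profile, producer_profile):
    """Per-band gain (dB) derived from the difference between the
    listener's and the creative producer's audiometric profiles: boost
    bands where the listener's hearing threshold exceeds the producer's."""
    return {band: listener_profile[band] - producer_profile.get(band, 0.0)
            for band in listener_profile}

def apply_gains(band_levels, gains):
    """Apply the per-band compensation to content band levels (dB)."""
    return {band: level + gains.get(band, 0.0)
            for band, level in band_levels.items()}
```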

IPC Classes

  • H03G 5/00 - Tone control or bandwidth control in amplifiers
  • H03G 5/16 - Automatic control
  • H03G 9/00 - Combinations of two or more types of control, e.g. gain control and tone control

54.

VIDEO CONVERSION TECHNOLOGY

      
Application Number 16902252
Status Pending
Filing Date 2020-06-15
First Publication Date 2020-12-10
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Butterworth, William Evans
  • Dwinell, Roger Raymond
  • Mcmahon, Valerie

Abstract

Video conversion technology, in which a first stream of video content is accessed and multiple, different layers are extracted from the first stream of the video content. Each of the multiple, different layers is separately processed to convert the multiple, different layers into modified layers that each have a higher resolution. The modified layers are reassembled into a second stream of the video content that has a higher resolution than the first stream of the video content.
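The extract/upscale-per-layer/reassemble flow can be sketched on toy data. Here the "layers" are color channels and the scaler is nearest-neighbour; the real system's layer decomposition and scaler are unspecified, so everything below is illustrative:

```python
def upscale2x(layer):
    """Nearest-neighbour 2x upscale of one 2D layer (a stand-in for a
    real per-layer scaler)."""
    out = []
    for row in layer:
        wide = [v for v in row for _ in (0, 1)]   # double each column
        out.append(wide)
        out.append(list(wide))                    # double each row
    return out

def convert_frame(frame):
    """Split a frame of (r, g, b) pixels into per-channel layers,
    process each layer separately, then reassemble a higher-resolution
    frame from the modified layers."""
    layers = [[[px[c] for px in row] for row in frame] for c in range(3)]
    up = [upscale2x(layer) for layer in layers]
    h, w = len(up[0]), len(up[0][0])
    return [[tuple(up[c][y][x] for c in range(3)) for x in range(w)]
            for y in range(h)]
```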

IPC Classes

  • H04N 7/01 - Conversion of standards
  • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • G09G 5/02 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
  • G09G 5/14 - Display of multiple viewports

55.

Portal device and cooperating video game machine

      
Application Number 16813725
Grant Number 11478695
Status In Force
Filing Date 2020-03-09
First Publication Date 2020-12-10
Grant Date 2022-10-25
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Burton, Jon

Abstract

A portal device for a video game includes a pad with different zones that can be illuminated by selectable colors, a toy sensor (e.g., an RFID tag sensor) associated with each zone, a controller and a communications port for communicating with a video game process executing on a game machine. The colors of each zone can be configured to one or a combination of three primary colors during game play, based on the game process. The portal device reacts to placement of tagged toys on zones and the color of the zones during play and provides sensor data to the game process. The game process controls the game state in part based on data from the portal device and in part on other user input.

IPC Classes

  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • G07F 17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
  • A63F 13/822 - Strategy games; Role-playing games 

56.

Flexible computer gaming based on machine learning

      
Application Number 16846268
Grant Number 11826653
Status In Force
Filing Date 2020-04-10
First Publication Date 2020-12-10
Grant Date 2023-11-28
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Lake-Schaal, Gary
  • Ostrover, Lewis S.
  • Huard, Matthew
  • Husein, Adam

Abstract

A game modification engine modifies configuration settings affecting game play and the user experience in computer games after initial publication of the game, based on device level and game play data associated with a user or cohort of users and on machine-learned relationships between input data and a use metric for the game. The modification is selected to improve performance of the game as measured by the use metric. The modification may be tailored for a user cohort. The game modification engine may define the cohort automatically based on correlations discovered in the input data relative to a defined use metric.

IPC Classes

  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
  • G06N 20/00 - Machine learning
  • G06N 5/04 - Inference or reasoning models
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

57.

Simple nonautonomous peering media clone detection

      
Application Number 16989668
Grant Number 11868170
Status In Force
Filing Date 2020-08-10
First Publication Date 2020-11-26
Grant Date 2024-01-09
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Marking, Aaron
  • Goeller, Kenneth
  • Lotspiech, Jeffrey Bruce

Abstract

A playback device includes a port configured to receive content from an external memory device, a device memory residing in the device, and a controller programmed to execute instructions that cause the controller to read a read data pattern from the defined region in the external memory device and determine if the read data pattern correlates to an expected data pattern to a predetermined level, wherein the expected data pattern is derived at least in part from a defect map of the defined region.
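The correlation test described above can be sketched as a simple match-rate check. The threshold and bit-pattern representation are assumptions; the patent leaves the "predetermined level" and defect-map derivation unspecified:

```python
def pattern_correlation(read_bits, expected_bits):
    """Fraction of positions where the pattern read from the defined
    region matches the expected pattern derived from the defect map."""
    matches = sum(r == e for r, e in zip(read_bits, expected_bits))
    return matches / len(expected_bits)

def is_authentic(read_bits, expected_bits, threshold=0.9):
    """A clone copied onto defect-free media reproduces the logical
    content but not the physical defect signature, so its correlation
    with the expected pattern falls below the predetermined level."""
    return pattern_correlation(read_bits, expected_bits) >= threshold
```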

IPC Classes

  • G06F 21/44 - Program or device authentication
  • H04L 9/40 - Network security protocols
  • H04K 1/00 - Secret communication
  • H04L 67/06 - Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
  • H04L 67/1074 - Peer-to-peer [P2P] networks for supporting data block transmission mechanisms
  • G06F 21/10 - Protecting distributed programs or content, e.g. vending or licensing of copyrighted material
  • H04L 67/104 - Peer-to-peer [P2P] networks

58.

Sensitivity assessment for media production using artificial intelligence

      
Application Number 16536229
Grant Number 11140446
Status In Force
Filing Date 2019-08-08
First Publication Date 2020-11-19
Grant Date 2021-10-05
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Pau, Hitesh
  • Murillo, Geoffrey P.
  • Lund, Rajiv R.
  • Kumar, Anshul
  • Mehta, Tasha T.
  • Bringas, Alejandro
  • Kaur, Tarundeep
  • Tanita, Ty Y.

Abstract

Automatic flagging of sensitive portions of a digital dataset for media production includes receiving the digital dataset comprising at least one of audio data, video data, or audio-video data for producing at least one media program. A processor identifies sensitive portions of the digital dataset likely to be in one or more defined content classifications, based at least in part on comparing unclassified portions of the digital dataset with classified portions of prior media production using an algorithm, and generates a plurality of sensitivity tags, each signifying a sensitivity assessment for a corresponding one of the sensitive portions. The processor may save the plurality of sensitivity tags, each correlated to its corresponding one of the sensitive portions, in a computer memory for use by a media production or localization team.
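The compare-against-classified-portions step can be sketched as a nearest-match tagger. The similarity function, feature representation, and threshold are hypothetical; the patent does not specify the algorithm:

```python
def tag_sensitive_segments(segments, classified, similarity, threshold=0.8):
    """Flag segments whose similarity to any previously classified
    sensitive portion reaches `threshold`, emitting a sensitivity tag
    per flagged segment.

    segments: dict segment_id -> feature
    classified: list of (feature, classification) pairs from prior
    media productions.
    """
    tags = {}
    for seg_id, feature in segments.items():
        score, label = max(
            ((similarity(feature, f), lbl) for f, lbl in classified),
            default=(0.0, None))
        if score >= threshold:
            tags[seg_id] = {"classification": label, "score": score}
    return tags
```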

IPC Classes

  • H04N 21/4545 - Input to filtering algorithms, e.g. filtering a region of the image
  • G06N 20/00 - Machine learning
  • H04N 21/431 - Generation of visual interfaces; Content or additional data rendering
  • H04N 21/466 - Learning process for intelligent management, e.g. learning user preferences for recommending movies

59.

SCROLLING INTERFACE CONTROL FOR COMPUTER DISPLAY

      
Application Number 16869571
Status Pending
Filing Date 2020-05-07
First Publication Date 2020-10-29
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Tandon, Prem V.
  • Kamphausen, Jr., Robert Joseph

Abstract

Methods and apparatus control scrolling of graphical content arranged in a sequence of panels by a computer responsive to a user input device. A method includes sensing, by the computer, a direction and length of continuous cursor movement along a first axis of the user input device. The method further includes progressing display of the graphical content through the sequence of panels based on the direction and length of the continuous cursor movement. The user input device may be a touchscreen or a touchpad and the computer may determine the cursor location based on a one-finger touch registered by the user input device. The method enables the user to scroll through panels of graphical content using a reduced number of finger taps and smaller movements, reducing hand and finger fatigue.
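The drag-to-panel mapping above can be sketched in a few lines. The pixels-per-panel step and function name are my own illustrative choices:

```python
def panel_from_drag(start_panel, drag_px, panel_step_px, n_panels):
    """Map one continuous one-finger drag along the scroll axis to a
    panel index.

    Every `panel_step_px` pixels of cursor movement advances one panel;
    negative movement goes backward. The result is clamped to the
    sequence bounds, so a single long drag traverses many panels
    without repeated finger taps.
    """
    advance = int(drag_px // panel_step_px)
    return max(0, min(n_panels - 1, start_panel + advance))
```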

IPC Classes

60.

Consumer intelligence for automatic real time message decisions and selection

      
Application Number 16824692
Grant Number 11681933
Status In Force
Filing Date 2020-03-19
First Publication Date 2020-10-08
Grant Date 2023-06-20
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Herz, Justin
  • Kursar, Brian
  • Camoosa, Keith
  • Gewickey, Gregory
  • Ostrover, Lewis
  • Husein, Adam

Abstract

Methods and apparatus for improving automatic selection and timing of messages by a machine or system of machines include an inductive computational process driven by log-level network data from mobile devices and other network-connected devices, optionally in addition to traditional application-level data from cookies or the like. The methods and apparatus may be used, for example, to improve or optimize effectiveness of automatically-generated electronic communications with consumers and potential consumers for achieving a specified target.

IPC Classes

  • G06N 5/04 - Inference or reasoning models
  • G06F 16/28 - Databases characterised by their database models, e.g. relational or object models
  • G06N 20/00 - Machine learning
  • G06Q 20/20 - Point-of-sale [POS] network systems
  • G06Q 30/0201 - Market modelling; Market analysis; Collecting market data
  • G06Q 30/0203 - Market surveys; Market polls
  • G06Q 30/0242 - Determining effectiveness of advertisements
  • G06Q 30/0251 - Targeted advertisements
  • H04L 51/046 - Interoperability with other network applications or services
  • G06Q 30/0202 - Market predictions or forecasting for commercial activities

61.

Digitally representing user engagement with directed content based on biometric sensor data

      
Application Number 16833504
Grant Number 11343596
Status In Force
Filing Date 2020-03-27
First Publication Date 2020-09-17
Grant Date 2022-05-24
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Chappell, III, Arvel A.
  • Ostrover, Lewis S.

Abstract

A computer-implemented method for obtaining a digital representation of user engagement with audio-video content includes playing digital data comprising audio-video content by an output device that outputs audio-video output based on the digital data and receiving sensor data from at least one sensor positioned to sense an involuntary response of one or more users while engaged with the audio-video output. The method further includes determining a Content Engagement Power (CEP) value, based on the sensor data and recording the digital representation of CEP in a computer memory. An apparatus is configured to perform the method using hardware, firmware, and/or software.
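One simple way to condense multi-dimensional sensor readings into a single engagement value is mean deviation from a resting baseline. The actual CEP computation is not disclosed in the abstract, so this is only an illustrative stand-in:

```python
import math

def content_engagement_power(samples, baseline):
    """Illustrative aggregate of sensor-derived state over a playback.

    samples: list of (valence, arousal, dominance) tuples sensed while
    the user is engaged with the audio-video output.
    baseline: the user's resting (valence, arousal, dominance).
    Returns the mean Euclidean deviation from baseline.
    """
    def deviation(s):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(s, baseline)))
    return sum(deviation(s) for s in samples) / len(samples)
```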

IPC Classes

  • H04N 21/8545 - Content authoring for generating interactive applications
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/8541 - Content authoring involving branching, e.g. to different story endings
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/466 - Learning process for intelligent management, e.g. learning user preferences for recommending movies
  • A61B 5/16 - Devices for psychotechnics; Testing reaction times
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
  • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G09B 19/00 - Teaching not covered by other main groups of this subclass

62.

Production and control of cinematic content responsive to user emotional state

      
Application Number 16833492
Grant Number 11303976
Status In Force
Filing Date 2020-03-27
First Publication Date 2020-09-17
Grant Date 2022-04-12
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Chappell, III, Arvel A.
  • Ostrover, Lewis S.

Abstract

A computer-implemented method for providing cinematic content to a user via a computer-controlled media player includes accessing by a processor a content package including a targeted emotional arc and a collection of digital objects each associated with codes indicating an emotional profile of each digital object, playing digital objects selected from the content package, thereby outputting an audio-video signal for display by an output device; receiving sensor data from at least one sensor positioned to sense a biometric feature of a user watching the output device; determining a value of one or more emotional state variables, based on the sensor data; and selecting the digital objects for the playing based on the one or more emotional state variables, a recent value of the targeted emotional arc, and the one or more codes indicating an emotional profile. An apparatus is configured to perform the method using hardware, firmware, and/or software.
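One way to picture the selection step (a sketch under assumptions: the scalar emotional-profile codes, the field names, and the distance-minimizing rule are all invented for illustration) is to pick the object whose profile, applied to the measured state, lands closest to the arc's current target:

```python
def select_object(objects, measured_state, arc_target):
    """Pick the digital object whose emotional-profile code, applied to
    the user's measured emotional state, best closes the gap to the
    targeted emotional arc's current value."""
    return min(objects,
               key=lambda o: abs((measured_state + o["profile"]) - arc_target))

# Hypothetical library: profile codes shift the emotional state
# down (calming) or up (intensifying).
library = [
    {"id": "calm_scene", "profile": -0.3},
    {"id": "tense_scene", "profile": +0.4},
]
# User measured below the arc target, so the intensifying object wins.
choice = select_object(library, measured_state=0.2, arc_target=0.6)
```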

IPC Classes

  • H04H 60/33 - Arrangements for monitoring the users' behaviour or opinions
  • H04H 60/56 - Arrangements characterised by components specially adapted for monitoring, identification or recognition covered by groups H04H 60/29 - H04H 60/54
  • H04N 7/10 - Adaptations for transmission by electrical cable
  • H04N 7/025 - Systems for transmission of digital non-picture data, e.g. of text during the active part of a television frame
  • H04N 21/8545 - Content authoring for generating interactive applications
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/8541 - Content authoring involving branching, e.g. to different story endings
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/466 - Learning process for intelligent management, e.g. learning user preferences for recommending movies
  • A61B 5/16 - Devices for psychotechnics; Testing reaction times
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
  • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G09B 19/00 - Teaching not covered by other main groups of this subclass

63.

Digital audio-video content mobile library

      
Application Number 16779450
Grant Number 11190822
Status In Force
Filing Date 2020-01-31
First Publication Date 2020-08-06
Grant Date 2021-11-30
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Herz, Justin
  • Kozan, Kevin
  • Mahmoud, Essam
  • Levicki, Alan

Abstract

Methods for managing digital content include authenticating a user account identifier from a client device over a computer network, registering a telephone number for at least one wireless mobile device in a registry identified with the user account based on the authenticating, as a pre-authorized identifier for accessing digital content licensed for use with the client device. The methods include maintaining a library of digital content identified with the user account for access by the at least one wireless mobile device, and initiating streaming of the digital video content to the at least one wireless mobile device without requiring user authentication from the at least one wireless mobile device, based on the registering of the telephone number as the pre-authorized identifier. An apparatus for performing the method comprises a processor coupled to a memory, the memory holding instructions for performing steps of the method as summarized above.

IPC Classes

  • H04N 21/258 - Client or end-user data management, e.g. managing client capabilities, user preferences or demographics or processing of multiple end-users preferences to derive collaborative data
  • H04N 21/254 - Management at additional data server, e.g. shopping server or rights management server
  • H04N 21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
  • H04N 21/4627 - Rights management
  • H04N 21/488 - Data services, e.g. news ticker
  • H04N 21/8355 - Generation of protective data, e.g. certificates involving usage data, e.g. number of copies or viewings allowed
  • H04N 21/8358 - Generation of protective data, e.g. certificates involving watermark
  • H04N 21/84 - Generation or processing of descriptive data, e.g. content descriptors
  • H04N 21/61 - Network physical structure; Signal processing
  • H04N 21/81 - Monomedia components thereof

64.

Region-of-interest encoding enhancements for variable-bitrate compression

      
Application Number 16714726
Grant Number 11076172
Status In Force
Filing Date 2019-12-14
First Publication Date 2020-07-16
Grant Date 2021-07-27
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Smith, Michael
  • Collar, Bradley

Abstract

A specification defining allowable luma and chroma code-values is applied in a region-of-interest encoding method of a mezzanine compression process. The method may include analyzing an input image to determine regions or areas within each image frame that contain code-values that are near allowable limits as specified by the specification. In addition, the region-of-interest method may comprise then compressing those regions with higher precision than the other regions of the image that do not have code-values that are close to the legal limits.
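The region-analysis step can be sketched as a block scan that flags regions holding code values near the allowable limits. The limits below assume the common 8-bit legal video range (16-235 for luma); the margin, block size, and function name are illustrative choices, not values from the patent.

```python
def near_limit_blocks(frame, lo=16, hi=235, margin=4, block=2):
    """Return (row, col) block indices of regions holding any code value
    within `margin` of the allowable limits; these regions would be
    compressed with higher precision than the rest of the image."""
    flagged = []
    for r in range(0, len(frame), block):
        for c in range(0, len(frame[0]), block):
            vals = [frame[i][j]
                    for i in range(r, min(r + block, len(frame)))
                    for j in range(c, min(c + block, len(frame[0])))]
            if any(v <= lo + margin or v >= hi - margin for v in vals):
                flagged.append((r // block, c // block))
    return flagged

# Tiny frame: top-left value near the low limit, bottom-right near the high.
frame = [[18, 128], [128, 234]]
```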

IPC Classes

  • H04N 19/136 - Incoming video signal characteristics or properties
  • H04N 19/64 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using sub-band based transform, e.g. wavelets characterised by ordering of coefficients or of bits for transmission
  • H04N 19/147 - Data rate or code amount at the encoder output according to rate distortion criteria
  • H04N 19/14 - Coding unit complexity, e.g. amount of activity or edge presence estimation
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
  • H04N 19/154 - Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
  • H04N 19/17 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
  • H04N 19/12 - Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
  • H04N 19/176 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
  • H04N 19/182 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel

65.

Cinematic mastering for virtual reality and augmented reality

      
Application Number 16712820
Grant Number 10924817
Status In Force
Filing Date 2019-12-12
First Publication Date 2020-07-16
Grant Date 2021-02-16
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Defaria, Christopher
  • Gewickey, Gregory I.
  • Smith, Michael
  • Ostrover, Lewis S.

Abstract

An entertainment system provides data to a common screen (e.g., cinema screen) and personal immersive reality devices. For example, a cinematic data distribution server communicates with multiple immersive output devices each configured for providing immersive output (e.g., a virtual reality output) based on a data signal. Each of the multiple immersive output devices is present within eyesight of a common display screen. The server configures the data signal based on digital cinematic master data that includes immersive reality data. The server transmits the data signal to the multiple immersive output devices contemporaneously with each other, and optionally contemporaneously with providing a coordinated audio-video signal for output via the common display screen and shared audio system.

IPC Classes

  • H04N 21/81 - Monomedia components thereof
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/214 - Specialised server platform, e.g. server located in an airplane, hotel or hospital
  • H04N 21/2225 - Local VOD servers
  • H04N 21/433 - Content storage operation, e.g. storage operation in response to a pause request or caching operations
  • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or inside the home

66.

Method and system for reducing drop-outs during video stream playback

      
Application Number 16214921
Grant Number 10779017
Status In Force
Filing Date 2018-12-10
First Publication Date 2020-06-11
Grant Date 2020-09-15
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Stratton, Tom
  • Lile, Shaun

Abstract

An improved video streaming method and system reduces video stream playback dropouts when video data flow from a server is degraded or interrupted. Two adaptive bitrate video streams of the content are operated in parallel, with video quality that can be adjusted independently. The first stream receives data blocks at a rate generally concurrent with the video playback. The second stream receives data blocks in advance of the playback point, and these blocks are buffered. The best available block is used during playback. Buffered second-stream blocks can be used to continue playback when the network connection is compromised or interrupted.
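The "best available block" decision can be sketched as follows (the dict shapes, the bitrate tiebreak, and the function name are assumptions made for illustration):

```python
def pick_block(index, live_blocks, buffered_blocks):
    """Choose the block to play at `index`: prefer whichever of the
    just-in-time (first-stream) and pre-fetched (second-stream) copies
    is available, taking the higher-quality one when both are."""
    live = live_blocks.get(index)
    buffered = buffered_blocks.get(index)
    if live and buffered:
        return max(live, buffered, key=lambda b: b["bitrate"])
    # Fall back to the buffered copy when the live stream is interrupted.
    return live or buffered  # None only if playback must stall

# The live stream has only the current block; the buffer runs ahead.
live = {0: {"bitrate": 4000, "src": "live"}}
buffered = {0: {"bitrate": 1500, "src": "buffer"},
            1: {"bitrate": 1500, "src": "buffer"}}
```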

IPC Classes

  • G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
  • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs
  • H04N 21/2381 - Adapting the multiplex stream to a specific network, e.g. an IP [Internet Protocol] network
  • H04L 29/06 - Communication control; Communication processing characterised by a protocol
  • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs

67.

Transforming audio content for subjective fidelity

      
Application Number 16453985
Grant Number 10700657
Status In Force
Filing Date 2019-06-26
First Publication Date 2020-05-21
Grant Date 2020-06-30
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Antonellis, Darcy
  • Ostrover, Lewis

Abstract

A method or apparatus for delivering audio programming such as music to listeners may include identifying, capturing and applying a listener's audiometric profile to transform audio content so that the listener hears the content similarly to how the content was originally heard by a creative producer of the content. An audio testing tool may be implemented as a software application to identify and capture the listener's audiometric profile. A signal processor may operate an algorithm used for processing source audio content, obtaining an identity and an audiometric reference profile of the creative producer from metadata associated with the content. The signal processor may then provide audio output based on a difference between the listener's and creative producer's audiometric profiles.
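The "difference between profiles" step can be sketched as a per-band correction (the band layout, dB-HL threshold representation, and function name are hypothetical; the patent's actual transform is not disclosed in the abstract):

```python
def band_gains(listener_profile, producer_profile):
    """Per-band correction in dB: boost bands the listener hears worse
    than the creative producer did, cut bands the listener hears better."""
    return {band: listener_profile[band] - producer_profile[band]
            for band in producer_profile}

# Hearing thresholds in dB HL per octave band (hypothetical values);
# higher threshold = worse hearing in that band.
producer = {"250Hz": 5, "1kHz": 5, "4kHz": 10}
listener = {"250Hz": 5, "1kHz": 15, "4kHz": 30}
gains = band_gains(listener, producer)
```

A downstream equalizer would then apply `gains` so the listener's perceived spectrum approximates what the producer heard.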

IPC Classes

  • H03G 5/00 - Tone control or bandwidth control in amplifiers
  • H03G 9/00 - Combinations of two or more types of control, e.g. gain control and tone control
  • H03G 5/16 - Automatic control

68.

Rendering extended video in virtual reality

      
Application Number 16530899
Grant Number 11070781
Status In Force
Filing Date 2019-08-02
First Publication Date 2020-05-21
Grant Date 2021-07-20
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Zink, Michael
  • Cardenas, Mercedes Christine

Abstract

A method for transforming extended video data for display in virtual reality processes digital extended video data for display on a center screen and two auxiliary screens of a real extended video cinema. The method includes accessing, by a computer executing a rendering application, data that defines virtual screens including a center screen and auxiliary screens, wherein tangent lines to each of the auxiliary screens at their respective centers of area intersect with a tangent line to the center screen at its center of area at equal angles in a range of 75 to 105 degrees. The method includes preparing virtual extended video data at least in part by rendering the digital extended video on corresponding ones of the virtual screens; and saving the virtual extended video data in a computer memory. A corresponding playback method and apparatus display the processed data in virtual reality.
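The 75-105 degree constraint on the tangent-line angle can be made concrete with a small geometry helper. This is one way to realize the claimed layout, with a hypothetical function name; only the angle range comes from the abstract.

```python
def side_screen_yaws(angle_deg):
    """Yaw (degrees) of the two auxiliary screen planes relative to the
    center screen plane, given the angle at which each auxiliary screen's
    tangent line meets the center screen's tangent line.
    The claim restricts that angle to the range 75-105 degrees."""
    if not 75 <= angle_deg <= 105:
        raise ValueError("tangent angle must be between 75 and 105 degrees")
    yaw = 180.0 - angle_deg  # how far each side wing folds toward the viewer
    return (+yaw, -yaw)

# At 90 degrees the auxiliary screens stand perpendicular to the center screen.
left_yaw, right_yaw = side_screen_yaws(90)
```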

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • H04N 13/282 - Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
  • H04N 13/366 - Image reproducers using viewer tracking

69.

Wand

      
Application Number 29664977
Grant Number D0880621
Status In Force
Filing Date 2018-09-28
First Publication Date 2020-04-07
Grant Date 2020-04-07
Owner Warner Bros. Entertainment, Inc. (USA)
Inventor
  • Craig, Stuart
  • Sole, Molly
  • Pinnock, Anna
  • Grimble, Kate

70.

Immersive virtual reality production and playback for storytelling content

      
Application Number 16565347
Grant Number 11342000
Status In Force
Filing Date 2019-09-09
First Publication Date 2020-03-19
Grant Date 2022-05-24
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.
  • Smith, Michael
  • Zink, Michael

Abstract

Methods for digital content production and playback of an immersive stereographic video work provide or enhance interactivity of immersive entertainment using various different playback and production techniques. “Immersive stereographic” may refer to virtual reality, augmented reality, or both. The methods may be implemented using specialized equipment for immersive stereographic playback or production. Aspects of the methods may be encoded as instructions in a computer memory, executable by one or more processors of the equipment to perform the aspects.

IPC Classes

  • H04N 5/76 - Television signal recording
  • G11B 27/036 - Insert-editing
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor

71.

Adding motion effects to digital still images

      
Application Number 16447892
Grant Number 10867425
Status In Force
Filing Date 2019-06-20
First Publication Date 2019-12-12
Grant Date 2020-12-15
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Roache, Donald
  • Collar, Bradley
  • Ostrover, Lewis

Abstract

A digital still image is processed using motion-adding algorithms that are provided with an original still image and a set of motionizing parameters. Output of the motion-adding algorithms includes a motionized digital image suitable for display by any digital image display device. The motionized digital image may be used in place of a still image in any context where a still image would be used, for example, in an ebook, e-zine, digital graphic novel, website, picture or poster, or user interface.

IPC Classes

  • G06T 13/00 - Animation
  • G06T 13/80 - 2D animation, e.g. using sprites
  • G06F 16/74 - Browsing; Visualisation therefor
  • G06T 5/00 - Image enhancement or restoration
  • G06T 5/20 - Image enhancement or restoration by the use of local operators
  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction

72.

Social robot with environmental control feature

      
Application Number 16408403
Grant Number 11370125
Status In Force
Filing Date 2019-05-09
First Publication Date 2019-12-05
Grant Date 2022-06-28
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.

Abstract

A method and apparatus for controlling a social robot includes operating an electronic output device based on social interactions between the social robot and a user. The social robot utilizes an algorithm or other logical solution process to infer a user mental state, for example a mood or desire, based on observation of the social interaction. Based on the inferred mental state, the social robot causes an action of the electronic output device to be selected. Actions may include, for example, playing a selected video clip, brewing a cup of coffee, or adjusting window blinds.

IPC Classes

  • B25J 11/00 - Manipulators not otherwise provided for
  • B25J 13/00 - Controls for manipulators
  • G06N 3/00 - Computing arrangements based on biological models

73.

Geometry matching in virtual reality and augmented reality

      
Application Number 16179710
Grant Number 10827233
Status In Force
Filing Date 2018-11-02
First Publication Date 2019-11-07
Grant Date 2020-11-03
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Smith, Michael

Abstract

Methods, apparatus and systems for geometric matching of virtual reality (VR) or augmented reality (AR) output contemporaneously with video output formatted for display on a 2D screen include a determination of value sets that, when used in image processing, cause an off-screen angular field of view of at least one of the AR output object or the VR output object to have a fixed relationship to at least one of the angular field of view of the onscreen object or of the 2D screen. The AR/VR output object is outputted to an AR/VR display device and the user experience is improved by the geometric matching between objects observed on the AR/VR display device and corresponding objects appearing on the 2D screen.
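The fixed angular relationship reduces to standard viewing geometry. The sketch below (function names are invented; the formulas are ordinary trigonometry, not the patent's disclosed value sets) computes the angular size an on-screen object subtends and the width a virtual object must have to match it 1:1 at a different distance:

```python
import math

def onscreen_fov_deg(object_width, viewing_distance):
    """Angular field of view subtended by an on-screen object, the
    reference the off-screen AR/VR object must keep a fixed
    relationship to."""
    return math.degrees(2 * math.atan(object_width / (2 * viewing_distance)))

def matched_virtual_width(onscreen_width, screen_distance, virtual_distance):
    """Width a virtual object must have at `virtual_distance` so its
    angular size matches the on-screen object's 1:1."""
    fov = math.radians(onscreen_fov_deg(onscreen_width, screen_distance))
    return 2 * virtual_distance * math.tan(fov / 2)

# A 1 m wide on-screen object viewed from 2 m, matched by a virtual
# object rendered 4 m away: it must be twice as wide.
width = matched_virtual_width(1.0, 2.0, 4.0)
```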

IPC Classes

  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 21/81 - Monomedia components thereof
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G06T 15/20 - Perspective computation

74.

Methods for controlling scene, camera and viewing parameters for altering perception of 3D imagery

      
Application Number 16370067
Grant Number 10778955
Status In Force
Filing Date 2019-03-29
First Publication Date 2019-09-26
Grant Date 2020-09-15
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nolan, Christopher E.
  • Collar, Bradley T.
  • Smith, Michael D.

Abstract

Mathematical relationships between the scene geometry, camera parameters, and viewing environment are used to control stereography to obtain various results influencing the viewer's perception of 3D imagery. The methods may include setting a horizontal shift, convergence distance, and camera interaxial parameter to achieve various effects. The methods may be implemented in a computer-implemented tool for interactively modifying scene parameters during a 2D-to-3D conversion process, which may then trigger the re-rendering of the 3D content on the fly.
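The interplay of interaxial separation, convergence distance, and horizontal shift follows standard stereo geometry. The sketch below uses the textbook parallel-rig model (the function name and the specific formula are illustrative, not the patent's exact mathematics): the horizontal sensor shift is chosen so that disparity vanishes at the convergence distance.

```python
def disparity(focal_mm, interaxial_mm, depth_mm, convergence_mm):
    """Sensor-plane disparity for a parallel stereo camera rig whose
    horizontal shift puts zero disparity at `convergence_mm`.
    Positive values pop in front of the screen, negative recede behind."""
    return focal_mm * interaxial_mm * (1.0 / depth_mm - 1.0 / convergence_mm)

# 35 mm lens, 65 mm interaxial, convergence set at 2 m:
near = disparity(35.0, 65.0, 1000.0, 2000.0)   # object at 1 m -> in front
far = disparity(35.0, 65.0, 4000.0, 2000.0)    # object at 4 m -> behind
```

Sweeping convergence or interaxial in such a model is what an interactive 2D-to-3D tool would do before re-rendering on the fly.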

IPC Classes

  • H04N 13/189 - Recording image signals; Reproducing recorded image signals
  • H04N 13/204 - Image signal generators using stereoscopic image cameras
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

75.

Video conversion technology

      
Application Number 16397090
Grant Number 10687017
Status In Force
Filing Date 2019-04-29
First Publication Date 2019-08-15
Grant Date 2020-06-16
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Butterworth, William Evans
  • Dwinell, Roger Raymond
  • Mcmahon, Valerie

Abstract

Video conversion technology, in which a first stream of video content is accessed and multiple, different layers are extracted from the first stream of the video content. Each of the multiple, different layers are separately processed to convert the multiple, different layers into modified layers that each have a higher resolution. The modified layers are reassembled into a second stream of the video content that has a higher resolution than the first stream of the video content.
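The extract-process-reassemble flow can be sketched per frame as below. All of it is illustrative: layers are modeled as 2D grids with `None` marking transparency, the per-layer conversion is a stand-in nearest-neighbour upscale, and reassembly is a simple top-down composite.

```python
def upscale_layer(layer, factor=2):
    """Nearest-neighbour upscale of one layer (stand-in for whatever
    conversion each extracted layer actually receives)."""
    return [[px for px in row for _ in range(factor)]
            for row in layer for _ in range(factor)]

def convert_stream(layers, factor=2):
    """Process each extracted layer separately, then reassemble the
    higher-resolution layers into one output frame."""
    processed = [upscale_layer(layer, factor) for layer in layers]
    h, w = len(processed[0]), len(processed[0][0])
    out = [[None] * w for _ in range(h)]
    for layer in processed:          # later layers composite on top
        for r in range(h):
            for c in range(w):
                if layer[r][c] is not None:
                    out[r][c] = layer[r][c]
    return out

background = [[0, 0], [0, 0]]
foreground = [[None, 9], [None, None]]   # None = transparent
out = convert_stream([background, foreground])
```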

IPC Classes

  • H04N 7/01 - Conversion of standards
  • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • G09G 5/02 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
  • G09G 5/14 - Display of multiple viewports
  • H04N 19/85 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
  • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display

76.

Production and packaging of entertainment data for virtual reality

      
Application Number 16277918
Grant Number 10657727
Status In Force
Filing Date 2019-02-15
First Publication Date 2019-08-15
Grant Date 2020-05-19
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Mintus, Piotr
  • Lake-Schaal, Gary
  • Ostrover, Lewis

Abstract

An augmented reality (AR) output device or virtual reality (VR) output device is worn by a user and includes one or more sensors positioned to detect actions performed by a user of the immersive output device. A processor provides a data signal configured for the AR or VR output device, causing the immersive output device to provide AR output or VR output via a stereographic display device. The data signal encodes audio-video data. The processor controls a pace of scripted events defined by a narrative in the one of the AR output or the VR output, based on output from the one or more sensors indicating actions performed by a user of the AR or VR output device. The audio-video data may be packaged in a non-transitory computer-readable medium with additional content that is coordinated with the defined narrative and is configured for providing an alternative output, such as 2D video output or the stereoscopic 3D output.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/356 - Image reproducers having separate monoscopic and stereoscopic modes
  • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

77.

Method and apparatus for generating media presentation content with environmentally modified audio components

      
Application Number 16396280
Grant Number 10819969
Status In Force
Filing Date 2019-04-26
First Publication Date 2019-08-15
Grant Date 2020-10-27
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Ostrover, Lewis S.
  • Collar, Bradley Thomas

Abstract

An apparatus for generating a presentation from content having original audio and video components is described, wherein an environment detector is configured to output an environment-type signal indicating a detected particular environment. An acoustics memory is configured to output selected acoustic characteristics indicative of the environment identified by the environment-type signal. An audio processor receives the audio components and the acoustic characteristics and operates to modify the original audio components to produce modified audio components based on the selected acoustic characteristics. The presentation including the modified audio components is output.

IPC Classes

  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • H04N 5/76 - Television signal recording
  • H04N 9/802 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving processing of the sound signal
  • G10L 19/008 - Multichannel audio signal coding or decoding using interchannel correlation to reduce redundancy, e.g. joint-stereo, intensity-coding or matrixing
  • G11B 20/10 - Digital recording or reproducing
  • H04N 19/597 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding

78.

Control of social robot based on prior character portrayal

      
Application Number 16258492
Grant Number 11618170
Status In Force
Filing Date 2019-01-25
First Publication Date 2019-07-25
Grant Date 2023-04-04
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewecke, Thomas
  • Colf, Victoria L.
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.

Abstract

A method and apparatus for controlling a social robot includes providing a set of quantitative personality trait values, also called a “personality profile”, to a decision engine of the social robot. The personality profile is derived from a character portrayal in a fictional work, dramatic performance, or by a real-life person (any one of these sometimes referred to herein as a “source character”). The decision engine controls social responses of the social robot to environmental stimuli, based in part on the set of personality trait values. The social robot thereby behaves in a manner consistent with the personality profile for the profiled source character.
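A decision engine driven by quantitative trait values could look like the sketch below (the trait names, weighting scheme, and response format are all hypothetical; the abstract does not specify the engine's internals):

```python
def respond(stimulus, personality, responses):
    """Score each candidate response by how well its trait weights align
    with the robot's personality profile; the highest score wins."""
    def score(resp):
        return sum(personality.get(t, 0.0) * w
                   for t, w in resp["traits"].items())
    candidates = [r for r in responses if stimulus in r["stimuli"]]
    return max(candidates, key=score)

# Hypothetical trait vector derived from a source character's portrayal
profile = {"extraversion": 0.9, "agreeableness": 0.2}
options = [
    {"id": "greet_loudly", "stimuli": {"greeting"}, "traits": {"extraversion": 1.0}},
    {"id": "nod_politely", "stimuli": {"greeting"}, "traits": {"agreeableness": 1.0}},
]
choice = respond("greeting", profile, options)
```

An extraverted profile picks the louder greeting; swapping the trait weights would flip the behavior without changing the engine.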

IPC Classes

  • B25J 11/00 - Manipulators not otherwise provided for
  • B25J 9/16 - Programme controls
  • G06N 3/00 - Computing arrangements based on biological models
  • G06N 3/008 - Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour

79.

Method and apparatus for color difference transform

      
Application Number 16192960
Grant Number 10856011
Status In Force
Filing Date 2018-11-16
First Publication Date 2019-07-04
Grant Date 2020-12-01
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Smith, Michael D.

Abstract

Efficient image compression for video data characterized by a non-neutral dominant white point is achieved by transforming the input video signal into a de-correlated video signal based on a color difference encoding transform, wherein the color difference encoding transform is adapted based on the dominant white point using an algorithm. The adapting algorithm is designed for optimizing low-entropy output when the white point is other than a neutral or equal-energy value. Decompression is handled conversely.
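The key property described, difference channels that collapse to zero for pixels at the dominant (possibly non-neutral) white point, can be sketched as follows. The specific transform here is invented for illustration; the patent's actual matrix and adapting algorithm are not disclosed in the abstract.

```python
def encode_color_diff(pixel, white_point):
    """De-correlate an RGB pixel around the stream's dominant white point:
    a luma-like term plus two difference terms that vanish for pixels at
    the white point, yielding low-entropy output there."""
    r, g, b = pixel
    wr, wg, wb = white_point
    y = (r + g + b) / 3.0
    return (y, (r - wr) - (g - wg), (b - wb) - (g - wg))

# A pixel exactly at a warm, non-neutral white point produces zero
# difference channels, which is what the adaptation is optimizing for.
warm_white = (255, 240, 220)
y, cb, cr = encode_color_diff(warm_white, warm_white)
```

A fixed neutral-white transform would leave residual energy in the difference channels for such footage; adapting to the measured white point removes it.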

IPC Classes

  • H04N 19/85 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
  • H04N 1/64 - Colour picture communication systems - Details therefor, e.g. coding or decoding means therefor

80.

Mixed reality system for context-aware virtual object rendering

      
Application Number 16212438
Grant Number 11497986
Status In Force
Filing Date 2018-12-06
First Publication Date 2019-06-13
Grant Date 2022-11-15
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Smith, Michael
  • Ostrover, Lewis

Abstract

A computer-implemented method in conjunction with mixed reality gear (e.g., a headset) includes imaging a real scene encompassing a user wearing a mixed reality output apparatus. The method includes determining data describing a real context of the real scene, based on the imaging; for example, identifying or classifying objects, lighting, sound or persons in the scene. The method includes selecting a set of content including content enabling rendering of at least one virtual object from a content library, based on the data describing a real context, using various selection algorithms. The method includes rendering the virtual object in the mixed reality session by the mixed reality output apparatus, optionally based on the data describing a real context (“context parameters”). An apparatus is configured to perform the method using hardware, firmware, and/or software.
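The content-selection step described above could be sketched as matching context parameters detected in the real scene against tags on library items. The labels, tags, and scoring rule below are invented for illustration only.

```python
# Hypothetical sketch of context-based content selection: score each library
# item against context parameters detected in the real scene and pick the
# best match for rendering.

def select_content(context, library):
    """Pick the library item whose tags overlap the detected context most."""
    def overlap(item):
        return len(set(item["tags"]) & set(context))
    return max(library, key=overlap)

# Context parameters an imaging pass might produce for an indoor scene.
context = {"indoor", "dim_lighting", "one_person"}
library = [
    {"name": "firefly swarm",       "tags": ["outdoor", "night"]},
    {"name": "reading lamp sprite", "tags": ["indoor", "dim_lighting"]},
]
print(select_content(context, library)["name"])  # reading lamp sprite
```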

IPC Classes

  • A63F 13/217 - Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06V 20/10 - Terrestrial scenes
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • G06T 7/60 - Analysis of geometric attributes
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • A63F 13/25 - Output arrangements for video game devices

81.

Immersive virtual reality production and playback for storytelling content

      
Application Number 16149033
Grant Number 10410675
Status In Force
Filing Date 2018-10-01
First Publication Date 2019-04-11
Grant Date 2019-09-10
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.
  • Smith, Michael
  • Zink, Michael

Abstract

Methods for digital content production and playback of an immersive stereographic video work provide or enhance interactivity of immersive entertainment using various different playback and production techniques. “Immersive stereographic” may refer to virtual reality, augmented reality, or both. The methods may be implemented using specialized equipment for immersive stereographic playback or production. Aspects of the methods may be encoded as instructions in a computer memory, executable by one or more processors of the equipment to perform the aspects.

IPC Classes

  • G11B 27/036 - Insert-editing
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor

82.

Biometric feedback in production and playback of video content

      
Application Number 15990589
Grant Number 10497399
Status In Force
Filing Date 2018-05-26
First Publication Date 2019-01-03
Grant Date 2019-12-03
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.
  • Smith, Michael
  • Zink, Michael

Abstract

Methods for digital content production and playback of an immersive stereographic video work provide or enhance interactivity of immersive entertainment using various different playback and production techniques. “Immersive stereographic” may refer to virtual reality, augmented reality, or both. The methods may be implemented using specialized equipment for immersive stereographic playback or production. Aspects of the methods may be encoded as instructions in a computer memory, executable by one or more processors of the equipment to perform the aspects.

IPC Classes

  • H04N 5/76 - Television signal recording
  • G11B 27/036 - Insert-editing
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor

83.

Social and procedural effects for computer-generated environments

      
Application Number 15909892
Grant Number 10933324
Status In Force
Filing Date 2018-03-01
First Publication Date 2018-09-13
Grant Date 2021-03-02
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Gewickey, Greg
  • Lake-Schaal, Gary
  • Mintus, Piotr
  • Ostrover, Lewis
  • Smith, Michael

Abstract

A sensor coupled to an AR/VR headset detects an eye convergence distance. A processor adjusts a focus distance for a virtual camera that determines rendering of a three-dimensional (3D) object for a display device of the headset, based on at least one of the eye convergence distance or a directed focus of attention for the at least one of the VR content or the AR content.
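The geometry behind an eye-convergence distance is simple: the two gaze rays cross where the vergence angle subtends the interpupillary distance. The sketch below is a generic worked example, not the patent's method; the IPD value and function names are illustrative.

```python
import math

# Hypothetical sketch: estimating eye convergence distance from the vergence
# angle reported by a headset eye tracker, then using it as the virtual
# camera's focus distance.

def convergence_distance(vergence_deg, ipd_m=0.063):
    """Distance at which the two gaze rays cross, for a given vergence angle."""
    half = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half)

# A vergence angle of about 3.6 degrees corresponds to roughly 1 m of depth.
d = convergence_distance(3.6)
print(round(d, 2))  # ~1.0
```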

IPC Classes

  • A63F 13/525 - Changing parameters of virtual cameras
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04L 29/06 - Communication control; Communication processing characterised by a protocol
  • A63F 13/28 - Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
  • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers

84.

Portal device and cooperating video game machine

      
Application Number 15727531
Grant Number 10583352
Status In Force
Filing Date 2017-10-06
First Publication Date 2018-02-01
Grant Date 2020-03-10
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Burton, Jon

Abstract

A portal device for a video game includes a pad with different zones that can be illuminated by selectable colors, a toy sensor (e.g., an RFID tag sensor) associated with each zone, a controller and a communications port for communicating with a video game process executing on a game machine. The colors of each zone can be configured to one or a combination of three primary colors during game play, based on the game process. The portal device reacts to placement of tagged toys on zones and the color of the zones during play, and provides sensor data to the game process. The game process controls the game state in part based on data from the portal device and in part on other user input.
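The pad behavior described above can be sketched as zones that each carry a mixable primary-color state and an RFID tag slot, reported as sensor data to the game process. Zone count, tag values, and the data layout below are invented for illustration.

```python
# Hypothetical sketch of the portal pad: each zone mixes up to three primary
# colors and reports which tagged toy sits on it; the game process reads the
# sensor data.

PRIMARIES = {"red", "green", "blue"}

class PortalPad:
    def __init__(self, zones=3):
        self.colors = [set() for _ in range(zones)]  # enabled primaries per zone
        self.tags = [None] * zones                   # RFID tag on each zone

    def set_zone_color(self, zone, primaries):
        unknown = set(primaries) - PRIMARIES
        if unknown:
            raise ValueError(f"not primary colors: {unknown}")
        self.colors[zone] = set(primaries)

    def place_toy(self, zone, tag):
        self.tags[zone] = tag

    def sensor_data(self):
        """Snapshot the game process would poll during play."""
        return [
            {"zone": i, "color": sorted(c), "tag": t}
            for i, (c, t) in enumerate(zip(self.colors, self.tags))
        ]

pad = PortalPad()
pad.set_zone_color(0, {"red", "blue"})
pad.place_toy(0, "toy-1138")
print(pad.sensor_data()[0])  # {'zone': 0, 'color': ['blue', 'red'], 'tag': 'toy-1138'}
```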

IPC Classes

  • A63F 9/24 - Games using electronic circuits not otherwise provided for
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • G07F 17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
  • A63F 13/822 - Strategy games; Role-playing games 

85.

Immersive virtual reality production and playback for storytelling content

      
Application Number 15717561
Grant Number 10276211
Status In Force
Filing Date 2017-09-27
First Publication Date 2018-01-25
Grant Date 2019-04-30
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.
  • Smith, Michael
  • Zink, Michael

Abstract

Methods for digital content production and playback of an immersive stereographic video work provide or enhance interactivity of immersive entertainment using various different playback and production techniques. “Immersive stereographic” may refer to virtual reality, augmented reality, or both. The methods may be implemented using specialized equipment for immersive stereographic playback or production. Aspects of the methods may be encoded as instructions in a computer memory, executable by one or more processors of the equipment to perform the aspects.

IPC Classes

  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • G11B 27/036 - Insert-editing
  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 13/02 - Picture signal generators
  • H04N 13/04 - Picture reproducers
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor

86.

Immersive virtual reality production and playback for storytelling content

      
Application Number 15717823
Grant Number 10109320
Status In Force
Filing Date 2017-09-27
First Publication Date 2018-01-18
Grant Date 2018-10-23
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Gewickey, Gregory I.
  • Ostrover, Lewis S.
  • Smith, Michael
  • Zink, Michael

Abstract

Methods for digital content production and playback of an immersive stereographic video work provide or enhance interactivity of immersive entertainment using various different playback and production techniques. “Immersive stereographic” may refer to virtual reality, augmented reality, or both. The methods may be implemented using specialized equipment for immersive stereographic playback or production. Aspects of the methods may be encoded as instructions in a computer memory, executable by one or more processors of the equipment to perform the aspects.

IPC Classes

  • G11B 27/036 - Insert-editing
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
  • H04N 13/04 - Picture reproducers
  • H04N 13/02 - Picture signal generators
  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

87.

Transforming audio content for subjective fidelity

      
Application Number 15687292
Grant Number 10340870
Status In Force
Filing Date 2017-08-25
First Publication Date 2018-01-04
Grant Date 2019-07-02
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Antonellis, Darcy
  • Ostrover, Lewis

Abstract

A method or apparatus for delivering audio programming such as music to listeners may include identifying, capturing and applying a listener's audiometric profile to transform audio content so that the listener hears the content similarly to how the content was originally heard by a creative producer of the content. An audio testing tool may be implemented as software application to identify and capture the listener's audiometric profile. A signal processor may operate an algorithm used for processing source audio content, obtaining an identity and an audiometric reference profile of the creative producer from metadata associated with the content. The signal processor may then provide audio output based on a difference between the listener's and creative producer's audiometric profiles.
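The "difference between the listener's and creative producer's audiometric profiles" suggests per-band correction gains. The sketch below illustrates that arithmetic only; the band layout and dB values are invented, and real hearing compensation is far more involved.

```python
# Hypothetical sketch: per-band gains computed as the difference between the
# listener's audiometric profile and the creative producer's, so the listener
# hears approximately what the producer heard.

def correction_gains(producer_db, listener_db):
    """For each band, boost by how much worse the listener hears than the producer."""
    return {band: listener_db[band] - producer_db[band] for band in producer_db}

# Hearing thresholds in dB HL per octave band (higher = worse hearing).
producer = {"250": 5, "1k": 5, "4k": 10}
listener = {"250": 5, "1k": 15, "4k": 30}

print(correction_gains(producer, listener))  # {'250': 0, '1k': 10, '4k': 20}
```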

IPC Classes

  • H03G 5/00 - Tone control or bandwidth control in amplifiers
  • H03G 5/16 - Automatic control
  • H03G 9/00 - Combinations of two or more types of control, e.g. gain control and tone control

88.

Region-of-interest encoding enhancements for variable-bitrate mezzanine compression

      
Application Number 15688523
Grant Number 10511861
Status In Force
Filing Date 2017-08-28
First Publication Date 2017-12-28
Grant Date 2019-12-17
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Smith, Michael
  • Collar, Bradley

Abstract

A specification defining allowable luma and chroma code-values is applied in a region-of-interest encoding method of a mezzanine compression process. The method may include analyzing an input image to determine regions or areas within each image frame that contain code-values near the allowable limits specified by the specification. The region-of-interest method may then compress those regions with higher precision than the other regions of the image, whose code-values are not close to the legal limits.
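The region-detection step can be sketched as scanning blocks for code-values within some margin of the legal range limits (e.g. 16-235 for 8-bit legal-range luma). The thresholds, margin, and block size below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: flag blocks whose code-values approach the legal limits
# so the encoder can compress those regions at higher precision.

def near_limit_blocks(frame, lo=16, hi=235, margin=4, block=2):
    """Return (row, col) block indices containing values within `margin` of a limit."""
    flagged = set()
    for r, row in enumerate(frame):
        for c, v in enumerate(row):
            if v <= lo + margin or v >= hi - margin:
                flagged.add((r // block, c // block))
    return sorted(flagged)

frame = [
    [100, 120, 233, 231],
    [110, 115, 234, 230],
    [ 18, 100, 128, 128],
    [ 60, 100, 128, 128],
]
print(near_limit_blocks(frame))  # [(0, 1), (1, 0)]
```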

IPC Classes

  • H04N 19/136 - Incoming video signal characteristics or properties
  • H04N 19/64 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using sub-band based transform, e.g. wavelets characterised by ordering of coefficients or of bits for transmission
  • H04N 19/147 - Data rate or code amount at the encoder output according to rate distortion criteria
  • H04N 19/14 - Coding unit complexity, e.g. amount of activity or edge presence estimation
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
  • H04N 19/154 - Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
  • H04N 19/17 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
  • H04N 19/12 - Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
  • H04N 19/176 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
  • H04N 19/182 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel

89.

Generation and use of user-selected scenes playlist from distributed digital content

      
Application Number 15691362
Grant Number 10748578
Status In Force
Filing Date 2017-08-30
First Publication Date 2017-12-21
Grant Date 2020-08-18
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Lau, Kim
  • Frautschi, Jacob
  • Gasparri, Massimiliano
  • Lee, Randy
  • Harman, Patrick

Abstract

A digital content package includes first content comprising a video feature such as a motion picture or the like, and a user-selectable application configured to operate as follows. When activated using an icon off of a menu screen, the application records an identifier for scenes (discrete portions) of the first content that are selected by a user to generate a playlist. The user may select the scenes by indicating a start and end of each scene. The application saves the playlist locally, then uploads to a server. Via a user account at the server, a user may publish the playlist to a user-created distribution list, webpage, or other electronic publication, and modify the playlist by deleting or reordering scenes.
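The playlist bookkeeping described above (record start/end, then delete or reorder scenes) can be sketched minimally. The class name, time representation, and methods are invented for illustration; server upload and publishing are omitted.

```python
# Hypothetical sketch: the user marks start/end points, the application records
# the scene identifiers, and the playlist can later be reordered or pruned.

class ScenePlaylist:
    def __init__(self, title_id):
        self.title_id = title_id
        self.scenes = []  # list of (start_seconds, end_seconds)

    def add_scene(self, start, end):
        if end <= start:
            raise ValueError("scene end must follow its start")
        self.scenes.append((start, end))

    def delete(self, index):
        del self.scenes[index]

    def reorder(self, order):
        self.scenes = [self.scenes[i] for i in order]

pl = ScenePlaylist("feature-001")
pl.add_scene(120, 180)
pl.add_scene(600, 660)
pl.reorder([1, 0])
print(pl.scenes)  # [(600, 660), (120, 180)]
```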

IPC Classes

  • G11B 27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
  • H04N 21/2743 - Video hosting of uploaded data from client
  • H04N 21/426 - Internal components of the client
  • H04N 21/432 - Content retrieval operation from a local storage medium, e.g. hard-disk
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/482 - End-user interface for program selection
  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments

90.

Digital audio-video content mobile library

      
Application Number 15676858
Grant Number 10555017
Status In Force
Filing Date 2017-08-14
First Publication Date 2017-11-30
Grant Date 2020-02-04
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Herz, Justin
  • Kozan, Kevin
  • Mahmoud, Essam
  • Levicki, Alan

Abstract

Methods for managing digital content include authenticating a user account identifier from a client device over a computer network, registering a telephone number for at least one wireless mobile device in a registry identified with the user account based on the authenticating, as a pre-authorized identifier for accessing digital content licensed for use with the client device. The methods include maintaining a library of digital content identified with the user account for access by the at least one wireless mobile device, and initiating streaming of the digital video content to the at least one wireless mobile device without requiring user authentication from the at least one wireless mobile device, based on the registering of the telephone number as the pre-authorized identifier. An apparatus for performing the method comprises a processor coupled to a memory, the memory holding instructions for performing steps of the method as summarized above.
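The registry flow above reduces to: authenticate once, register a phone number under the account, then let streaming requests from that number skip re-authentication. The class and method names below are invented for illustration.

```python
# Hypothetical sketch: a phone number registered under an authenticated account
# becomes a pre-authorized identifier, so later streaming requests from that
# number skip a second login.

class ContentLibrary:
    def __init__(self):
        self.registry = {}  # account_id -> set of pre-authorized phone numbers

    def register_device(self, account_id, phone_number):
        """Called only after the account has been authenticated."""
        self.registry.setdefault(account_id, set()).add(phone_number)

    def can_stream(self, account_id, phone_number):
        """Streaming is permitted without re-authentication for registered numbers."""
        return phone_number in self.registry.get(account_id, set())

lib = ContentLibrary()
lib.register_device("acct-42", "+1-555-0100")
print(lib.can_stream("acct-42", "+1-555-0100"))  # True: no re-authentication
print(lib.can_stream("acct-42", "+1-555-0199"))  # False: unregistered device
```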

IPC Classes

  • H04N 7/173 - Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
  • H04N 21/254 - Management at additional data server, e.g. shopping server or rights management server
  • H04N 21/258 - Client or end-user data management, e.g. managing client capabilities, user preferences or demographics or processing of multiple end-users preferences to derive collaborative data
  • H04N 21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
  • H04N 21/4627 - Rights management
  • H04N 21/488 - Data services, e.g. news ticker
  • H04N 21/8355 - Generation of protective data, e.g. certificates involving usage data, e.g. number of copies or viewings allowed
  • H04N 21/8358 - Generation of protective data, e.g. certificates involving watermark
  • H04N 21/84 - Generation or processing of descriptive data, e.g. content descriptors
  • H04N 21/61 - Network physical structure; Signal processing
  • H04N 21/81 - Monomedia components thereof

91.

Video conversion technology

      
Application Number 15664813
Grant Number 10277862
Status In Force
Filing Date 2017-07-31
First Publication Date 2017-11-16
Grant Date 2019-04-30
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Butterworth, William Evans
  • Dwinell, Roger Raymond
  • Mcmahon, Valerie

Abstract

Video conversion technology, in which a first stream of video content is accessed and multiple, different layers are extracted from the first stream of the video content. Each of the multiple, different layers are separately processed to convert the multiple, different layers into modified layers that each have a higher resolution. The modified layers are reassembled into a second stream of the video content that has a higher resolution than the first stream of the video content.
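The split/process/reassemble flow can be sketched with trivial stand-ins: nearest-neighbor scaling and a two-layer additive split below are illustrative only, not the patent's actual layer decomposition or up-conversion processing.

```python
# Hypothetical sketch of the layered up-conversion flow: split each frame into
# layers, upscale each layer separately, then reassemble at higher resolution.

def upscale(layer, factor=2):
    """Nearest-neighbor upscale of a 2-D list by an integer factor."""
    return [
        [v for v in row for _ in range(factor)]
        for row in layer for _ in range(factor)
    ]

def reassemble(layers):
    """Recombine equally sized layers by summing co-located samples."""
    return [
        [sum(vals) for vals in zip(*rows)]
        for rows in zip(*layers)
    ]

base   = [[10, 20], [30, 40]]   # e.g. a low-frequency layer
detail = [[1, 2], [3, 4]]       # e.g. a detail/grain layer
out = reassemble([upscale(base), upscale(detail)])
print(out[0])  # [11, 11, 22, 22]
```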

IPC Classes

  • H04N 7/01 - Conversion of standards
  • G09G 5/14 - Display of multiple viewports
  • G09G 5/02 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
  • H04N 19/85 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
  • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display

92.

Digital media distribution device

      
Application Number 15619339
Grant Number 09866876
Status In Force
Filing Date 2017-06-09
First Publication Date 2017-09-28
Grant Date 2018-01-09
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Klamer, Paul
  • Long, Ken
  • Ramamurthy, Arjun

Abstract

A digital media distribution device includes an encoder, a decoder coupled to the encoder, and a transcoder coupled to the decoder. The encoder is configured to encode input data that is received by the digital media distribution device into a first data format. The decoder is configured to decode output data to be output by the digital media distribution device. The transcoder is configured to convert the encoded input data from the first data format into a second data format. The digital media distribution device is configured to be coupled to a computer network.
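The encode/transcode/decode topology can be sketched with trivial stand-in "formats" (uppercase vs. reversed text), since the device's real codecs are not specified in this listing.

```python
# Hypothetical sketch of the device's pipeline stages; the transforms are
# placeholders for real codec operations.

class Encoder:
    def encode(self, data):        # input -> first data format
        return data.upper()

class Transcoder:
    def convert(self, encoded):    # first format -> second format
        return encoded[::-1]

class Decoder:
    def decode(self, converted):   # recover output data
        return converted[::-1].lower()

enc, xc, dec = Encoder(), Transcoder(), Decoder()
out = dec.decode(xc.convert(enc.encode("movie")))
print(out)  # movie
```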

IPC Classes

  • H04N 7/173 - Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
  • H04N 7/16 - Analogue secrecy systems; Analogue subscription systems
  • G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
  • H04L 12/50 - Circuit switching systems, i.e. systems in which the path is physically permanent during the communication
  • H04Q 11/00 - Selecting arrangements for multiplex systems
  • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
  • H04N 19/46 - Embedding additional information in the video signal during the compression process
  • H04N 21/231 - Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers or prioritizing data for deletion
  • H04N 19/162 - User input
  • H04N 21/61 - Network physical structure; Signal processing
  • H04N 21/233 - Processing of audio elementary streams
  • H04N 19/157 - Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
  • H04N 19/40 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
  • H04N 21/60 - Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client; Communication details between server and client

93.

Method and apparatus for generating 3D audio positioning using dynamically optimized audio 3D space perception cues

      
Application Number 15594071
Grant Number 10026452
Status In Force
Filing Date 2017-05-12
First Publication Date 2017-08-31
Grant Date 2018-07-17
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Antonellis, Darcy
  • Gasparri, Massimiliano
  • Ostrover, Lewis S.
  • Collar, Bradley Thomas

Abstract

An apparatus includes a first audio/video encoder that receives an input and encodes it into audio-visual content having visual objects and audio objects, the audio objects being disposed at locations corresponding to spatial positions, the encoder using encoding coefficients for the encoding.

IPC Classes

  • H04N 9/87 - Regeneration of colour television signals
  • G11B 27/30 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
  • G11B 27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
  • H04N 19/597 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
  • G10L 19/008 - Multichannel audio signal coding or decoding using interchannel correlation to reduce redundancy, e.g. joint-stereo, intensity-coding or matrixing
  • G10L 19/20 - Vocoders using multiple modes using sound class specific coding, hybrid encoders or object based coding
  • G11B 20/10 - Digital recording or reproducing

94.

Cable identification marker

      
Application Number 29534735
Grant Number D0788576
Status In Force
Filing Date 2015-07-30
First Publication Date 2017-06-06
Grant Date 2017-06-06
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor Scibetta, Samuel

95.

Methods for controlling scene, camera and viewing parameters for altering perception of 3D imagery

      
Application Number 15368456
Grant Number 10277883
Status In Force
Filing Date 2016-12-02
First Publication Date 2017-05-25
Grant Date 2019-04-30
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Nolan, Christopher E.
  • Collar, Bradley T.
  • Smith, Michael D.

Abstract

Mathematical relationships between the scene geometry, camera parameters, and viewing environment are used to control stereography to obtain various results influencing the viewer's perception of 3D imagery. The methods may include setting a horizontal shift, convergence distance, and camera interaxial parameter to achieve various effects. The methods may be implemented in a computer-implemented tool for interactively modifying scene parameters during a 2D-to-3D conversion process, which may then trigger the re-rendering of the 3D content on the fly.
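A worked example of one such mathematical relationship: in the shifted-sensor stereo camera model commonly used for this kind of control, on-screen disparity is zero at the convergence distance, negative (in front of the screen) for nearer points, and positive beyond it. The symbols and values below are generic, not the patent's.

```python
# Hypothetical worked example of the shifted-sensor stereo camera model:
# disparity(Z) = t * f * (1/c - 1/Z), zero at the convergence distance c,
# negative for Z < c (in front of screen) in this sign convention.

def disparity(depth, interaxial, focal, convergence):
    """Sensor-plane disparity for a point at `depth` (all in the same units)."""
    return interaxial * focal * (1.0 / convergence - 1.0 / depth)

t, f, c = 0.065, 0.035, 4.0   # interaxial 65 mm, 35 mm focal, converge at 4 m
print(round(disparity(4.0, t, f, c), 6))   # 0.0 at the convergence distance
print(disparity(2.0, t, f, c) < 0)         # nearer objects: negative parallax
```

Interactively adjusting the interaxial `t` or convergence `c` rescales or shifts this disparity curve, which is the kind of lever the abstract describes for influencing perceived depth.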

IPC Classes

  • H04N 13/189 - Recording image signals; Reproducing recorded image signals
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/204 - Image signal generators using stereoscopic image cameras
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

96.

Method and apparatus for generating encoded content using dynamically optimized conversion for 3D movies

      
Application Number 15408906
Grant Number 10453492
Status In Force
Filing Date 2017-01-18
First Publication Date 2017-05-04
Grant Date 2019-10-22
Owner Warner Bros. Entertainment Inc. (USA)
Inventor
  • Antonellis, Darcy
  • Ostrover, Lewis S.
  • Collar, Bradley Thomas

Abstract

The present invention pertains to an apparatus and method for adding a graphic element, such as a subtitle, to selected locations of the frames in a 3D movie. The authoring tool receives a depth map indicating the position of various objects in the frames of 3D content along a Z-axis. The authoring tool then designates a position for at least one additional graphic element in at least some of the frames, these positions being determined in relation either to the positions of the objects or the position of the screen along said Z-axis. An encoder uses parameters from the authoring tool to reauthor the 3D movie by adding the graphic content to the positions designated by the parameters.
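One plausible placement rule, placing the subtitle slightly in front of the nearest object in its region, can be sketched from the depth map. The margin, region format, and depth convention (smaller Z = nearer) below are invented assumptions.

```python
# Hypothetical sketch: choose a subtitle's Z position from a per-frame depth
# map so it sits slightly in front of the nearest scene object in the
# subtitle region.

def subtitle_depth(depth_map, region, margin=0.1):
    """Place the subtitle `margin` in front of the nearest object in `region`."""
    (r0, r1), (c0, c1) = region
    nearest = min(
        depth_map[r][c] for r in range(r0, r1) for c in range(c0, c1)
    )
    return nearest - margin

depth_map = [
    [5.0, 5.0, 5.0],
    [5.0, 2.0, 5.0],   # an object at Z = 2.0 inside the caption region
    [5.0, 5.0, 5.0],
]
print(subtitle_depth(depth_map, ((1, 3), (0, 3))))  # 1.9
```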

IPC Classes

  • H04N 5/92 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
  • H04N 5/89 - Television signal recording using holographic recording
  • G11B 27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
  • H04N 13/156 - Mixing image signals
  • H04N 9/82 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
  • H04N 21/488 - Data services, e.g. news ticker
  • H04N 9/80 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback

97.

Production and packaging of entertainment data for virtual reality

      
Application Number 15289090
Grant Number 10249091
Status In Force
Filing Date 2016-10-07
First Publication Date 2017-04-13
Grant Date 2019-04-02
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Mintus, Piotr
  • Lake-Schaal, Gary
  • Ostrover, Lewis

Abstract

An augmented reality (AR) output device or virtual reality (VR) output device is worn by a user, and includes one or more sensors positioned to detect actions performed by a user of the immersive output device. A processor provides a data signal configured for the AR or VR output device, causing the immersive output device to provide AR output or VR output via a stereographic display device. The data signal encodes audio-video data. The processor controls a pace of scripted events defined by a narrative in the AR output or the VR output, based on output from the one or more sensors indicating actions performed by a user of the AR or VR output device. The audio-video data may be packaged in a non-transitory computer-readable medium with additional content that is coordinated with the defined narrative and is configured for providing an alternative output, such as 2D video output or the stereoscopic 3D output.
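Sensor-driven pacing of scripted narrative events can be pictured as a queue where each event is gated on a sensed user action. This is a toy sketch of that idea only; the class, event names, and gating scheme are invented for illustration and are not taken from the patent:

```python
class NarrativePacer:
    """Toy pacing controller: scripted events fire in order, but each
    event is held back until sensors report its gating user action."""

    def __init__(self, script):
        # script: ordered list of (event_name, required_user_action)
        self.script = list(script)
        self.index = 0

    def on_sensor(self, action):
        """Report a sensed user action; return any events released."""
        fired = []
        while (self.index < len(self.script)
               and self.script[self.index][1] == action):
            fired.append(self.script[self.index][0])
            self.index += 1
        return fired
```

The point of such a design is that the narrative's order is fixed by the author while its tempo adapts to what the headset's sensors observe the user doing.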

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/356 - Image reproducers having separate monoscopic and stereoscopic modes
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

98.

Cinematic mastering for virtual reality and augmented reality

      
Application Number 15289174
Grant Number 10511895
Status In Force
Filing Date 2016-10-08
First Publication Date 2017-04-13
Grant Date 2019-12-17
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Defaria, Christopher
  • Gewickey, Gregory
  • Smith, Michael
  • Ostrover, Lewis

Abstract

An entertainment system provides data to a common screen (e.g., cinema screen) and personal immersive reality devices. For example, a cinematic data distribution server communicates with multiple immersive output devices each configured for providing immersive output (e.g., a virtual reality output) based on a data signal. Each of the multiple immersive output devices is present within eyesight of a common display screen. The server configures the data signal based on digital cinematic master data that includes immersive reality data. The server transmits the data signal to the multiple immersive output devices contemporaneously with each other, and optionally contemporaneously with providing a coordinated audio-video signal for output via the common display screen and shared audio system.

IPC Classes

  • H04N 21/81 - Monomedia components thereof
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/214 - Specialised server platform, e.g. server located in an airplane, hotel or hospital
  • H04N 21/2225 - Local VOD servers
  • H04N 21/433 - Content storage operation, e.g. storage operation in response to a pause request or caching operations
  • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or inside the home

99.

Social and procedural effects for computer-generated environments

      
Application Number 15255071
Grant Number 10213688
Status In Force
Filing Date 2016-09-01
First Publication Date 2017-03-02
Grant Date 2019-02-26
Owner WARNER BROS. ENTERTAINMENT, INC. (USA)
Inventor
  • Gewicke, Greg
  • Lake-Schaal, Gary
  • Mintus, Piotr
  • Ostrover, Lewis
  • Smith, Michael

Abstract

A processor provides a simulated three-dimensional (3D) environment for a game or virtual reality (VR) experience, including controlling a characteristic parameter of a 3D object or character based on at least one of: an asynchronous event in a second game, feedback from multiple synchronous users of the VR experience, or a function driven by one or more variables reflecting a current state of at least one of the 3D environment, the game or the VR experience. In another aspect, a sensor coupled to an AR/VR headset detects an eye convergence distance. A processor adjusts a focus distance for a virtual camera that determines rendering of a three-dimensional (3D) object for a display device of the headset, based on at least one of the eye convergence distance or a directed focus of attention for the at least one of the VR content or the AR content.
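The second aspect, deriving a convergence distance from eye tracking, reduces to simple vergence geometry when the fixation point is assumed straight ahead and symmetric between the eyes. The sketch below shows that standard relation; it is a common approximation, not necessarily the patent's method:

```python
import math

def convergence_distance(ipd, vergence_angle):
    """Distance to the fixation point given the interpupillary
    distance `ipd` and the vergence angle (radians) between the two
    gaze rays, assuming symmetric fixation straight ahead."""
    return (ipd / 2.0) / math.tan(vergence_angle / 2.0)
```

A renderer could then set the virtual camera's focus distance to this value (or blend it with a director-specified focus of attention) before drawing the frame.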

IPC Classes

  • A63F 13/525 - Changing parameters of virtual cameras
  • A63F 13/28 - Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
  • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
  • A63F 13/58 - Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04L 29/06 - Communication control; Communication processing characterised by a protocol

100.

Region-of-interest encoding enhancements for variable-bitrate mezzanine compression

      
Application Number 15174765
Grant Number 09749659
Status In Force
Filing Date 2016-06-06
First Publication Date 2016-09-29
Grant Date 2017-08-29
Owner WARNER BROS. ENTERTAINMENT INC. (USA)
Inventor
  • Smith, Michael
  • Collar, Bradley

Abstract

A specification defining allowable luma and chroma code-values is applied in a region-of-interest encoding method of a mezzanine compression process. The method may include analyzing an input image to determine regions or areas within each image frame that contain code-values that are near allowable limits as specified by the specification. In addition, the region-of-interest method may then compress those regions with higher precision than the other regions of the image whose code-values are not close to the legal limits.
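The analysis step this abstract describes, finding regions whose code-values approach the allowable limits, can be sketched as a per-block scan. The block size, margin, and limit values below are illustrative assumptions (the limits echo the 10-bit video-range convention of 64-940 for luma), not values taken from the patent:

```python
def near_limit_blocks(frame, lo, hi, margin, block=8):
    """Flag block-sized regions whose code-values come within `margin`
    of the allowable limits [lo, hi]; flagged blocks would be
    compressed at higher precision than the rest of the frame.
    frame: 2D list of integer code-values."""
    rows, cols = len(frame), len(frame[0])
    flagged = []
    for r in range(0, rows, block):
        for c in range(0, cols, block):
            vals = [frame[i][j]
                    for i in range(r, min(r + block, rows))
                    for j in range(c, min(c + block, cols))]
            if min(vals) <= lo + margin or max(vals) >= hi - margin:
                flagged.append((r, c))
    return flagged
```

An encoder could feed the flagged block coordinates into its rate-allocation stage, spending more bits where clipping artifacts would otherwise be most visible.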

IPC Classes

  • H04N 7/50 - involving transform and predictive coding
  • H04N 19/64 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using sub-band based transform, e.g. wavelets characterised by ordering of coefficients or of bits for transmission
  • H04N 19/147 - Data rate or code amount at the encoder output according to rate distortion criteria
  • H04N 19/136 - Incoming video signal characteristics or properties
  • H04N 19/14 - Coding unit complexity, e.g. amount of activity or edge presence estimation
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
  • H04N 19/154 - Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
  • H04N 19/17 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
  • H04N 19/12 - Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
  • H04N 19/176 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
  • H04N 19/182 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel