Disney Enterprises, Inc.

United States of America

Patents, United States (USPTO): 1-100 of 2,266 results for Disney Enterprises, Inc.
Date
New (last 4 weeks) 22
2021 January (MTD) 14
2020 December 10
2020 November 15
2020 October 14
IPC Class
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints 154
H04L 29/06 - Communication control; Communication processing characterised by a protocol 137
G06T 19/00 - Manipulating 3D models or images for computer graphics 135
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 127
G06F 17/30 - Information retrieval; Database structures therefor 119
Status
Pending 301
Registered / In Force 1,965

1.

Rotational Blur-Free Image Generation

      
Application Number 16516873
Status Pending
Filing Date 2019-07-19
First Publication Date 2021-01-21
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Nocon, Nathan D.
  • Wong, Clifford

Abstract

According to one implementation, an image generation system includes a rotor and a motor for spinning the rotor about an axis of rotation, and a display secured to the rotor. The display includes a display surface and a display aperture, a first privacy screen situated at the display surface and having a convex curvature relative to light emitted from the display surface, and a second privacy screen situated between the first privacy screen and the display aperture and having a concave curvature relative to light emitted from the display surface. The first privacy screen and the second privacy screen are configured to substantially prevent rotational blur of an image displayed by the display surface while the display is spun by the motor and the rotor.

IPC Classes

  • G09F 13/30 - Illuminated signs; Luminous advertising with moving light sources, e.g. rotating luminous tubes
  • G09F 19/02 - Advertising or display means not otherwise provided for incorporating moving display members
  • G09F 13/04 - Signs, boards, or panels, illuminated from behind the insignia
  • G09F 13/16 - Signs formed of, or incorporating, reflecting elements or surfaces, e.g. warning signs having triangular or other geometrical shape

2.

SYSTEM FOR GENERATING CUES IN AN AUGMENTED REALITY ENVIRONMENT

      
Application Number 16517398
Status Pending
Filing Date 2019-07-19
First Publication Date 2021-01-21
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Baumbach, Elliott
  • Panec, Timothy
  • Nocon, Nathan

Abstract

A system has an augmented reality device accessory that interacts with a virtual element in an augmented reality experience. Further, the system has an augmented reality device that renders a virtual element that overlays a real-world element in the augmented reality experience. The augmented reality device also determines that the augmented reality device accessory meets one or more mobility criteria with respect to the virtual element. Further, the augmented reality device adjusts the rendering of the virtual element to increase visibility of the real-world element based on the one or more mobility criteria being met.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays

3.

VARIABLE RESOLUTION RECOGNITION

      
Application Number 17039704
Status Pending
Filing Date 2020-09-30
First Publication Date 2021-01-21
Owner Disney Enterprises, Inc. (USA)
Inventor Fidaleo, Douglas

Abstract

Systems and methods are described for dynamically adjusting an amount of retrieved recognition data based on the needs of a show, experience, or other event where participants are recognized. The retrieved recognition data may be deleted once it is no longer needed for the event. Recognition data retrieval is limited to just what is needed for the particular task, minimizing the uniqueness of any retrieved recognition data to respect participant privacy while providing an enhanced participant experience through recognition.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/68 - Methods or arrangements for recognition using electronic means using sequential comparisons of the image signals with a plurality of reference, e.g. addressable memory
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06F 16/90 - Information retrieval; Database structures therefor; File system structures therefor - Details of database functions independent of the retrieved data types

4.

SYSTEMS AND METHOD FOR DYNAMIC CONTENT UNLOCK AND ADAPTIVE CONTROLS

      
Application Number 17062367
Status Pending
Filing Date 2020-10-02
First Publication Date 2021-01-21
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Kalama, Asa K.
  • Trowbridge, Robert Scott
  • King, Jacqueline E.
  • Huebner, Robert E.
  • Stepniewicz, Peter

Abstract

Systems and methods for dynamic modification of an amusement ride are disclosed herein. The system can include a simulation vehicle including a plurality of controls and at least one interface. The system can include a content presentation system, and a processor. The processor can: provide content to the at least one passenger; identify a user skill level based on a plurality of user inputs received by at least some of the plurality of controls of the simulation vehicle; identify a modification to a difficulty of the ride experience based in part on the identified user skill level; and modify the difficulty of the ride experience according to the identified modification.

IPC Classes

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63G 31/00 - Amusement arrangements

5.

TECHNIQUES FOR FEATURE-BASED NEURAL RENDERING

      
Application Number 16511961
Status Pending
Filing Date 2019-07-15
First Publication Date 2021-01-21
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Borer, Dominik Tobias
  • Guay, Martin
  • Buhmann, Jakob Joachim
  • Sumner, Robert Walker

Abstract

Techniques are disclosed for learning a machine learning model that maps control data, such as renderings of skeletons, and associated three-dimensional (3D) information to two-dimensional (2D) renderings of a character. The machine learning model may be an adaptation of the U-Net architecture that accounts for 3D information and is trained using a perceptual loss between images generated by the machine learning model and ground truth images. Once trained, the machine learning model may be used to animate a character, such as in the context of previsualization or a video game, based on control of associated control points.
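
The abstract mentions training with a perceptual loss between generated and ground-truth images. The sketch below is illustrative only and is not taken from the filing: it shows a generic perceptual loss computed in the feature space of a fixed image network (VGG16 is an assumed stand-in, left with random weights so the example runs offline; pretrained features would normally be used).

```python
import torch
import torchvision

# Illustrative sketch of a perceptual loss of the kind the abstract mentions:
# generated and ground-truth frames are compared in the feature space of a
# fixed image network rather than pixel space. VGG16 is an assumed stand-in;
# weights=None keeps the example offline.
vgg_features = torchvision.models.vgg16(weights=None).features[:16].eval()
for p in vgg_features.parameters():
    p.requires_grad_(False)

def perceptual_loss(generated: torch.Tensor, ground_truth: torch.Tensor) -> torch.Tensor:
    return torch.nn.functional.l1_loss(vgg_features(generated),
                                       vgg_features(ground_truth))

generated = torch.rand(1, 3, 128, 128, requires_grad=True)   # e.g. network output
ground_truth = torch.rand(1, 3, 128, 128)                    # e.g. captured frame
perceptual_loss(generated, ground_truth).backward()
```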

IPC Classes

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06N 20/00 - Machine learning
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 17/15 - Correlation function computation

6.

Quality Control Systems and Methods for Annotated Content

      
Application Number 16512223
Status Pending
Filing Date 2019-07-15
First Publication Date 2021-01-21
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Farre Guiu, Miquel Angel
  • Petrillo, Matthew C.
  • Martin, Marc Junyent
  • Accardo, Anthony M.
  • Swerdlow, Avner
  • Alfaro Vendrell, Monica

Abstract

According to one implementation, a quality control (QC) system for annotated content includes a computing platform having a hardware processor and a system memory storing an annotation culling software code. The hardware processor executes the annotation culling software code to receive multiple content sets annotated by an automated content classification engine, and obtain evaluations of the annotations applied by the automated content classification engine to the content sets. The hardware processor further executes the annotation culling software code to identify a sample size of the content sets for automated QC analysis of the annotations applied by the automated content classification engine, and cull the annotations applied by the automated content classification engine based on the evaluations when the number of annotated content sets equals the identified sample size.
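
The abstract describes gating an automated cull of machine-applied annotations on the number of evaluated content sets reaching a target sample size. The sketch below is illustrative only; the data shapes, score threshold, and function name are assumptions, not details from the filing.

```python
# Illustrative sketch: once the number of evaluated, machine-annotated content
# sets reaches the target sample size, drop annotations whose evaluation score
# falls below a cutoff. Data shapes, the cutoff, and names are assumptions.
def cull_annotations(annotated_sets, evaluations, sample_size, min_score=0.6):
    """annotated_sets: list of tag lists; evaluations: parallel score lists."""
    if len(annotated_sets) < sample_size:
        return annotated_sets          # sample not yet complete; keep everything
    return [
        [tag for tag, score in zip(tags, scores) if score >= min_score]
        for tags, scores in zip(annotated_sets, evaluations)
    ]

sets = [["dog", "castle"], ["castle", "night"]]
scores = [[0.9, 0.4], [0.8, 0.7]]
print(cull_annotations(sets, scores, sample_size=2))   # [['dog'], ['castle', 'night']]
```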

IPC Classes

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints

7.

Virtual Puppeteering Using a Portable Device

      
Application Number 16917467
Status Pending
Filing Date 2020-06-30
First Publication Date 2021-01-14
Owner
  • Disney Enterprises, Inc. (USA)
  • ETH Zurich (Switzerland)
Inventor
  • Anderegg, Raphael
  • Ciccone, Loic
  • Sumner, Robert W.

Abstract

A virtual puppeteering system includes a portable device including a camera, a display, a hardware processor, and a system memory storing an object animation software code. The hardware processor is configured to execute the object animation software code to, using the camera, generate an image in response to receiving an activation input, using the display, display the image, and receive a selection input selecting an object shown in the image. The hardware processor is further configured to execute the object animation software code to determine a distance separating the selected object from the portable device, receive an animation input, identify, based on the selected object and the received animation input, a movement for animating the selected object, generate an animation of the selected object using the determined distance and the identified movement, and render the animation of the selected object.

IPC Classes

  • A63H 3/36 - Dolls - Details; Accessories
  • A63J 19/00 - Puppet, marionette, or shadow shows or theatres
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 13/80 - 2D animation, e.g. using sprites

8.

SYSTEMS AND METHODS TO PROVIDE AN INTERACTIVE ENVIRONMENT IN RESPONSE TO TOUCH-BASED INPUTS

      
Application Number 16506438
Status Pending
Filing Date 2019-07-09
First Publication Date 2021-01-14
Owner
  • Disney Enterprises, Inc. (USA)
  • ETH Zürich (Eidgenössische Technische Hochschule Zürich) (Switzerland)
Inventor
  • Sumner, Robert
  • Buergisser, Benjamin
  • Zünd, Fabio
  • Vakulya, Gergely
  • Varga, Virag
  • Gross, Thomas
  • Sample, Alanson

Abstract

This disclosure presents systems and methods to provide an interactive environment in response to touch-based inputs. A first body channel communication device coupled to a user may transmit and/or receive signals configured to be propagated along skin of the user such that the skin of the user comprises a signal transmission path. A second body channel communication device coupled to an interaction entity may be configured to transmit and/or receive signals configured to be propagated along the skin of the user along the signal transmission path. A presentation device may present images of virtual content to the user. Information may be communicated between the first body channel communication device, the second body channel communication device, and the presentation device so that virtual content specific to the interaction entity may be presented to augment an appearance of the interaction entity.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04B 1/3827 - Portable transceivers

9.

TECHNIQUES FOR AUTOMATICALLY DETECTING PROGRAMMING DEFICIENCIES

      
Application Number 16506666
Status Pending
Filing Date 2019-07-09
First Publication Date 2021-01-14
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Machacek, Jan
  • Chakraborty, Anirvan
  • Villoslada, Christian

Abstract

A quality control (QC) engine analyzes sample code provided by a user and then generates example code that more effectively performs the same or similar operations performed by the sample code. An objective model analyzes the sample code to generate one or more tags indicating the intended objective(s) of the sample code. The quality model analyzes the sample code to generate one or more ratings indicating the degree to which the sample code achieves each intended objective. The performance model analyzes the tags and the ratings and estimates the performance of the sample code when executed in a production environment. The recommendation engine queries a database of code based on the tags, the ratings, and the estimated performance of the sample code to determine example code that achieves the same or similar objective(s) as the sample code, but with higher ratings, greater performance, or both.

IPC Classes

10.

JAW TRACKING WITHOUT MARKERS FOR FACIAL PERFORMANCE CAPTURE

      
Application Number 16510698
Status Pending
Filing Date 2019-07-12
First Publication Date 2021-01-14
Owner
  • Disney Enterprises, Inc. (USA)
  • ETH Zürich (Eidgenössische Technische Hochschule Zürich) (Switzerland)
Inventor
  • Beeler, Dominik Thabo
  • Bradley, Derek Edward
  • Zoss, Gaspard

Abstract

Some implementations of the disclosure are directed to capturing facial training data for one or more subjects, the captured facial training data including each of the one or more subject's facial skin geometry tracked over a plurality of times and the subject's corresponding jaw poses for each of those plurality of times; and using the captured facial training data to create a model that provides a mapping from skin motion to jaw motion. Additional implementations of the disclosure are directed to determining a facial skin geometry of a subject; using a model that provides a mapping from skin motion to jaw motion to predict a motion of the subject's jaw from a rest pose given the facial skin geometry; and determining a jaw pose of the subject using the predicted motion of the subject's jaw.

IPC Classes

  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

11.

Bi-level specificity content annotation using an artificial neural network

      
Application Number 16509422
Grant Number 10891985
Status In Force
Filing Date 2019-07-11
First Publication Date 2021-01-12
Grant Date 2021-01-12
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Farre Guiu, Miquel Angel
  • Alfaro Vendrell, Monica
  • Aparicio Isarn, Albert
  • Fojo, Daniel
  • Martin, Marc Junyent
  • Accardo, Anthony M.
  • Swerdlow, Avner

Abstract

A content annotation system includes a computing platform having a hardware processor and a memory storing a tagging software code including an artificial neural network (ANN). The hardware processor executes the tagging software code to receive content having a content interval including an image of a generic content feature, encode the image into a latent vector representation of the image using an encoder of the ANN, and use a first decoder of the ANN to generate a first tag describing the generic content feature based on the latent vector representation. When a specific content feature learned by the ANN corresponds to the generic content feature described by the first tag, the tagging software code uses a second decoder of the ANN to generate a second tag uniquely identifying the specific content feature based on the latent vector representation, and tags the content interval with the first and second tags.
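
The abstract describes a shared encoder producing a latent vector, a first decoder for a generic-level tag, and a second decoder for a specific-level tag when a matching specific feature is known. The sketch below illustrates only that two-level wiring; the label sets and placeholder networks are assumptions, not the filed architecture.

```python
import torch
import torch.nn as nn

# Illustrative sketch of the two-level tagging flow: one encoder yields a latent
# vector; a generic-level decoder and a specific-level decoder each classify
# from it. Label sets and layer sizes are assumptions, not the filed model.
GENERIC = ["face", "car", "logo"]
SPECIFIC = {"face": ["character_x", "character_y"], "logo": ["studio_mark"]}
ALL_SPECIFIC = [label for group in SPECIFIC.values() for label in group]

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU())
generic_decoder = nn.Linear(64, len(GENERIC))
specific_decoder = nn.Linear(64, len(ALL_SPECIFIC))

def tag(frame: torch.Tensor) -> list[str]:
    z = encoder(frame)                                   # latent vector representation
    first = GENERIC[generic_decoder(z).argmax(dim=1).item()]
    tags = [first]
    if first in SPECIFIC:                                # a specific feature is known
        tags.append(ALL_SPECIFIC[specific_decoder(z).argmax(dim=1).item()])
    return tags

print(tag(torch.rand(1, 3, 32, 32)))
```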

IPC Classes

  • H04N 9/80 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
  • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
  • G11B 27/34 - Indicating arrangements
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G11B 27/19 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
  • H04N 5/93 - Regeneration of the television signal or of selected parts thereof
  • H04N 5/78 - Television signal recording using magnetic recording
  • H04N 5/92 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback

12.

SYSTEMS AND METHODS TO PROVIDE A SPORTS-BASED INTERACTIVE EXPERIENCE

      
Application Number 16459234
Status Pending
Filing Date 2019-07-01
First Publication Date 2021-01-07
Owner Disney Enterprises, Inc. (USA)
Inventor Panec, Timothy M.

Abstract

This disclosure presents systems and methods to provide sports-based interactive experiences. The interactive experiences may be facilitated by providing users' views of virtual content related to a particular sport. The systems and methods may utilize action sequence information and/or other information. The action sequence information may specify anticipated sequences of output signals generated by sensors coupled to real-world items of playing equipment. The output signals in the anticipated sequences of output signals may be associated with anticipated control signals for controlling the virtual content.

IPC Classes

  • A63F 13/573 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/53 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game

13.

BALLISTIC ROBOT SYSTEM WITH SPIN AND OTHER CONTROLLED MOTION OF ROBOT DURING FLIGHT

      
Application Number 17029952
Status Pending
Filing Date 2020-09-23
First Publication Date 2021-01-07
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Dohi, Anthony Paul
  • Christensen, Steven Niels
  • Setrakian, Mark Sox
  • Christensen, David Loyal
  • Imahara, Grant Masaru
  • Pope, Morgan T.
  • Watson, Scott Frazier
  • Niemeyer, Günter D.

Abstract

Systems and corresponding control methods providing a ballistic robot that flies on a trajectory after being released (e.g., in non-powered flight as a ballistic body) from a launch mechanism. The ballistic robot is adapted to control its position and/or inflight movements by processing data from onboard and offboard sensors and by issuing well-timed control signals to one or more onboard actuators to achieve an inflight controlled motion. The actuators may move an appendage such as an arm or leg of the robot or may alter the configuration of one or more body links (e.g., to change from an untucked configuration to a tucked configuration), while other embodiments may trigger a drive mechanism of an inertia moving assembly to change/move the moment of inertia of the flying body. Inflight controlled movements are performed to achieve a desired or target pose and orientation of the robot during flight and upon landing.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 9/06 - Programme-controlled manipulators characterised by multi-articulated arms
  • B25J 9/14 - Programme-controlled manipulators characterised by positioning means for manipulator elements fluid
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

14.

Bottle closure

      
Application Number 29684689
Grant Number D0906806
Status In Force
Filing Date 2019-03-22
First Publication Date 2021-01-05
Grant Date 2021-01-05
Owner
  • The Coca-Cola Company (USA)
  • Disney Enterprises, Inc. (USA)
Inventor
  • Joshi, Rohit
  • Cooper, Matthew
  • Propp, Susan
  • Gutierrez, Ellen
  • Pickholtz, Michael
  • Beatty, Christopher Lee
  • Loo, Brian Scott
  • Pugne, Darin
  • Smith, Carl

15.

ANGLE ENHANCING SCREEN

      
Application Number 16454186
Status Pending
Filing Date 2019-06-27
First Publication Date 2020-12-31
Owner Disney Enterprises, Inc. (USA)
Inventor Smithwick, Quinn Yorklun Jen

Abstract

Implementations of angle-enhancing screens are disclosed herein. The angle-enhancing screens increase the field of view of an image projected thereon by a projection lens while maintaining an increased size of the projected image, by decreasing the size of picture elements making up the image while maintaining their pitch. In some embodiments, the angle-enhancing screen includes a field lens, such as a Fresnel field lens, for straightening the views of light projected thereon and a double lenslet array of matched lenslet pairs, each of the pairs including either two positive lenslets or one positive and one negative lenslet, for increasing the field of view. In another embodiment, the angle-enhancing screen may include a field lens and an array of four positive lenslet quartets. In a further embodiment, the field lens may be replaced with a Gabor superlens including two lenslet arrays of different pitches.

IPC Classes

  • G02B 3/08 - Simple or compound lenses with non-spherical faces with discontinuous faces, e.g. Fresnel lens

16.

COMPUTATIONAL VIBRATION SUPPRESSION FOR ROBOTIC SYSTEMS

      
Application Number 16451209
Status Pending
Filing Date 2019-06-25
First Publication Date 2020-12-31
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Bächer, Moritz Niklaus
  • Hoshyari, Shayan
  • Xu, Hongyi
  • Coros, Stelian
  • Knoop, Lars Espen

Abstract

A robot control method, and associated robot controllers and robots operating with such methods and controllers, providing computational vibration suppression. Given a desired animation cycle for a robotic system or robot, the control method uses a dynamic simulation of the physical robot, which takes into account the flexible components of the robot, to predict if vibrations will be seen in the physical robot. If vibrations are predicted with the input animation cycle, the control method optimizes the set of motor trajectories to return a set of trajectories that are as close as possible to the artistic or original intent of the provider of the animation cycle, while minimizing unwanted vibration. The new control method or design tool suppresses unwanted vibrations and allows a robot designer to use lighter and/or softer (less stiff) and, therefore, less expensive systems in new robots.

IPC Classes

17.

CALIBRATION, CUSTOMIZATION, AND IMPROVED USER EXPERIENCE FOR BIONIC LENSES

      
Application Number 16455012
Status Pending
Filing Date 2019-06-27
First Publication Date 2020-12-31
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Smithwick, Quinn Y.J.
  • Snoddy, Jon H.
  • Fidaleo, Douglas A.

Abstract

The present disclosure relates to calibration, customization, and improved user experiences for smart or bionic lenses that are worn by a user. The calibration techniques include detecting and correcting distortion of a display of the bionic lenses, as well as distortion due to characteristics of the lens or eyes of the user. The customization techniques include utilizing the bionic lenses to detect eye characteristics that can be used to improve insertion of the bionic lenses, track health over time, and provide user alerts. The user experiences include interactive environments and animation techniques that are improved via the bionic lenses.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G02C 7/04 - Contact lenses for the eyes

18.

Media Content Validation Using Geometrically Encoded Metadata

      
Application Number 16449062
Status Pending
Filing Date 2019-06-21
First Publication Date 2020-12-24
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Chapman, Steven M.
  • Swanson, Todd P.
  • Patel, Mehul
  • Popp, Joseph
  • Popko, Ty

Abstract

According to one implementation, a system for validating media content includes a computing platform having a hardware processor and a system memory storing a media content validation software code. The hardware processor is configured to execute the media content validation software code to search the media content for a geometrically encoded metadata structure. When the geometrically encoded metadata structure is detected, the hardware processor is further configured to execute the media content validation software code to identify an original three-dimensional (3D) geometry of the detected geometrically encoded metadata structure, to extract metadata from the detected geometrically encoded metadata structure, decode the metadata extracted from the detected geometrically encoded metadata structure based on the identified original 3D geometry, and obtain a validation status of the media content based on the decoded metadata.

IPC Classes

  • G06T 9/00 - Image coding
  • G06F 21/10 - Protecting distributed programs or content, e.g. vending or licensing of copyrighted material
  • H04N 19/46 - Embedding additional information in the video signal during the compression process
  • H04N 19/20 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation

19.

SOFTWARE DEFINED NETWORK ORCHESTRATION TO MANAGE MEDIA FLOWS FOR BROADCAST WITH PUBLIC CLOUD NETWORKS

      
Application Number 16800853
Status Pending
Filing Date 2020-02-25
First Publication Date 2020-12-24
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Strein, Michael J.
  • Mccormick, Ryan N.
  • Beardsley, Craig L.

Abstract

Software defined network orchestration to manage media flows for broadcast with public cloud networks is provided by identifying a media flow at a media production facility for multicast transmission; registering the media flow to a registration database; migrating the media flow from multicast transmission to unicast transmission; transmitting the media flow to a public cloud network facility; and updating the registration database with a location of the media flow in the public cloud network facility. Once registered, a media flow management system allows any authorized device to request a media flow and, in response, locates the media flow based on a registration database indicating a location of the media flow (whether in the public cloud network facility, on a common carrier, or in a production facility); receives access to the media flow at the location; and allows the authorized device to consume the media flow.

IPC Classes

  • H04L 29/06 - Communication control; Communication processing characterised by a protocol
  • H04L 12/26 - Monitoring arrangements; Testing arrangements

20.

Media Flow Transport Security Management

      
Application Number 16869236
Status Pending
Filing Date 2020-05-07
First Publication Date 2020-12-24
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Strein, Michael J.
  • Mason, Douglas R.
  • Beardsley, Craig L.
  • Kepler, Benjamin H.

Abstract

A media flow transport security manager of a hybrid cloud-based media production system having a network orchestrator and an extensible resource manager (ERM) includes a firewall communicatively coupled to a computing platform having a hardware processor and a memory storing a security software code. The hardware processor executes the security software code to communicate with the network orchestrator to identify multicast production media flow(s) for processing in a cloud-based virtual production environment, and to communicate with the ERM to obtain an identifier of each cloud-based resource used for processing cloud production media flow(s) corresponding to the identified multicast production media flow(s). The hardware processor also executes the security software code to receive an alert that the cloud production media flow(s) have been processed to generate corresponding post-production cloud media flow(s), and to route, using the obtained identifier of the cloud-based resource(s), the post-production cloud media flow(s) through the firewall.

IPC Classes

  • H04L 29/06 - Communication control; Communication processing characterised by a protocol
  • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
  • H04L 12/24 - Arrangements for maintenance or administration

21.

Hybrid Cloud-Based Media Production

      
Application Number 16869137
Status Pending
Filing Date 2020-05-07
First Publication Date 2020-12-24
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Strein, Michael J.
  • Toback, Zachary N.
  • Mccormick, Ryan N.
  • Beardsley, Craig L.
  • Kennedy, Brian M.

Abstract

A hybrid cloud-based media production system includes a facility extension platform having a hardware processor and a memory storing a software code. The hardware processor executes the software code to identify multicast production media flow(s) for processing in a cloud-based virtual production environment, to identify cloud-based resource(s) for processing one or more cloud production media flow(s) corresponding to the multicast production media flow(s), in the cloud-based virtual production environment, and to coordinate provisioning of the cloud-based virtual production environment with the identified cloud-based resource(s). The hardware processor also executes the software code to align, using a cloud permissible timing protocol, the timing of the cloud production media flow(s) in the cloud-based virtual production environment, and to process the cloud production media flow(s) in the cloud-based virtual production environment using the identified cloud-based resource(s).

IPC Classes

  • H04L 29/06 - Communication control; Communication processing characterised by a protocol

22.

Extensible Resource Management for Hybrid Cloud-Based Media Production

      
Application Number 16869203
Status Pending
Filing Date 2020-05-07
First Publication Date 2020-12-24
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Strein, Michael J.
  • Beardsley, Craig L.
  • Mccormick, Ryan N.

Abstract

An extensible resource manager (ERM) of a hybrid cloud-based media production system includes a computing platform having a hardware processor and a memory storing a resource management software code. The hardware processor executes the resource management software code to communicate with a network orchestrator of the hybrid cloud-based media production system to identify multicast production media flow(s) for processing in a cloud-based virtual production environment, to identify cloud-based resource(s) for processing cloud production media flow(s) corresponding to the multicast production media flow(s), in the cloud-based virtual production environment, and to determine whether a license exists for the identified cloud-based resource(s). The hardware processor also executes the resource management software code to obtain the license when the license does not exist, and to provision, after obtaining the license or in response to determining that the license exists, the cloud-based virtual production environment with the identified cloud-based resource(s).

IPC Classes

  • H04L 29/06 - Communication control; Communication processing characterised by a protocol
  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
  • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure
  • H04L 12/911 - Network admission control and resource allocation, e.g. bandwidth allocation or in-call renegotiation

23.

Content editing during broadcast

      
Application Number 16867212
Grant Number 10867634
Status In Force
Filing Date 2020-05-05
First Publication Date 2020-12-15
Grant Date 2020-12-15
Owner Disney Enterprises, Inc. (USA)
Inventor Hirschi, Daniel J.

Abstract

A content editing system includes a computing platform having a hardware processor and a system memory storing a software code. The hardware processor is configured to execute the software code to record a content feed concurrently with its broadcast to produce a recorded content feed, perform a first edit of the recorded content feed during the recording and the broadcast, and begin writing a content file, during the recording and the broadcast, wherein the content file includes a portion of the recorded content feed. The hardware processor is further configured to execute the software code to begin transcoding the content file, after beginning the writing of the content file and during the recording and the broadcast, perform a last edit of the recorded content feed, and complete the writing and the transcoding of the content file after completion of the recording.

IPC Classes

  • G11B 27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
  • H04N 21/6587 - Control parameters, e.g. trick play commands or viewpoint selection
  • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
  • H04N 21/433 - Content storage operation, e.g. storage operation in response to a pause request or caching operations

24.

SYSTEM AND METHOD FOR POLARIZATION AND WAVELENGTH GATED TRANSPARENT DISPLAYS

      
Application Number 16424932
Status Pending
Filing Date 2019-05-29
First Publication Date 2020-12-03
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Smithwick, Quinn
  • Reisner, Samuel J.

Abstract

A transparent display system is provided where broadcast talent (or presenter) can see interactive content, tool palettes, prompts (and the like) as well as their own sketches and annotations, but a viewing audience sees only the broadcast talent and content intended for the viewing audience with the talent's annotation thereof. A transparent scattering screen together with optical filtering or gating of a first optical property of the light (e.g., polarization-based or wavelength-based) is used such that the first property of the light is projected onto the screen so the talent can see the projection, and a camera-side filter blocks the first property of the light so it is not seen by the camera. Simultaneously, the broadcast talent (or presenter) is illuminated by light having properties other than the first property, which allows the talent image to pass through the screen and the camera-side filter, allowing the talent to be seen by the camera. In some embodiments, a transparent “two-sided” display screen allows people on opposite sides of the screen to see each other, as well as independent 2D or 3D content from each person's side of the screen.

IPC Classes

  • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
  • G03B 21/604 - Polarised screens
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 9/31 - Projection devices for colour picture display
  • H04N 21/81 - Monomedia components thereof

25.

SYSTEMS AND METHODS TO FACILITATE INTERACTION BY ONE OR MORE PARTICIPANTS WITH CONTENT PRESENTED ACROSS MULTIPLE DISTINCT PHYSICAL LOCATIONS

      
Application Number 16430089
Status Pending
Filing Date 2019-06-03
First Publication Date 2020-12-03
Owner Disney Enterprises, Inc. (USA)
Inventor Baumbach, Elliott

Abstract

This disclosure presents systems and methods to facilitate interaction by one or more participants with content presented across multiple distinct physical locations. A current distinct physical location of a participant may be determined. In response to determining the current distinct physical location in which the participant is located, operation of one or more content devices physically present in the current distinct physical location may be effectuated.

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk

26.

Method and Device for Fantasy Sports Player Recommendations

      
Application Number 16994200
Status Pending
Filing Date 2020-08-14
First Publication Date 2020-12-03
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Sloan, J. Nathaniel
  • Fishel, David Michael

Abstract

A method and device generates a fantasy sports recommendation. The method includes receiving a plurality of ranking values associated with a sport player, each of the ranking values being generated from a respective source. The method includes assigning a weight value to each of the ranking values, the weight value being associated with the respective source. The method includes generating a recommendation value for the sport player as a function of the ranking values and the corresponding weight values. The method includes receiving a selection value for the sport player. The method includes determining a further weight value for each of the sources as a function of the selection value, the recommendation value, and the weight value for the corresponding source.
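
The abstract describes a concrete aggregation step: a weighted combination of per-source ranking values, followed by re-weighting of sources against the user's selection. The sketch below is illustrative only; the function names, the agreement-based weight update, and the learning rate are assumptions, not details from the filing.

```python
# Illustrative sketch: aggregate per-source ranking values for a player into a
# recommendation, then nudge each source's weight toward sources whose ranking
# agreed with the user's eventual selection. Names, the update rule, and the
# learning rate are assumptions, not details from the filing.
def recommend(rankings: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-source ranking values for one player."""
    total = sum(weights[src] for src in rankings)
    return sum(rankings[src] * weights[src] for src in rankings) / total

def update_weights(rankings, weights, selection_value, recommendation, lr=0.1):
    """Sources closer to the user's selection than the blended value earn more weight."""
    new_weights = {}
    for src, rank in rankings.items():
        gain = abs(selection_value - recommendation) - abs(selection_value - rank)
        new_weights[src] = max(weights[src] + lr * gain, 1e-3)
    return new_weights

rankings = {"site_a": 0.82, "site_b": 0.64, "site_c": 0.71}
weights = {src: 1.0 for src in rankings}
rec = recommend(rankings, weights)
weights = update_weights(rankings, weights, selection_value=0.80, recommendation=rec)
```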

IPC Classes

  • A63F 13/828 - Managing virtual sport teams
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/44 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
  • A63F 13/46 - Computing the game score
  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63F 13/85 - Providing additional services to players
  • A63F 13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

27.

Systems and Methods for Providing Media Content

      
Application Number 16995200
Status Pending
Filing Date 2020-08-17
First Publication Date 2020-12-03
Owner Disney Enterprises, Inc. (USA)
Inventor Arana, Mark

Abstract

The present disclosure provides for systems and methods for delivering and unlocking restricted media content on physical media. The disclosed methods and systems provide restricted media assets on a physical media. The restricted media assets may be ad-sponsored media content. Restrictions on the restricted media assets may be removed by providing an unlock code, either on an online or offline media player. In the ad-sponsored media context, an unlocked version might comprise an ad-free version.

IPC Classes

  • G06Q 20/12 - Payment architectures specially adapted for electronic shopping systems
  • G06F 21/10 - Protecting distributed programs or content, e.g. vending or licensing of copyrighted material

28.

ASPECT RATIO ERROR RECOGNITION

      
Application Number 16416379
Status Pending
Filing Date 2019-05-20
First Publication Date 2020-11-26
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Varis Doggett, Erika Elizabeth
  • Wolak, Anna M.C.
  • Sigal, Leonid

Abstract

Techniques are disclosed for recognizing aspect ratio errors in image frames of a video and reporting the same. An aspect ratio checker application receives a video that includes multiple image frames and identifies aspect ratio changes in those image frames using a first differential of a time series that includes determined positions of the top, bottom, left, and right of content regions within the image frames. In particular, the aspect ratio checker may identify aspect ratio changes based on non-zero points of the first differential, and the aspect ratio checker further determines aspect ratios of content regions within image frames corresponding to the non-zero points of the first differential. In addition, the aspect ratio checker may generate and display a report indicating the determined aspect ratios and image frame ranges associated with those aspect ratios.
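
The abstract describes detecting aspect-ratio changes from the non-zero points of a first differential over a time series of content-region boundaries. The sketch below illustrates that idea with NumPy; the input arrays, the change threshold, and the reporting format are assumptions, not details from the filing.

```python
import numpy as np

# Illustrative sketch: flag frames where the detected content region changes by
# taking a first differential over per-frame boundary positions, then report the
# aspect ratio of each stable segment. Inputs and the zero-threshold are assumed.
def aspect_ratio_report(top, bottom, left, right, eps=0.5):
    top, bottom, left, right = map(np.asarray, (top, bottom, left, right))
    boundaries = np.stack([top, bottom, left, right], axis=1)   # (frames, 4)
    diff = np.abs(np.diff(boundaries, axis=0)).max(axis=1)      # first differential
    change_points = np.flatnonzero(diff > eps) + 1              # its non-zero points
    starts = np.concatenate(([0], change_points))
    ends = np.concatenate((change_points, [len(top)]))
    report = []
    for start, end in zip(starts, ends):
        width = right[start] - left[start]
        height = bottom[start] - top[start]
        report.append((int(start), int(end - 1), round(float(width / height), 3)))
    return report   # list of (first_frame, last_frame, aspect_ratio)

# 16:9 content that switches to a letterboxed ~2.4:1 region mid-clip
print(aspect_ratio_report(top=[0] * 5 + [140] * 5, bottom=[1080] * 5 + [940] * 5,
                          left=[0] * 10, right=[1920] * 10))
```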

IPC Classes

  • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
  • H04N 7/01 - Conversion of standards

29.

LEGGED HIGH-DEXTERITY SELF-BALANCING CAPABLE ROBOT ACTOR

      
Application Number 16421742
Status Pending
Filing Date 2019-05-24
First Publication Date 2020-11-26
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Lavalley, Scott Christopher
  • Thompson, Kyle Robert
  • Hopkins, Michael Anthony
  • Dickinson, Dexter J.
  • Bishop, Jared Edward
  • Rees, Jerry W.
  • Cesare, Kyle Michael

Abstract

A robot actor, or character mobility hardware platform, adapted to unleash or provide a wide variety of characters in the physical world. The robot actor enables the often screen-constrained characters to become life-like, interactive participants with nearby people in ways not presently achievable. The robot actor is an untethered, free-roaming robot that has two (or more) legs, is adapted for high dexterity, is controlled and designed to be self-balancing, and, due to this combination of characteristics, can provide characters with an illusion of life and, in many cases, in correct proportion and scale. The hardware and software of the robot actor will become a new generation of animatronic figures by providing a hardware platform capable of continuously evolving to become more capable through advances in controls and artificial intelligence (AI).

IPC Classes

  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted feet or skid
  • B25J 11/00 - Manipulators not otherwise provided for

30.

TECHNIQUES FOR CONCEALED VEHICLE RESET

      
Application Number 16901475
Status Pending
Filing Date 2020-06-15
First Publication Date 2020-11-26
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Kalama, Asa K.
  • Trowbridge, Robert S.

Abstract

Embodiments disclosed herein include an amusement park ride. The amusement park ride includes a ride vehicle that can transition from a first configuration to a second configuration during the duration of the ride. The amusement park ride can include a loading area for loading passengers into the ride vehicle and a separate unloading area for disembarking passengers from the ride vehicle. After disembarking passengers from the ride vehicle, the ride vehicle travels to a transition area concealed from the public where the ride transitions from the second configuration to the first configuration. In the transition area, maintenance can be performed on the ride vehicle and calibration can be performed on one or more of the ride vehicle systems (e.g., projector or audio systems).

IPC Classes

  • A63G 4/00 - Accessories for roundabouts not restricted to one of groups or
  • G06F 3/16 - Sound input; Sound output
  • A63G 1/10 - Roundabouts power-driven electrically driven

31.

Automated Image Synthesis Using a Comb Neural Network Architecture

      
Application Number 16447768
Status Pending
Filing Date 2019-06-20
First Publication Date 2020-11-26
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Naruniec, Jacek
  • Weber, Romann
  • Schroers, Christopher

Abstract

An image synthesis system includes a computing platform having a hardware processor and a system memory storing a software code including a neural encoder and multiple neural decoders each corresponding to a respective persona. The hardware processor executes the software code to receive target image data, and source data that identifies one of the personas, and to map the target image data to its latent space representation using the neural encoder. The software code further identifies one of the neural decoders for decoding the latent space representation of the target image data based on the persona identified by the source data, uses the identified neural decoder to decode the latent space representation of the target image data as the persona identified by the source data to produce swapped image data, and blends the swapped image data with the target image data to produce one or more synthesized images.
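
The abstract describes a shared neural encoder feeding one decoder per persona, with the decoder selected from the source data and its output blended back into the target image. The sketch below shows only that wiring with placeholder networks; module sizes, the blend factor, and names are assumptions, not the filed architecture.

```python
import torch
import torch.nn as nn

# Illustrative sketch of the comb-shaped wiring described above: one shared
# encoder, one decoder per persona, the decoder chosen by the source data and
# its output blended back into the target frame. Placeholder networks only.
encoder = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
decoders = nn.ModuleDict({
    "persona_a": nn.Conv2d(16, 3, 3, padding=1),
    "persona_b": nn.Conv2d(16, 3, 3, padding=1),
})

def swap(target: torch.Tensor, persona: str, blend: float = 0.8) -> torch.Tensor:
    latent = encoder(target)                         # latent space representation
    swapped = torch.sigmoid(decoders[persona](latent))
    return blend * swapped + (1.0 - blend) * target  # blend swapped and target data

synthesized = swap(torch.rand(1, 3, 64, 64), "persona_a")
```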

IPC Classes

  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G06T 7/32 - Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 3/08 - Learning methods

32.

Guided interactions with books

      
Application Number 16410884
Grant Number 10850197
Status In Force
Filing Date 2019-05-13
First Publication Date 2020-11-19
Grant Date 2020-12-01
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Hellam, Taylor
  • Murdock, Malcolm
  • Poswal, Mohammad
  • Stewart, Shawnna

Abstract

A system for providing guided interactions with books includes a content delivery terminal communicatively coupled to a computing platform including a hardware processor and a memory storing a content delivery software code, a content registry, and a library of entries corresponding respectively to multiple interaction plans for guiding an interaction with a book. The hardware processor executes the content delivery software code to detect, via the content delivery terminal, a book corresponding to content included in the content registry, and to identify an interaction plan for the guided interaction based on one or more of the content and a user input received by the content delivery terminal. The content delivery software code further identifies a first portion of the content for use in initiating the interaction plan, and outputs the first portion of the content to the content delivery terminal for printing by the content delivery terminal in the book.

IPC Classes

  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/47 - Controlling the progress of the video game involving branching, e.g. choosing one of several possible scenarios at a given point in time
  • A63F 13/48 - Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
  • G06K 19/06 - Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
  • A63F 13/80 - Special adaptations for executing a specific game genre or game mode
  • B42D 1/00 - Books or other bound products

33.

Content Adaptive Optimization for Neural Data Compression

      
Application Number 16413414
Status Pending
Filing Date 2019-05-15
First Publication Date 2020-11-19
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Schroers, Christopher
  • Meierhans, Simon
  • Campos, Joaquim
  • Mcphillen, Jared
  • Djelouah, Abdelaziz
  • Doggett, Erika Varis
  • Labrozzi, Scott
  • Xue, Yuanyi

Abstract

A data processing system includes a computing platform having a hardware processor and a memory storing a data compression software code. The hardware processor executes the data compression software code to receive a series of compression input data and encode a first compression input data of the series to a latent space representation of the first compression input data. The data compression software code further decodes the latent space representation to produce an input space representation of the first compression input data corresponding to the latent space representation, and generates refined latent values for re-encoding the first compression input data based on a comparison of the first compression input data with its input space representation. The data compression software code then re-encodes the first compression input data using the refined latent values to produce a first compressed data corresponding to the first compression input data.
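
The abstract describes encoding an input to a latent representation, decoding it back, comparing the reconstruction with the input, and refining the latent values before re-encoding. The sketch below illustrates one common way to realize such a refinement loop (gradient descent on a reconstruction loss); the encoder, decoder, loss, and step count are assumptions, not the method claimed in the filing.

```python
import torch

# Illustrative sketch: refine the latent values for one input by gradient
# descent on a reconstruction loss, then hand the refined latents back for
# re-encoding. `encoder` and `decoder` stand in for pretrained compression
# networks; the MSE loss and step count are assumptions.
def refine_latents(x, encoder, decoder, steps=50, lr=1e-2):
    latents = encoder(x).detach().requires_grad_(True)    # initial latent representation
    opt = torch.optim.Adam([latents], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        recon = decoder(latents)                           # input space representation
        torch.nn.functional.mse_loss(recon, x).backward()  # compare with the input
        opt.step()                                         # refine the latent values
    return latents.detach()

# toy stand-ins for a learned compressor over 16-dimensional signals
enc, dec = torch.nn.Linear(16, 4), torch.nn.Linear(4, 16)
refined = refine_latents(torch.rand(8, 16), enc, dec)
```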

IPC Classes

  • H04N 19/42 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals - characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
  • G06N 3/02 - Computer systems based on biological models using neural network models
  • G06N 7/00 - Computer systems based on specific mathematical models
  • H04N 19/513 - Processing of motion vectors
  • H04N 19/186 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component

34.

SYSTEMS AND METHODS FOR INTERACTIVE RESPONSES BY TOYS AND OTHER CONNECTED DEVICES

      
Application Number 16243542
Status Pending
Filing Date 2019-01-09
First Publication Date 2020-11-12
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Panec, Timothy M.
  • Thornton, Stephen A.

Abstract

The present disclosure may be embodied in systems, methods, and computer readable media, and may allow for interactive responses by network-enabled objects, programs, and machines. Embodiments described are well-suited for communicating and responding using small-sized data messages, thus allowing for their implementation in standard messaging systems and by simple devices such as toys and other low-level electronic devices that may have limited processing capacity and/or memory. The present disclosure provides in one embodiment a method comprising receiving at least one broadcast message, each broadcast message in the at least one broadcast message comprising a plurality of identifiers and determining a highest priority identifier amongst the plurality of identifiers received in the at least one broadcast message. The method may further comprise identifying a command sequence associated with the highest priority identifier and executing the command sequence.
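
The abstract describes receiving broadcast messages carrying identifiers, selecting the highest-priority identifier, and executing its associated command sequence. The sketch below illustrates that selection logic only; the priority table and command registry are assumptions for illustration.

```python
# Illustrative sketch: collect identifiers from received broadcast messages,
# pick the highest-priority one, and run its command sequence. The priority
# table and command registry are assumptions for illustration.
PRIORITY = {"show_start": 3, "character_nearby": 2, "ambient_loop": 1}
COMMANDS = {
    "show_start": ["lights_pulse", "play_fanfare"],
    "character_nearby": ["wave_arm"],
    "ambient_loop": ["idle_breathe"],
}

def handle_broadcasts(messages: list[list[str]]) -> list[str]:
    """Each message is a list of identifiers; returns the executed sequence."""
    seen = [ident for msg in messages for ident in msg if ident in PRIORITY]
    if not seen:
        return []
    top = max(seen, key=PRIORITY.get)      # highest-priority identifier received
    for command in COMMANDS[top]:
        print(f"executing {command}")      # stand-in for actuating the toy/device
    return COMMANDS[top]

handle_broadcasts([["ambient_loop", "character_nearby"], ["ambient_loop"]])
```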

IPC Classes

  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
  • H04W 4/06 - Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • G08C 17/02 - Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link

35.

LEARNING-BASED SAMPLING FOR IMAGE MATTING

      
Application Number 16408199
Status Pending
Filing Date 2019-05-09
First Publication Date 2020-11-12
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Aydin, Tunc Ozan
  • Öztireli, Ahmet Cengiz
  • Tang, Jingwei
  • Aksoy, Yagiz

Abstract

Techniques are disclosed for image matting. In particular, embodiments decompose the matting problem of estimating foreground opacity into the targeted subproblems of estimating a background using a first trained neural network, estimating a foreground using a second neural network and the estimated background as one of the inputs into the second neural network, and estimating an alpha matte using a third neural network and the estimated background and foreground as two of the inputs into the third neural network. Such a decomposition is in contrast to traditional sampling-based matting approaches that estimated foreground and background color pairs together directly for each pixel. By decomposing the matting problem into subproblems that are easier for a neural network to learn compared to traditional data-driven techniques for image matting, embodiments disclosed herein can produce better opacity estimates than such data-driven techniques as well as sampling-based and affinity-based matting approaches.
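
The abstract decomposes matting into estimating the background, then the foreground given the background, then the alpha matte given both. The sketch below shows only that chaining of inputs with placeholder convolutional networks; the architectures and channel layout are assumptions, not those of the filing.

```python
import torch
import torch.nn as nn

# Illustrative sketch of the three-stage decomposition described above. The
# networks are small placeholder conv stacks; only the chaining of inputs
# (background -> foreground -> alpha) is the point.
def small_cnn(in_ch, out_ch):
    return nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(32, out_ch, 3, padding=1))

bg_net    = small_cnn(4, 3)    # image + trimap              -> estimated background
fg_net    = small_cnn(7, 3)    # image + trimap + background -> estimated foreground
alpha_net = small_cnn(10, 1)   # image + trimap + bg + fg    -> alpha matte

def matte(image, trimap):
    bg = bg_net(torch.cat([image, trimap], dim=1))
    fg = fg_net(torch.cat([image, trimap, bg], dim=1))
    alpha = torch.sigmoid(alpha_net(torch.cat([image, trimap, bg, fg], dim=1)))
    return fg, bg, alpha

fg, bg, alpha = matte(torch.rand(1, 3, 64, 64), torch.rand(1, 1, 64, 64))
```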

IPC Classes

  • G06T 7/90 - Determination of colour characteristics
  • G06N 3/08 - Learning methods
  • G06N 20/00 - Machine learning
  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06T 3/40 - Scaling of a whole image or part thereof

36.

Selective audio visual synchronization for multiple displays

      
Application Number 16601434
Grant Number 10834298
Status In Force
Filing Date 2019-10-14
First Publication Date 2020-11-10
Grant Date 2020-11-10
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor Koetter, Brent T.

Abstract

The present disclosure generally relates to synchronization between multiple displays for audio and/or visual content. Video, movie, television, live broadcast, streaming or online content typically include visual content and corresponding audio content synchronized to the visual content, i.e., a particular audio frame is set to be played back at the same time a particular video frame is displayed. The present disclosure provides for delaying the presentation of visual content with respect to one or more displays in order to synchronize the presentation of the visual content on the displays.
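
The abstract describes delaying visual presentation on some displays so that all displays present a frame at the same time. The sketch below computes per-display delays from measured latencies; the latency values and function name are assumptions for illustration.

```python
# Illustrative sketch: compute how long to delay the feed to each display so
# every screen presents a given frame at the same wall-clock time. The measured
# latencies (milliseconds) and names are assumed inputs.
def sync_delays(display_latency_ms: dict[str, float]) -> dict[str, float]:
    slowest = max(display_latency_ms.values())
    # Faster displays wait for the slowest one; the slowest gets zero delay.
    return {name: slowest - latency for name, latency in display_latency_ms.items()}

print(sync_delays({"lobby_wall": 120.0, "bar_tv": 45.0, "booth_monitor": 80.0}))
# {'lobby_wall': 0.0, 'bar_tv': 75.0, 'booth_monitor': 40.0}
```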

IPC Classes

  • H04N 5/932 - Regeneration of analogue synchronisation signals
  • H04N 5/04 - Synchronising
  • H04N 5/445 - Receiver circuitry for displaying additional information
  • G06F 3/0482 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance interaction with lists of selectable items, e.g. menus

37.

Systems and method for dynamic content unlock and adaptive control

      
Application Number 16147539
Grant Number 10828572
Status In Force
Filing Date 2018-09-28
First Publication Date 2020-11-10
Grant Date 2020-11-10
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Kalama, Asa K.
  • Trowbridge, Robert Scott
  • King, Jacqueline E.
  • Huebner, Robert E.
  • Stepniewicz, Peter

Abstract

Systems and methods for dynamic modification of an amusement ride are disclosed herein. The system can include a simulation vehicle including a plurality of controls and at least one interface, which simulation vehicle can transit at least one passenger through a ride experience from a starting position to a terminating position. The system can include a content presentation system, and a processor. The processor can: provide content to the at least one passenger; identify a user skill level based on a plurality of user inputs received by at least some of the plurality of controls of the simulation vehicle; identify a modification to a difficulty of the ride experience based in part on the identified user skill level; and modify the difficulty of the ride experience according to the identified modification.

IPC Classes  ?

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63G 31/00 - Amusement arrangements

38.

Interactive toy

      
Application Number 16421223
Grant Number 10828573
Status In Force
Filing Date 2019-05-23
First Publication Date 2020-11-10
Grant Date 2020-11-10
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Kalama, Asa K.
  • Trowbridge, Robert S.
  • Stepniewicz, Peter
  • Ging, Casey M.
  • Hampton, John C.

Abstract

Systems, methods, and devices disclosed herein relate to an improved interactive experience. The system can be for delivery of an interactive attraction experience. The system can include a passenger vehicle having a plurality of passenger locations and a content presentation system that can present a virtual portion of the attraction experience viewable from the plurality of passenger locations. The system can include a transceiver that can transmit a trigger signal and at least one processor. The at least one processor can control delivery of the virtual portion of the attraction experience via the content presentation system, detect presence of a non-ride device, and deliver a trigger signal via the transceiver to the non-ride device, the trigger signal linked with the attraction experience.

IPC Classes  ?

  • A63G 31/16 - Amusement arrangements creating illusions of travel
  • A63F 13/825 - Fostering virtual characters
  • A63F 13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells

39.

HIGH SPEED BINARY COMPRESSIVE LIGHT FIELD PROJECTION SYSTEM

      
Application Number 16402007
Status Pending
Filing Date 2019-05-02
First Publication Date 2020-11-05
Owner Disney Enterprises, Inc. (USA)
Inventor Smithwick, Quinn Yorklun Jen

Abstract

Implementations of a compressive light field projection system are disclosed herein. In one embodiment, the compressive light field projection system utilizes a pair of light modulators, such as digital micromirror devices (DMDs), that interact to produce a light field. The light field is then projected via a projection lens onto a screen, which may be an angle-expanding projection screen that includes a Fresnel lens for straightening the views of the light field and either a double lenticular array of Keplerian lens pairs or a single lenticular array for increasing the field of view. In addition, compression techniques are disclosed for generating patterns to place on the pair of light modulators so as to reduce the number of frames needed to recreate a light field.
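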

IPC Classes  ?

  • H04N 13/351 - Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G02B 5/02 - Diffusing elements; Afocal elements
  • G03B 35/22 - Stereoscopic photography by simultaneous viewing using single projector with stereoscopic-base-defining system
  • G03B 21/20 - Lamp housings
  • G03B 21/625 - Lenticular translucent screens
  • H04N 13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
  • H04N 9/31 - Projection devices for colour picture display
  • H04N 13/363 - Image reproducers using image projection screens
  • H04N 13/365 - Image reproducers using digital micromirror devices [DMD]
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • H04N 13/133 - Equalising the characteristics of different image components, e.g. their average brightness or colour balance

40.

Coordination of Media Content Delivery to Multiple Media Players

      
Application Number 16933620
Status Pending
Filing Date 2020-07-20
First Publication Date 2020-11-05
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Comito, Keith
  • Lefler, Nathan
  • Corrigan-Colville, James A.

Abstract

A system for synchronizing media content playout includes a computing platform having a hardware processor and a system memory storing a software code. The hardware processor executes the software code to receive a first state message from a first media player playing a first media content and a second state message from a second media player playing a second media content, the first media content and the second media content being the same media content. The software code further determines a coordination state for playout of the first media content and the second media content based on one or more of the first and second state messages, and transmits a first coordination message including the coordination state to the first media player and a second coordination message including the coordination state to the second media player to synchronize playout of the first media content and the second media content.
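
A minimal sketch of the coordination idea under assumed message fields (player id, playhead position, send time): each player's playhead is extrapolated to a common clock and the furthest-ahead position is returned to both players as the coordination state.

    # Minimal sketch, not the disclosed system: a coordinator that receives playhead state
    # messages from two players and returns a common coordination state (a target playhead).
    import time
    from dataclasses import dataclass

    @dataclass
    class StateMessage:
        player_id: str
        playhead_s: float      # current position in the media, in seconds
        sent_at: float         # wall-clock time the message was sent

    def coordination_state(states, now=None):
        """Extrapolate each playhead to 'now' and pick the furthest-ahead position as the target."""
        now = time.time() if now is None else now
        extrapolated = {s.player_id: s.playhead_s + (now - s.sent_at) for s in states}
        target = max(extrapolated.values())
        # Each player receives the same target and the correction it needs to apply.
        return {pid: {"target_playhead_s": target, "correction_s": target - pos}
                for pid, pos in extrapolated.items()}

    t0 = 1_000.0
    msgs = [StateMessage("living_room", 61.0, t0), StateMessage("kitchen", 60.2, t0)]
    print(coordination_state(msgs, now=t0 + 0.5))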

IPC Classes  ?

  • H04N 21/6543 - Transmission by server directed to the client for forcing some client operations, e.g. recording
  • H04N 21/239 - Interfacing the upstream path of the transmission network, e.g. prioritizing client requests
  • H04N 21/647 - Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load or bridging bet
  • H04N 21/8547 - Content authoring involving timestamps for synchronizing content
  • H04N 21/2387 - Stream processing in response to a playback request from an end-user, e.g. for trick-play
  • H04N 21/242 - Synchronization processes, e.g. processing of PCR [Program Clock References]
  • H04N 21/437 - Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronizing decoder's clock; Client middleware
  • H04N 21/63 - Control signaling between client, server and network components; Network processes for video distribution between server and clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing

41.

SOFT-REAL-TIME HUB PROVIDING DATA TRANSPORT FOR PROCESSOR-IN-THE-LOOP (PIL) SIMULATIONS

      
Application Number 16398599
Status Pending
Filing Date 2019-04-30
First Publication Date 2020-11-05
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Hofer, Christopher Carl
  • Marra, Iii, Robert Joseph
  • Milluzzi, Andrew Jesse
  • Corpuz, Jose Lugos

Abstract

A software-based (“soft”) real-time hub is designed and implemented for use in simulation (or control testing) systems, such as to provide a modular soft-real-time PIL. A simulation system of the present description typically may include one or more of the following useful subsystems or components: (a) a soft-real-time hub; (b) simulation interfaces; and (c) hardware emulation subsystems/devices. The soft-real-time hub is typically a combination of hardware and software adapted to provide deterministic data transport between simulations and input/output (I/O) emulation. By creating a common point, the hub enables simulation modules to be swapped out as the simulation system progresses without the operator having to worry about interface timing, forcing, or data visualization. A desirable aspect of the simulation system is that it allows for testing certain conditions by forcing I/O and then observing how the controller or system under test responds.
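
A minimal sketch of a hub of this general shape, with an assumed publish/subscribe interface and signal names that are purely illustrative; the forcing hook is what lets a tester override I/O and observe the controller's response.

    # Minimal sketch (assumed design, not the described hub): a tiny in-process hub that
    # routes named signals between simulation modules and lets a tester force I/O values.
    class SoftRealTimeHub:
        def __init__(self):
            self.values = {}       # latest value per signal name
            self.forced = {}       # signals currently overridden by the tester
            self.subscribers = {}  # signal name -> list of callbacks

        def publish(self, name, value):
            self.values[name] = value
            effective = self.forced.get(name, value)
            for callback in self.subscribers.get(name, []):
                callback(effective)

        def subscribe(self, name, callback):
            self.subscribers.setdefault(name, []).append(callback)

        def force(self, name, value):
            """Override a signal to test how the controller under test responds."""
            self.forced[name] = value

        def release(self, name):
            self.forced.pop(name, None)

    hub = SoftRealTimeHub()
    hub.subscribe("ride.speed", lambda v: print("controller sees speed =", v))
    hub.publish("ride.speed", 3.0)   # controller sees speed = 3.0
    hub.force("ride.speed", 0.0)     # simulate a fault
    hub.publish("ride.speed", 3.1)   # controller sees speed = 0.0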

IPC Classes  ?

  • G06F 17/50 - Computer-aided design
  • G05B 19/05 - Programmable logic controllers, e.g. simulating logic interconnections of signals according to ladder diagrams or function charts

42.

ILLUMINATION-BASED SYSTEM FOR DISTRIBUTING IMMERSIVE EXPERIENCE CONTENT IN A MULTI-USER ENVIRONMENT

      
Application Number 16402106
Status Pending
Filing Date 2019-05-02
First Publication Date 2020-11-05
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Chapman, Steven
  • Popp, Joseph
  • Taylor, Alice
  • Hager, Joseph

Abstract

An immersive experience system is provided. The immersive experience system has a processor that determines a position of a first head-mounted display. Further, the processor determines a position of a second head-mounted display. The processor also generates a first image for a first immersive experience corresponding to the position of the first head-mounted display. Moreover, the processor encodes the first image into a first infrared spectrum illumination having a first wavelength. In addition, the processor generates a second image for a second immersive experience corresponding to the position of the second head-mounted display. Finally, the processor encodes the second image into a second infrared spectrum illumination having a second wavelength. The first wavelength is distinct from the second wavelength.

IPC Classes  ?

  • H04N 13/194 - Transmission of image signals
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/167 - Synchronising or controlling image signals
  • H04N 13/368 - Image reproducers using viewer tracking for two or more viewers
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • H04B 10/114 - Indoor or close-range type systems

43.

Stability Controlled Systems

      
Application Number 16392410
Status Pending
Filing Date 2019-04-23
First Publication Date 2020-10-29
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Wong, Clifford
  • Nocon, Nathan D.

Abstract

A stability controlled system includes a computing platform having a hardware processor, a memory storing a software code, a moveable component, and a tilt sensor. The hardware processor executes the software code to monitor the tilt sensor to determine whether the system is at a tilt with respect to a support surface for the system. When the tilt sensor is sensing the tilt with respect to the support surface: when the moveable component is off, the software code prevents the moveable component from turning on, and when the moveable component is on, the software code performs one of (a) turning off the moveable component, and (b) slowing down a regular rate of motion of the moveable component. When the tilt sensor is not sensing the tilt with respect to the support surface, the software code permits the moveable component to be turned on and have the regular rate of motion.
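
A minimal sketch of the tilt-gating logic, with an assumed threshold and slow-down factor standing in for whatever the actual system uses.

    # Minimal sketch of the tilt-gating idea (hypothetical thresholds and interfaces).
    def motor_command(tilt_deg, motor_on, regular_rate, tilt_threshold_deg=10.0, slow_factor=0.25):
        """Return (allow_on, rate) for the moveable component given the sensed tilt."""
        tilted = abs(tilt_deg) > tilt_threshold_deg
        if not tilted:
            return True, regular_rate            # normal operation permitted
        if not motor_on:
            return False, 0.0                    # prevent turning on while tilted
        return True, regular_rate * slow_factor  # already on: slow down (or return 0.0 to turn off)

    print(motor_command(tilt_deg=2.0, motor_on=False, regular_rate=60.0))   # (True, 60.0)
    print(motor_command(tilt_deg=15.0, motor_on=True, regular_rate=60.0))   # (True, 15.0)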

IPC Classes  ?

  • G05D 15/00 - Control of mechanical force or stress; Control of mechanical pressure
  • G05B 15/02 - Systems controlled by a computer electric

44.

SYSTEMS AND METHODS TO SYNCHRONIZE REAL-WORLD MOTION OF PHYSICAL OBJECTS WITH PRESENTATION OF VIRTUAL CONTENT

      
Application Number 16393781
Status Pending
Filing Date 2019-04-24
First Publication Date 2020-10-29
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Panec, Timothy M.
  • Rosenthal, Janice
  • Gibson, Hunter J.
  • Nocon, Nathan D.
  • Thornton, Stephen A.

Abstract

This disclosure presents systems and methods to synchronize real-world motion of physical objects with presentation of virtual content. Individual physical objects may be detected and/or identified based on image information defining one or more images of a real-world environment. Individual network connections may be established between individual computing platforms and individual physical objects. A network connection may facilitate a synchronization of a presentation of virtual content on a computing platform with motion of one or more physical objects in the real-world environment.

IPC Classes  ?

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • A63F 13/335 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens

45.

Toy with build-time effects

      
Application Number 16659767
Grant Number 10814242
Status In Force
Filing Date 2019-10-22
First Publication Date 2020-10-27
Grant Date 2020-10-27
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Hampton, John C.
  • Wilde, Alex Christopher

Abstract

A design for build-your-own (BYO) toys in which each build participant is provided a toy body or base part and asked to select a data key for their toy. The key has associated with it a set of build-time functions or special effects as well as post build-time functions. The toy body includes a controller that is operable to sense or detect when the key is properly installed or attached to the toy body (e.g., in a key receptacle or interface) and, in response, to read an identifier (ID) (e.g., a static code as may be provided in an RFID tag/chip). The controller may then operate onboard functional elements such as lights and a sound system to provide functions or special effects linked to that key's ID. Each key type may have different special effects associated with it for use during build time and during post build time.

IPC Classes  ?

  • A63H 33/26 - Magnetic or electric toys
  • A63H 33/22 - Optical, colour, or shadow toys
  • A63H 5/00 - Musical or noise-producing devices for additional toy effects other than acoustical

46.

Pose Estimation and Body Tracking Using an Artificial Neural Network

      
Application Number 16386173
Status Pending
Filing Date 2019-04-16
First Publication Date 2020-10-22
Owner
  • Disney Enterprises, Inc. (USA)
  • ETH Zürich (Switzerland)
Inventor
  • Öztireli, Ahmet Cengiz
  • Chandran, Prashanth
  • Gross, Markus

Abstract

According to one implementation, a pose estimation and body tracking system includes a computing platform having a hardware processor and a system memory storing a software code including a tracking module trained to track motions. The software code receives a series of images of motion by a subject, and for each image, uses the tracking module to determine locations corresponding respectively to two-dimensional (2D) skeletal landmarks of the subject based on constraints imposed by features of a hierarchical skeleton model intersecting at each 2D skeletal landmark. The software code further uses the tracking module to infer joint angles of the subject based on the locations and determine a three-dimensional (3D) pose of the subject based on the locations and the joint angles, resulting in a series of 3D poses. The software code outputs a tracking image corresponding to the motion by the subject based on the series of 3D poses.
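
One small geometric piece of such a pipeline, sketched with assumed landmark coordinates (the trained tracking module itself is not reproduced here), is deriving a joint angle from three 2D skeletal landmarks.

    # Minimal sketch (not the trained tracking module): derive a joint angle from three
    # 2D skeletal landmarks, the kind of constraint a hierarchical skeleton imposes.
    import numpy as np

    def joint_angle(parent_xy, joint_xy, child_xy):
        """Interior angle at joint_xy formed by the bones to its parent and child landmarks."""
        a = np.asarray(parent_xy, float) - np.asarray(joint_xy, float)
        b = np.asarray(child_xy, float) - np.asarray(joint_xy, float)
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Hypothetical shoulder-elbow-wrist landmarks in image coordinates.
    print(round(joint_angle((100, 200), (150, 250), (210, 250)), 1))  # ~135.0 degrees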

IPC Classes  ?

  • G06T 7/20 - Analysis of motion
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06N 3/08 - Learning methods
  • G06N 20/00 - Machine learning

47.

ANIMATION STREAMING FOR MEDIA INTERACTION

      
Application Number 16391137
Status Pending
Filing Date 2019-04-22
First Publication Date 2020-10-22
Owner Disney Enterprises, Inc. (USA)
Inventor Mitchell, Kenneth J.

Abstract

Embodiments provide for animation streaming for media interaction by receiving, at a generator, user inputs from a target device presenting a virtual environment; updating, based on the user inputs, a model of the virtual environment; determining network conditions between the generator and the target device; generating a packet that includes a forecasted animation set for a virtual object in the updated model, the set comprising rig updates for the virtual object for at least two different states, where the number of states included in the packet is based on the network conditions; and streaming the packet to the target device, where the target device: receives a second input to interact with the virtual environment that changes the virtual environment to a given state; selects and applies a rig update associated with the given state to a local model of the virtual object; and outputs the updated local model on the target device.
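
A minimal sketch of the forecast-and-select pattern, with assumed bandwidth thresholds and a toy rig-update layout (not the packet format of the disclosure): the generator includes more forecasted states when the network is good, and the client applies whichever state actually occurred.

    # Minimal sketch of the forecast-and-select idea (hypothetical data layout).
    def build_packet(rig_updates_by_state, bandwidth_kbps):
        """Include more forecasted states when the network is good, fewer when it is poor."""
        if bandwidth_kbps > 5000:
            n_states = 4
        elif bandwidth_kbps > 1000:
            n_states = 2
        else:
            n_states = 1
        states = list(rig_updates_by_state)[:n_states]
        return {state: rig_updates_by_state[state] for state in states}

    def apply_on_client(packet, observed_state, local_rig):
        """Pick the forecasted rig update matching what the player actually did."""
        local_rig.update(packet.get(observed_state, {}))
        return local_rig

    forecast = {"idle": {"arm": 0}, "wave": {"arm": 45}, "point": {"arm": 90}, "run": {"arm": 10}}
    packet = build_packet(forecast, bandwidth_kbps=1500)   # carries "idle" and "wave"
    print(apply_on_client(packet, "wave", {"arm": 0}))     # {'arm': 45}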

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
  • A63F 13/358 - Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients

48.

SYSTEM AND METHOD OF GENERATING EFFECTS DURING LIVE RECITATIONS OF STORIES

      
Application Number 16904550
Status Pending
Filing Date 2020-06-17
First Publication Date 2020-10-08
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Hellam, Taylor
  • Murdock, Malcolm E.
  • Poswal, Mohammad
  • Peck, Nicolas

Abstract

One aspect of this disclosure relates to presentation of a first effect on one or more presentation devices during an oral recitation of a first story. The first effect is associated with a first trigger point, first content, and/or the first story. The first trigger point is one or more specific syllables from a word and/or phrase in the first story. A first transmission point associated with the first effect can be determined based on a latency of a presentation device and a user speaking profile. The first transmission point is one or more specific syllables from a word and/or phrase before the first trigger point in the first story. Control signals carrying instructions to present the first content at the first trigger point are transmitted to the presentation device when a user recites the first transmission point, such that the first content is presented at the first trigger point.
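
A minimal sketch of how a transmission point could be backed off from a trigger point; the latency and speaking-rate numbers shown are illustrative only.

    # Minimal sketch: back off from the trigger syllable by enough syllables to hide the
    # presentation device's latency, given the user's measured speaking rate (assumed numbers).
    import math

    def transmission_offset_syllables(device_latency_s, syllables_per_second):
        """How many syllables before the trigger point the control signal must be sent."""
        return math.ceil(device_latency_s * syllables_per_second)

    # A lamp that takes 0.6 s to react, read aloud by a user who averages 4.2 syllables/s:
    print(transmission_offset_syllables(0.6, 4.2))  # send the signal 3 syllables early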

IPC Classes  ?

  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 15/18 - Speech classification or search using natural language modelling
  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit

49.

Automated Determination of Expressions for an Interactive Social Agent

      
Application Number 16374478
Status Pending
Filing Date 2019-04-03
First Publication Date 2020-10-08
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Kennedy, James R.
  • Lehman, Jill Fain
  • El Haddad, Kevin
  • Fidaleo, Douglas A.

Abstract

A system providing an interactive social agent can include a computing platform having a hardware processor and a memory storing a training content standardization software code configured to receive content depicting human expressions and including annotation data describing the human expressions from multiple content annotation sources, generate a corresponding content descriptor for each content annotation source to translate the annotation data into a standardized data format, and transform the annotation data into the standardized data format using the corresponding content descriptor. The content and the annotation data in the standardized format are stored as training data for use in training expressions for the interactive social agent. The memory may also store a character remapping software code configured to receive data identifying an expression for the interactive social agent, identify a character persona of the interactive social agent, and determine a modified expression based on expressive idiosyncrasies of the character persona.

IPC Classes  ?

50.

Systems and methods for enhancing accuracy of spatial location and rotational orientation determination of wearable head-mounted display device

      
Application Number 16376994
Grant Number 10816813
Status In Force
Filing Date 2019-04-05
First Publication Date 2020-10-08
Grant Date 2020-10-27
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Goslin, Michael P.
  • Panec, Timothy M.
  • Drake, Corey D.
  • Yeung, Jason

Abstract

Systems and methods for enhancing the accuracy of spatial location and rotational orientation determination of a wearable head-mounted display device while in a motion simulating vehicle are disclosed. Exemplary implementations may: generate output signals conveying vehicle information; generate output signals conveying user information of a user; obtain presentation information; determine, based on the user information and the vehicle information, spatial location and rotational orientation of the wearable head-mounted display device with respect to a reference frame such that accuracy of the determination is enhanced with respect to only using the user information; determine a view of the virtual space that corresponds to the spatial location and the rotational orientation of the wearable head-mounted display device determined; and effectuate, via the wearable head-mounted display device, presentation of the view of the virtual space.

IPC Classes  ?

51.

PERSONALIZED STYLIZED AVATARS

      
Application Number 16363640
Status Pending
Filing Date 2019-03-25
First Publication Date 2020-10-01
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Comploi, Dumene
  • Gonzalez, Francisco E.

Abstract

The present disclosure is related to a method to generate user representative avatars that fit within a design paradigm. The method includes receiving depth information corresponding to multiple user features of the user, determining one or more feature landmarks for the user based on the depth information, utilizing the one or more feature landmarks to classify a first user feature relative to an avatar feature category, selecting a first avatar feature from the avatar feature category based on the classification of the first user feature, combining the first avatar feature within an avatar representation to generate a user avatar, and outputting the user avatar for display.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06T 7/50 - Depth or shape recovery

52.

SYSTEMS AND METHODS FOR GAME PROFILE DEVELOPMENT BASED ON VIRTUAL AND/OR REAL ACTIVITIES

      
Application Number 16366456
Status Pending
Filing Date 2019-03-27
First Publication Date 2020-10-01
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Goslin, Michael P.
  • Hsu, Jonathan R.
  • Drake, Corey D.
  • Medrano, Tritia

Abstract

Systems and methods to facilitate game profile development based on virtual and real activities of users are described herein. A user may perform virtual and/or real activities in order to develop a game profile so that the user can play one or more games where the game profile may be implemented. Users may be motivated to perform the virtual and/or real activities to develop the game profiles and to show off and/or test their developed game profiles. The one or more games where the game profile may be implemented may be specific to one or more geographic locations.

IPC Classes  ?

  • A63F 13/798 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/69 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • A63F 13/822 - Strategy games; Role-playing games 

53.

DISPLAY SYSTEM FOR PRODUCING DAYLIGHT-VISIBLE HOLOGRAPHIC OR FLOATING 3D IMAGERY

      
Application Number 16371266
Status Pending
Filing Date 2019-04-01
First Publication Date 2020-10-01
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Joseph, Daniel M.
  • Klouda, Jessica Anne
  • Matz, Donald Leo
  • Ellis, Jacob A.

Abstract

A system for displaying three dimensional (3D) images. The system includes a 3D display operating in a first state to display a 3D image by outputting light into a viewing space and operating in a second state in which the 3D image is not displayed. The system further includes a screen element positioned between the 3D display and the viewing space. The screen element reflects light from the viewing space to appear opaque to a viewer in the viewing space when the 3D display operates in the second state. The screen element transmits the light output by the 3D display, whereby the 3D display image is perceivable by the viewer in the viewing space. The screen element includes a sheet of mesh or netting material that transmits light output by the 3D display through its pores or openings and may be a planar sheet of scrim, tulle, or chiffon.

IPC Classes  ?

  • G02B 27/22 - Other optical systems; Other optical apparatus for producing stereoscopic or other three-dimensional effects

54.

TOUCHABLE AND 360-DEGREE PLAYABLE HOLOGRAPHIC DISPLAY

      
Application Number 16371284
Status Pending
Filing Date 2019-04-01
First Publication Date 2020-10-01
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Joseph, Daniel M.
  • Klouda, Jessica Anne

Abstract

A system for displaying three dimensional (3D) images. The system includes a 3D display operating in a first state to display a 3D image by outputting light into a viewing space and operating in a second state in which the 3D image is not displayed. The system further includes a screen element positioned between the 3D display and the viewing space. The screen element reflects light from the viewing space to appear opaque to a viewer in the viewing space when the 3D display operates in the second state. The screen element transmits the light output by the 3D display, whereby the 3D display image is perceivable by the viewer in the viewing space. The screen element includes a sheet of mesh or netting material that transmits light output by the 3D display through its pores or openings and may be a planar sheet of scrim or tulle.

IPC Classes  ?

  • G02B 27/22 - Other optical systems; Other optical apparatus for producing stereoscopic or other three-dimensional effects

55.

Perceptual data association

      
Application Number 16745145
Grant Number 10796195
Status In Force
Filing Date 2020-01-16
First Publication Date 2020-10-01
Grant Date 2020-10-06
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Papon, Jeremie A.
  • Freeman, Kyle G.

Abstract

Embodiments provide for perceptual data association by receiving, from at least a first sensor and a second sensor disposed at different positions in an environment, respective series of local scene graphs that identify characteristics of objects in the environment and are updated asynchronously, and by merging the series of local scene graphs to form a coherent image of the environment from multiple perspectives.
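
A minimal sketch of asynchronous merging under an assumed record layout (object id, timestamp, position): each sensor contributes its latest observations and the merge keeps the freshest entry per object.

    # Minimal sketch (assumed record layout, not the disclosed scene-graph format).
    def merge_scene_graphs(*local_graphs):
        merged = {}
        for graph in local_graphs:
            for obj_id, obs in graph.items():
                if obj_id not in merged or obs["t"] > merged[obj_id]["t"]:
                    merged[obj_id] = obs
        return merged

    cam_a = {"ball": {"t": 10.0, "pos": (1.0, 2.0)}, "cart": {"t": 9.5, "pos": (4.0, 0.0)}}
    cam_b = {"ball": {"t": 10.4, "pos": (1.2, 2.1)}}
    print(merge_scene_graphs(cam_a, cam_b))
    # {'ball': {'t': 10.4, 'pos': (1.2, 2.1)}, 'cart': {'t': 9.5, 'pos': (4.0, 0.0)}}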

IPC Classes  ?

  • G06T 13/80 - 2D animation, e.g. using sprites
  • G06K 9/46 - Extraction of features or characteristics of the image
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06T 7/70 - Determining position or orientation of objects or cameras

56.

MENU NAVIGATION MODE FOR MEDIA DISCS

      
Application Number 16900569
Status Pending
Filing Date 2020-06-12
First Publication Date 2020-10-01
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Kwan, Brian
  • Jessen, David M.
  • Madden, James J.

Abstract

Systems and methods are provided for reordering and/or bypassing certain informational content or menus that are conventionally presented prior to playback of media content stored on physical media discs. Upon initial use of a physical media disc, certain informational content or menus may be presented to a user or viewer, for example, piracy warnings, language selection menus, etc. However, upon subsequent use of the physical media disc, such informational content or menus may be bypassed. The user or viewer is given an option to immediately begin consuming the media content stored on the physical media disc. Conventional content, such as trailers, is not played prior to playback of the media content.

IPC Classes  ?

  • G11B 31/00 - Arrangements for the associated working of recording or reproducing apparatus with related apparatus
  • H04N 21/472 - End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification or for manipulating displayed content
  • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
  • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
  • H04N 5/445 - Receiver circuitry for displaying additional information
  • H04N 21/4545 - Input to filtering algorithms, e.g. filtering a region of the image
  • H04N 5/76 - Television signal recording
  • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier

57.

Content promotion using a conversational agent

      
Application Number 16357185
Grant Number 10856041
Status In Force
Filing Date 2019-03-18
First Publication Date 2020-09-24
Grant Date 2020-12-01
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Farre Guiu, Miquel Angel
  • Aparicio Isarn, Albert
  • Badia Pujol, Jordi
  • Martin, Marc Junyent
  • Accardo, Anthony M.
  • Roeckle, Jason
  • Solaro, John
  • Swerdlow, Avner

Abstract

A content promotion system includes a computing platform having a hardware processor and a system memory storing a conversational agent software code. The hardware processor executes the conversational agent software code to receive user identification data, obtain user profile data including a content consumption history of a user associated with the user identification data, and identify a first predetermined phrase for use in interacting with the user based on the user profile data. In addition, the conversational agent software code initiates a dialog with the user based on the first predetermined phrase, detects a response or non-response to the dialog, updates the user profile data based on the response or non-response, resulting in updated user profile data, identifies a second predetermined phrase for use in interacting with the user based on the updated user profile data, and continues the dialog with the user based on the second predetermined phrase.

IPC Classes  ?

  • H04H 60/32 - Arrangements for monitoring conditions of receiving stations, e.g. malfunction or breakdown of receiving stations
  • H04N 21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed or the storage space available from the internal hard disk
  • H04N 21/466 - Learning process for intelligent management, e.g. learning user preferences for recommending movies
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 

58.

ASPECT RATIO CONVERSION WITH MACHINE LEARNING

      
Application Number 16360944
Status Pending
Filing Date 2019-03-21
First Publication Date 2020-09-24
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Narayan, Nimesh C.
  • Yaacob, Yazmaliza
  • Grubin, Kari M.
  • Wahlquist, Andrew J.

Abstract

Techniques are disclosed for converting image frames, such as the image frames of a motion picture, from one aspect ratio to another while predicting the pan and scan framing decisions that a human operator would make. In one configuration, one or more functions for predicting pan and scan framing decisions are determined, at least in part, via machine learning using training data that includes historical pan and scan conversions. The training data may be prepared by extracting features indicating visual and/or audio elements associated with particular shots, among other things. Function(s) may be determined, using machine learning, that take such extracted features as input and output predicted pan and scan framing decisions. Thereafter, the image frames of a received video may be converted between aspect ratios on a shot-by-shot basis, by extracting the same features and using the function(s) to make pan and scan framing predictions.
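
A minimal sketch of applying a predicted framing decision, assuming the prediction arrives as a horizontal crop centre (the learned function itself is not shown).

    # Minimal sketch: turn a predicted pan-and-scan centre into a crop window.
    def pan_and_scan_crop(src_w, src_h, target_aspect, center_x):
        """Crop a source frame to target_aspect, horizontally centred on the predicted x."""
        crop_w = min(src_w, int(round(src_h * target_aspect)))
        left = int(center_x - crop_w / 2)
        left = max(0, min(left, src_w - crop_w))   # keep the crop window inside the frame
        return left, 0, crop_w, src_h              # (x, y, width, height)

    # 2.39:1 "scope" frame (2048x858) converted toward 16:9 with a predicted subject at x=1200.
    print(pan_and_scan_crop(2048, 858, 16 / 9, center_x=1200))  # (437, 0, 1525, 858)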

IPC Classes  ?

  • H04N 7/01 - Conversion of standards
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range

59.

SYSTEMS AND METHODS FOR ALLOWING INTERACTIVE BROADCAST STREAMED VIDEO FROM DYNAMIC CONTENT

      
Application Number 16362561
Status Pending
Filing Date 2019-03-22
First Publication Date 2020-09-24
Owner Disney Enterprises, Inc. (USA)
Inventor Smithers, Andi

Abstract

Some implementations of the disclosure are directed to allowing interactive broadcast streamed video from games and other dynamic content. In accordance with some implementations, a content creator may publish a plurality of video surfaces of an environment for streaming to a plurality of client devices for video playback. The plurality of video surfaces may correspond, for example, to a cube map of a gaming environment captured from the perspective of a player. Upon receiving a stream including multiple video surfaces such as a cubemap, a media player of a viewer may generate a fully-rendered three-dimensional view of the environment.

IPC Classes  ?

  • A63F 13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
  • A63F 13/86 - Watching games played by other players
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

60.

Systems and methods to provide an interactive space based on vehicle-to-vehicle communications

      
Application Number 16525917
Grant Number 10785621
Status In Force
Filing Date 2019-07-30
First Publication Date 2020-09-22
Grant Date 2020-09-22
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Drake, Corey D.
  • Gibson, Hunter
  • Yeung, Jason
  • Goslin, Michael P.

Abstract

This disclosure relates to systems and methods to provide an interactive space based on vehicle-to-vehicle communications. A vehicle may store experience information and/or other information. The experience information may define virtual content to be presented to a user residing in the vehicle to create an interactive space. The virtual content may be associated with an experience location in a real-world environment. Responsive to a vehicle location of the vehicle being at or near the experience location, the user may be presented with views of the virtual content. The user may interact with the virtual content causing an update of the experience information. Upon detection of presence of a second vehicle, the vehicle may communicate the updated experience information to the second vehicle.

IPC Classes  ?

  • H04W 4/46 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
  • B60R 1/00 - Optical viewing arrangements
  • G01C 21/36 - Input/output arrangements for on-board computers
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure

61.

TECHNIQUES FOR INFERRING THE CONFIGURATION OF A ROOM FROM SKELETON TRACKING

      
Application Number 16354094
Status Pending
Filing Date 2019-03-14
First Publication Date 2020-09-17
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Baumbach, Elliott
  • Goslin, Michael

Abstract

In various embodiments, a map inference application automatically maps a user space. A camera is positioned within the user space. In operation, the map inference application determines a path of a first moving object within the user space based on a tracking dataset generated from images captured by the camera. Subsequently, the map inference application infers a walking space within the user space based on the path. The map inference application then generates a model of at least a portion of the user space based on the walking space. One or more movements of a second object within the user space are based on the model. Advantageously, unlike prior art solutions, the map inference application enables a model of a user space to be automatically and efficiently generated based on images from a single stationary camera.
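
A minimal sketch of the inference step under an assumed grid resolution: grid cells that the tracked path passes through are marked as walking space, and everything else stays unknown.

    # Minimal sketch (assumed grid resolution; not the application's model): cells that a
    # tracked person has walked through are marked as walkable floor space.
    import numpy as np

    def infer_walking_space(track_xy, room_w_m=8.0, room_h_m=6.0, cell_m=0.5):
        grid = np.zeros((int(room_h_m / cell_m), int(room_w_m / cell_m)), dtype=bool)
        for x, y in track_xy:
            grid[int(y / cell_m), int(x / cell_m)] = True
        return grid  # True = observed walking space, False = presumed obstacle/unknown

    path = [(1.0, 1.0), (1.5, 1.0), (2.0, 1.2), (2.5, 1.4), (3.0, 1.4)]
    walkable = infer_walking_space(path)
    print(walkable.sum(), "of", walkable.size, "cells marked walkable")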

IPC Classes  ?

  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06N 5/04 - Inference methods or devices
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06T 15/00 - 3D [Three Dimensional] image rendering

62.

GAZE BASED RENDERING FOR AUDIENCE ENGAGEMENT

      
Application Number 16298848
Status Pending
Filing Date 2019-03-11
First Publication Date 2020-09-17
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Pan, Ye
  • Mitchell, Kenny

Abstract

The present disclosure is related to an audience engagement system and method to display images on a display. The method includes detecting a gaze direction of a designated viewer, rendering a gaze object within an image on a gaze axis corresponding to the gaze direction, rendering an audience object within the image on a normal axis corresponding to a display axis, compositing the audience object and the gaze object together in a composited image, and displaying the composited image on the display.

IPC Classes  ?

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/00 - Image analysis

63.

SYNCHRONIZED AUGMENTED REALITY GAMEPLAY ACROSS MULTIPLE GAMING ENVIRONMENTS

      
Application Number 16354848
Status Pending
Filing Date 2019-03-15
First Publication Date 2020-09-17
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Goslin, Michael P.
  • Nocon, Nathan

Abstract

Various embodiments of the invention disclosed herein provide techniques for implementing augmented reality (AR) gameplay across multiple AR gaming environments. A synchronized AR gaming application executing on an AR gaming console detects that a first gaming console that is executing an AR gaming application has exited a first AR gaming environment and entered a second AR gaming environment. The synchronized AR gaming application connects to a communications network associated with the second AR gaming environment. The synchronized AR gaming application detects, via the communications network, a sensor associated with the second AR gaming environment. The synchronized AR gaming application alters execution of the AR gaming application based at least in part on sensor data received via the sensor to enable the AR gaming application to continue executing as the first gaming console exits the first AR gaming environment and enters the second AR gaming environment.

IPC Classes  ?

  • A63F 13/25 - Output arrangements for video game devices
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

64.

Content Promotion Through Automated Curation of Content Clips

      
Application Number 16880669
Status Pending
Filing Date 2020-05-21
First Publication Date 2020-09-10
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Horn, David
  • Mcnabb, Michael
  • Fong, Jason G.

Abstract

A content curation system includes a computing platform having a hardware processor and a system memory storing a content promotion software code providing a user interface. The hardware processor executes the content promotion software code to receive an initiation signal corresponding to a user action, and, in response to receiving the initiation signal, to identify multiple content items as desirable content items to the user. In addition, the content promotion software code determines a portion of each desirable content item as most desirable content to the user, and, for each most desirable content portion, obtains a content clip including that content, resulting in multiple content clips corresponding respectively to the multiple content items. The content promotion software code further outputs the content clips for playout to the user via the user interface.

IPC Classes  ?

  • H04N 21/2387 - Stream processing in response to a playback request from an end-user, e.g. for trick-play
  • H04N 21/422 - Input-only peripherals, e.g. global positioning system [GPS]
  • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs
  • H04N 21/8549 - Creating video summaries, e.g. movie trailer

65.

JOINT ESTIMATION FROM IMAGES

      
Application Number 16289441
Status Pending
Filing Date 2019-02-28
First Publication Date 2020-09-03
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Guay, Martin
  • Borer, Dominik Tobias
  • Öztireli, Ahmet Cengiz
  • Sumner, Robert W.
  • Buhmann, Jakob Joachim

Abstract

Techniques are disclosed for estimating poses from images. In one embodiment, a machine learning model, referred to herein as the “detector,” is trained to estimate animal poses from images in a bottom-up fashion. In particular, the detector may be trained using rendered images depicting animal body parts scattered over realistic backgrounds, as opposed to renderings of full animal bodies. In order to make appearances of the rendered body parts more realistic so that the detector can be trained to estimate poses from images of real animals, the body parts may be rendered using textures that are determined from a translation of rendered images of the animal into corresponding images with more realistic textures via adversarial learning. Three-dimensional poses may also be inferred from estimated joint locations using, e.g., inverse kinematics.

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 15/04 - Texture mapping

66.

SYSTEM FOR COMMUNICATING AND USING TRAFFIC ANALYSIS IN A SPACE WITH MOVING OBSTACLES

      
Application Number 16871881
Status Pending
Filing Date 2020-05-11
First Publication Date 2020-09-03
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Hager, Iv, Joseph George
  • Honeck, Michael R.
  • Mika, Jeremy Andrew

Abstract

A system for improving navigation of robots in a space with a plurality of pedestrians or other movable objects or obstacles. The system includes a traffic analysis assembly that has one or more traffic sensors sensing movement of the obstacles in the space. The traffic analysis assembly further includes a processor running a flow module that processes output from the traffic sensor (e.g., with the Gunnar-Farneback optical flow algorithm) to generate traffic analysis results, which include density values for the obstacles in the space and motion information for the obstacles in the space (e.g., speed and direction). The system includes a robot with a controller running a navigation module selecting a navigation route between a current location of the robot and a target location in the space using the traffic analysis results. The space is configured such that obstacles such as pedestrians have unregulated flow patterns in the space.
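
A minimal sketch of the flow step using OpenCV's Farneback implementation with assumed parameters, reduced to the sort of speed and density summary a navigation module could consume; it is not the patented flow module.

    # Minimal sketch: dense Gunnar-Farneback optical flow between two grayscale frames,
    # summarised as crude crowd speed, direction, and density proxies (assumed parameters).
    import cv2
    import numpy as np

    def traffic_summary(prev_gray, next_gray):
        flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        dx, dy = flow[..., 0], flow[..., 1]
        mag = np.hypot(dx, dy)
        return {
            "mean_speed_px": float(mag.mean()),                           # crude crowd-speed proxy
            "dominant_dir_rad": float(np.arctan2(dy.mean(), dx.mean())),  # net motion direction
            "busy_fraction": float((mag > 1.0).mean()),                   # crude density proxy
        }

    # Two synthetic frames: a bright blob shifted a few pixels to the right.
    prev_f = np.zeros((120, 160), np.uint8); prev_f[50:70, 40:60] = 255
    next_f = np.roll(prev_f, 5, axis=1)
    print(traffic_summary(prev_f, next_f))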

IPC Classes  ?

67.

System and method for determining activation sequences of devices

      
Application Number 16586409
Grant Number 10761180
Status In Force
Filing Date 2019-09-27
First Publication Date 2020-09-01
Grant Date 2020-09-01
Owner Disney Enterprises Inc. (USA)
Inventor Nocon, Nathan D.

Abstract

A system includes a host device having a hardware processor and a host wireless transceiver, and client devices having client wireless transceivers for wireless communications with the host device. The hardware processor receives wireless signals transmitted by the client wireless transceivers using the host wireless transceiver, and determines locations of the client devices relative to the host device based on angles of arrival of the wireless signals. The hardware processor further determines an activation sequence for activating the client devices based on the locations relative to the host device, and transmits control signals using the host wireless transceiver, according to the activation sequence, to activate the client devices.
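
A minimal sketch of turning angle-of-arrival measurements into an activation order, assuming a simple clockwise-sweep convention and hypothetical device names.

    # Minimal sketch: order client devices for a chase effect by the angle of arrival of
    # their radio signals, sweeping from the host's forward direction (assumed convention).
    def activation_sequence(angles_of_arrival_deg):
        """Return client ids sorted by bearing relative to the host (0-360 degrees)."""
        return sorted(angles_of_arrival_deg, key=lambda cid: angles_of_arrival_deg[cid] % 360)

    aoa = {"wand_a": 350.0, "wand_b": 20.0, "wand_c": 95.0}   # hypothetical measurements
    print(activation_sequence(aoa))  # ['wand_b', 'wand_c', 'wand_a']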

IPC Classes  ?

  • H04W 4/02 - Services making use of location information
  • G01S 5/02 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
  • H04N 21/482 - End-user interface for program selection
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • H04W 56/00 - Synchronisation arrangements

68.

System for guiding a user through an immersive experience

      
Application Number 16536840
Grant Number 10762878
Status In Force
Filing Date 2019-08-09
First Publication Date 2020-09-01
Grant Date 2020-09-01
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Gomez Diaz, Jose Luis
  • Gipson, Jeffrey

Abstract

An immersive experience system has a display device and a processor. The processor receives media content including one or more normal video frames. Further, the processor tracks an orientation of the display device with respect to an intended focal point within an immersive experience. Moreover, the processor determines whether the orientation of the display device exceeds an angular displacement threshold. In response to a determination that the orientation of the display device does not exceed the angular displacement threshold, the processor renders a normal video frame from the received media content on a display device. Conversely, in response to a determination that the orientation of the display device exceeds the angular displacement threshold, the processor modifies one or more properties of the normal video frame to generate a modified video frame.
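
A minimal sketch of the threshold check, with an assumed angular threshold and a simple dimming operation standing in for the disclosed frame modification.

    # Minimal sketch of the gating described above (assumed threshold; dimming is only a
    # stand-in for "modifying one or more properties of the normal video frame").
    import numpy as np

    def frame_for_orientation(frame, yaw_deg, focal_yaw_deg, threshold_deg=30.0):
        offset = abs((yaw_deg - focal_yaw_deg + 180.0) % 360.0 - 180.0)  # shortest angular distance
        if offset <= threshold_deg:
            return frame                           # within threshold: show the normal frame
        return (frame * 0.3).astype(frame.dtype)   # beyond threshold: dim to guide the user back

    frame = np.full((4, 4, 3), 200, np.uint8)
    print(frame_for_orientation(frame, yaw_deg=10.0, focal_yaw_deg=0.0)[0, 0])   # [200 200 200]
    print(frame_for_orientation(frame, yaw_deg=120.0, focal_yaw_deg=0.0)[0, 0])  # [60 60 60]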

IPC Classes  ?

  • G09G 5/37 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

69.

SYSTEMS AND METHODS TO ELICIT PHYSICAL ACTIVITY IN USERS ACTING AS CARETAKERS OF PHYSICAL OBJECTS

      
Application Number 16284863
Status Pending
Filing Date 2019-02-25
First Publication Date 2020-08-27
Owner Disney Enterprises, Inc. (USA)
Inventor Goslin, Michael P.

Abstract

Systems and methods to elicit physical activity in users acting as caretakers of physical objects. Caretaking information for a physical object may define a set of caretaking criteria of a physical object. Individual caretaking criteria may comprise one or more caretaking requirements of the physical object to be satisfied based on physical activity of a user. The caretaking requirements of the physical object may “trick” the user into performing fitness/wellness behavior. In caring for the physical object, the user may actually be caring for themselves.

IPC Classes  ?

  • A63F 13/825 - Fostering virtual characters
  • A63F 13/216 - Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • G16H 20/30 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising

70.

TECHNIQUES FOR PERFORMING CONTEXTUAL PHRASE GROUNDING

      
Application Number 16285115
Status Pending
Filing Date 2019-02-25
First Publication Date 2020-08-27
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Dogan, Pelin
  • Sigal, Leonid
  • Gross, Markus

Abstract

In various embodiments, a phrase grounding model automatically performs phrase grounding for a source sentence and a source image. The phrase grounding model determines that a first phrase included in the source sentence matches a first region of the source image based on the first phrase and at least a second phrase included in the source sentence. The phrase grounding model then generates a matched pair that specifies the first phrase and the first region. Subsequently, one or more annotation operations are performed on the source image based on the matched pair. Advantageously, the accuracy of the phrase grounding model is increased relative to prior art solutions where the interrelationships between phrases are typically disregarded.

IPC Classes  ?

  • G06F 17/27 - Automatic analysis, e.g. parsing, orthograph correction
  • G06N 3/08 - Learning methods

71.

SIGN LANGUAGE VIDEO ENCODING FOR DIGITAL CINEMA

      
Application Number 16757700
Status Pending
Filing Date 2019-01-17
First Publication Date 2020-08-27
Owner Disney Enterprises, Inc. (USA)
Inventor Radford, Michael A.

Abstract

A method and apparatus providing variable rate auxiliary video data in a digital cinema package is disclosed. The digital cinema package has primary video information and primary audio information carried on a plurality of fixed bit rate primary audio channels, each fixed bit rate primary audio channel represented by a sequence of audio channel data blocks. In one embodiment, the method comprises generating a chunk of the auxiliary video data, the chunk representing time duration Dc of the auxiliary video data, generating an auxiliary video data block of Lb length, and providing the auxiliary video data block as an audio channel data block of at least one of the fixed bit rate primary audio channels.

IPC Classes  ?

  • H04N 19/46 - Embedding additional information in the video signal during the compression process
  • H04N 19/176 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
  • H04N 21/235 - Processing of additional data, e.g. scrambling of additional data or processing content descriptors
  • H04N 21/242 - Synchronization processes, e.g. processing of PCR [Program Clock References]
  • H04N 21/2347 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs involving video stream encryption

72.

GAMEPLAY USING MOBILE DEVICE SENSORS

      
Application Number 16275093
Status Pending
Filing Date 2019-02-13
First Publication Date 2020-08-13
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Nocon, Nathan D.
  • Gough, R. Hunter

Abstract

Embodiments relate to gameplay using mobile devices. Embodiments include receiving, by a first device, input from a player initiating a targeted action. Embodiments include determining, by the first device, an orientation of the first device. Embodiments include determining, by the first device, a location of at least a second device based at least on a message received from the second device. Embodiments include identifying, by the first device, that a target of the targeted action is associated with the second device based on the orientation of the first device and the location of the second device. Embodiments include transmitting, by the first device, an indication of the targeted action to the second device.
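
A minimal sketch of the targeting test in two dimensions, with an assumed aiming-cone half-angle; the disclosure does not specify this geometry.

    # Minimal sketch: decide whether device B is the target of device A's action using A's
    # heading and B's reported position (assumed 2-D geometry and a fixed aiming cone).
    import math

    def is_target(a_pos, a_heading_deg, b_pos, cone_half_angle_deg=15.0):
        bearing = math.degrees(math.atan2(b_pos[1] - a_pos[1], b_pos[0] - a_pos[0]))
        diff = abs((bearing - a_heading_deg + 180.0) % 360.0 - 180.0)
        return diff <= cone_half_angle_deg

    print(is_target(a_pos=(0, 0), a_heading_deg=45.0, b_pos=(3, 3)))   # True  (dead ahead)
    print(is_target(a_pos=(0, 0), a_heading_deg=45.0, b_pos=(3, -3)))  # False (off to the side)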

IPC Classes  ?

  • A63F 13/24 - Constructional details thereof, e.g. game controllers with detachable joystick handles
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • A63F 13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types

73.

TECHNIQUES FOR AUTOMATICALLY REMOVING CONTENT FROM CLOSED-CAPTION DATA EMBEDDED IN A VIDEO SIGNAL

      
Application Number 16275144
Status Pending
Filing Date 2019-02-13
First Publication Date 2020-08-13
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Strein, Michael
  • Mclaughlin, William

Abstract

Content is automatically removed from closed-caption data embedded in a video signal. A closed-caption compliance system decodes a first portion of a first video frame that includes a closed-caption data packet. The system extracts a first character string that includes at least a portion of the closed-caption data packet. The system determines whether the first character string is suitable for distribution to viewers. If the first character string is suitable for distribution to viewers, then the system encodes at least a first portion of the first character string as a second closed-caption data packet to include in a second video frame. Otherwise, the system modifies the first character string to generate a second character string that is suitable for distribution to viewers; and encodes at least a first portion of the second character string as a third closed-caption data packet to include in the second video frame.
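
The abstract's flow is decode, extract a character string, test its suitability, and either re-encode the string or substitute a modified one. A toy string-level version of that flow; the blocked-word list and masking rule are placeholders for whatever suitability policy the system actually applies.

```python
import re

# Illustrative word list; the real system's suitability rules are not given in the abstract.
BLOCKED = {"badword", "slur"}

def make_suitable(caption: str) -> str:
    """Return a caption string suitable for distribution: blocked words are masked."""
    def mask(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in BLOCKED else word
    return re.sub(r"[A-Za-z']+", mask, caption)

def process_caption_packet(text: str) -> str:
    """Mimic the decode -> check -> (modify) -> re-encode flow at the string level."""
    if all(w.lower() not in BLOCKED for w in re.findall(r"[A-Za-z']+", text)):
        return text                 # already suitable, re-encode as-is
    return make_suitable(text)      # otherwise substitute a cleaned string

print(process_caption_packet("What a badword day"))
```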

IPC Classes  ?

  • H04N 21/235 - Processing of additional data, e.g. scrambling of additional data or processing content descriptors
  • H04N 21/236 - Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator ] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
  • H04N 21/8405 - Generation or processing of descriptive data, e.g. content descriptors represented by keywords

74.

INTELLIGENT PHOTOGRAPHY WITH MACHINE LEARNING

      
Application Number 16743855
Status Pending
Filing Date 2020-01-15
First Publication Date 2020-08-13
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Adams, Mathew G.
  • Kirkley, Jere M.
  • Jenkins, Jason B.

Abstract

Embodiments of the present disclosure relate to intelligent photography with machine learning. Embodiments include receiving a video stream from a control camera. Embodiments include providing inputs to a trained machine learning model based on the video stream. Embodiments include determining, based on data output by the trained machine learning model in response to the inputs, at least a first time for capturing a first picture during a session. Embodiments include programmatically instructing a first camera to capture the first picture at the first time during the session.
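
At its core, the method scores the control-camera stream with a trained model and derives capture times from those scores. A hedged sketch with a random stand-in for model inference; the threshold rule and scoring function are assumptions.

```python
import random

def capture_score(frame) -> float:
    """Stand-in for the trained model: returns a 'good moment' score for a frame.
    The real system would run inference on the control-camera video stream."""
    return random.random()

def schedule_captures(frames, fps: float, threshold: float = 0.9):
    """Pick capture times (seconds into the session) whenever the score clears a threshold."""
    return [i / fps for i, frame in enumerate(frames) if capture_score(frame) >= threshold]

random.seed(1)
times = schedule_captures(frames=range(300), fps=30.0)  # ten seconds of control video
print([round(t, 2) for t in times])  # times at which the capture camera would be triggered
```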

IPC Classes  ?

  • H04N 5/232 - Devices for controlling television cameras, e.g. remote control
  • G06N 3/08 - Learning methods

75.

PANORAMIC, MULTIPLANE, AND TRANSPARENT COLLIMATED DISPLAY SYSTEM

      
Application Number 16856967
Status Pending
Filing Date 2020-04-23
First Publication Date 2020-08-13
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Laduke, Thomas F.
  • Reichow, Mark A.

Abstract

A display system for creating a multiplane display. The display system includes a viewing space for viewers. The display system includes a convex screen and a mirror element spaced apart from the convex screen to provide a collimated display. The mirror element is both reflective and transmissive of light, and a fraction of light from the convex screen that strikes a front concave surface of the mirror element is reflected into the viewing space. The convex screen and the front concave surface of the mirror element are each shaped to have an optical prescription defined for a collimated display whereby light reflected into the viewing space is collimated to provide variable depth imagery. The display system includes a background space behind the mirror element, and light from the background space from projection screens and illuminated objects is transmitted through the mirror element to viewers in the viewing space.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type

76.

Display screen or portion thereof with graphical user interface

      
Application Number 29687141
Grant Number D0892844
Status In Force
Filing Date 2019-04-10
First Publication Date 2020-08-11
Grant Date 2020-08-11
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Paull, Michael
  • Jimerson, Jerrell
  • Sokolowski, Anthony
  • Lutjens, Ole
  • Perlove, Lucas
  • Harmon, Peter
  • Korn, Reed
  • Ziffer, Brian
  • Sichon, Juan
  • Brockett, Kurt

77.

SYSTEMS AND METHODS TO CONTROL SOUNDS PRODUCED BY A REMOTE CONTROLLED VEHICLE DURING OPERATION OF THE REMOTE CONTROLLED VEHICLE

      
Application Number 16267176
Status Pending
Filing Date 2019-02-04
First Publication Date 2020-08-06
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Nocon, Nathan
  • Wong, Clifford

Abstract

Systems and methods to control sounds produced by a remote-controlled vehicle during operation of the remote-controlled vehicle are described herein. One or more components of the remote-controlled vehicle may be controlled and/or manipulated so that the remote-controlled vehicle produces sounds during operation that match a predetermined set of sounds. The control and/or manipulation of the one or more components of the remote-controlled vehicle may be effectuated without altering a path of the remote-controlled vehicle.

IPC Classes  ?

  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
  • G06F 9/30 - Arrangements for executing machine instructions, e.g. instruction decode
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
  • G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. automatic pilot

78.

Entertainment System Including Performative Figurines

      
Application Number 16266966
Status Pending
Filing Date 2019-02-04
First Publication Date 2020-08-06
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Rosenthal, Janice
  • Nocon, Nathan
  • Thornton, Stephen A.
  • Goslin, Michael P.
  • Panec, Timothy

Abstract

An entertainment system includes multiple figurines configured for wireless communication, and a control device having a hardware processor, a system memory storing a control application, and a transceiver. The hardware processor executes the control application to detect each of the figurines via the transceiver and to identify a predetermined entertainment for performance by two or more of the figurines. The hardware processor further executes the control application to transmit control signals to the two or more figurines via the transceiver, wherein a first control signal instructs a first of the two or more figurines to perform a first portion of the predetermined entertainment, and a second control signal instructs a second of the two or more figurines to perform a second portion of the predetermined entertainment. The two or more of the figurines are configured to perform the predetermined entertainment according to the control signals.

IPC Classes  ?

  • A63H 30/04 - Electrical arrangements using wireless transmission
  • B25J 11/00 - Manipulators not otherwise provided for
  • B25J 13/00 - Controls for manipulators
  • A63H 5/00 - Musical or noise-producing devices for additional toy effects other than acoustical
  • A63H 3/00 - Dolls

79.

SYSTEMS AND METHODS FOR MODIFYING LABELED CONTENT

      
Application Number 16268443
Status Pending
Filing Date 2019-02-05
First Publication Date 2020-08-06
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Mitchell, Kenneth
  • Brito, Caio Jose Dos Santos

Abstract

Systems and methods are disclosed for modifying labeled target content for a capture device. A computer-implemented method may use a computer system that includes non-transient electronic storage, a graphical user interface, and one or more physical computer processors. The computer-implemented method may include: obtaining labeled target content, the labeled target content including one or more facial features that have been labeled; modifying the labeled target content to match dynamically captured content from a first capture device to generate modified target content; and storing the modified target content. The dynamically captured content may include the one or more facial features.

IPC Classes  ?

  • G06K 9/34 - Segmentation of touching or overlapping patterns in the image field
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

80.

Coordination of media content delivery to multiple media players

      
Application Number 16270467
Grant Number 10735825
Status In Force
Filing Date 2019-02-07
First Publication Date 2020-08-04
Grant Date 2020-08-04
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Comito, Keith
  • Lefler, Nathan
  • Corrigan-Colville, James A.

Abstract

A system for synchronizing media content playout includes a computing platform having a hardware processor and a system memory storing a software code. The hardware processor executes the software code to receive a first state message from a first media player playing a first media content and a second state message from a second media player playing a second media content, the first media content and the second media content being the same media content. The software code further determines a coordination state for playout of the first media content and the second media content based on one or more of the first and second state messages, and transmits a first coordination message including the coordination state to the first media player and a second coordination message including the coordination state to the second media player to synchronize playout of the first media content and the second media content.
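
One way to picture the coordination step: collect a state message from each player, derive a single coordination state, and send the same coordination message back to both. The sketch below picks the furthest-behind playhead as that state; the actual selection logic is not described in the abstract, and all field names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class StateMessage:
    player_id: str
    position_s: float   # playhead position reported by the media player
    reported_at: float  # wall-clock time of the report

def coordination_state(states: list[StateMessage]) -> float:
    """Pick a common target playhead position for all players.

    This sketch simply uses the furthest-behind player so nobody has to skip
    content; the patented logic for choosing the state may differ.
    """
    return min(s.position_s for s in states)

msgs = [StateMessage("living_room", 121.4, 0.0), StateMessage("kitchen", 119.9, 0.0)]
target = coordination_state(msgs)
# The same coordination message would be transmitted to both players.
print({"seek_to": target})
```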

IPC Classes  ?

  • H04N 21/6543 - Transmission by server directed to the client for forcing some client operations, e.g. recording
  • H04N 21/242 - Synchronization processes, e.g. processing of PCR [Program Clock References]
  • H04N 21/239 - Interfacing the upstream path of the transmission network, e.g. prioritizing client requests
  • H04N 21/2387 - Stream processing in response to a playback request from an end-user, e.g. for trick-play
  • H04N 21/8547 - Content authoring involving timestamps for synchronizing content
  • H04N 21/63 - Control signaling between client, server and network components; Network processes for video distribution between server and clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
  • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronizing decoder's clock; Client middleware
  • H04N 21/437 - Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
  • H04N 21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies 
  • H04N 21/647 - Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load or bridging bet

81.

EXTENDED ON-SCREEN GAMEPLAY VIA AUGMENTED REALITY

      
Application Number 16257047
Status Pending
Filing Date 2019-01-24
First Publication Date 2020-07-30
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Drake, Corey
  • Baumbach, Elliot
  • Hsu, Jonathan R.
  • Medrano, Tritia V.
  • Nocon, Nathan
  • Panec, Timothy M.
  • Wong, Clifford W.
  • Yeung, Jason

Abstract

Various embodiments of the invention disclosed herein provide techniques for extending on-screen gameplay via an augmented reality (AR) system. An extended AR application executing on an AR headset system receives, via a game controller, first data associated with a first object associated with a computer-generated game. The extended AR application renders an augmented reality object based on the first data associated with the first object. The extended AR application displays at least a first portion of the augmented reality object via an augmented reality headset system. Further, an image associated with the computer-generated game is simultaneously rendered on a display monitor.

IPC Classes  ?

  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/14 - Digital output to display device
  • G02B 27/01 - Head-up displays

82.

MODEL AND DETACHABLE CONTROLLER FOR AUGMENTED REALITY / VIRTUAL REALITY EXPERIENCE

      
Application Number 16262751
Status Pending
Filing Date 2019-01-30
First Publication Date 2020-07-30
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Schmidt, Craig J.
  • Barrett, Kyle W.

Abstract

Embodiments include a method and associated system for providing an augmented reality experience. The method comprises receiving identification information from circuitry of a model removably attached to a controller device. A power source of the controller device provides power to the circuitry. The method further comprises receiving orientation information from one or more sensors of the controller device, and identifying, using a visual sensor, one or more external visual indicators of the model. The method further comprises maintaining a virtual model representing a model type indicated by the identification information. An orientation of the virtual model is based on the orientation information and referenced to the one or more external visual indicators. The method further comprises, responsive to receiving an input, displaying one or more visual effects referenced to the virtual model.

IPC Classes  ?

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

83.

Electrical Charger For A Spinning Device

      
Application Number 16840895
Status Pending
Filing Date 2020-04-06
First Publication Date 2020-07-30
Owner Disney Enterprises, Inc. (USA)
Inventor Nocon, Nathan

Abstract

There is provided a system that includes a base providing a wireless power source, a rotor situated over the base and configured to spin, and a device coupled to the rotor and configured to spin with the rotor, the device having a display and a wireless power receiver, where the wireless power source and the wireless power receiver are configured to power the device to show an image on the display while the rotor is spinning.

IPC Classes  ?

  • G09F 19/02 - Advertising or display means not otherwise provided for incorporating moving display members
  • H02J 50/10 - Circuit arrangements or systems for wireless supply or distribution of electric power using inductive coupling
  • H02J 7/00 - Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
  • G09F 27/00 - Combined visual and audible advertising or displaying, e.g. for public address
  • G09F 13/04 - Signs, boards, or panels, illuminated from behind the insignia
  • H02J 50/12 - Circuit arrangements or systems for wireless supply or distribution of electric power using inductive coupling of the resonant type
  • G09F 13/02 - Signs, boards, or panels, illuminated by artificial light sources positioned in front of the insignia
  • G09F 13/20 - Illuminated signs; Luminous advertising with luminescent surfaces or parts

84.

Controller

      
Application Number 29648497
Grant Number D0891429
Status In Force
Filing Date 2018-05-22
First Publication Date 2020-07-28
Grant Date 2020-07-28
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Hsu, Jonathan Rd
  • Nocon, Nathan D.

85.

Techniques for concealed vehicle reset

      
Application Number 16421117
Grant Number 10722805
Status In Force
Filing Date 2019-05-23
First Publication Date 2020-07-28
Grant Date 2020-07-28
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Kalama, Asa K.
  • Trowbridge, Robert S.

Abstract

Embodiments disclosed herein include an amusement park ride. The amusement park ride includes a ride vehicle that can transition from a first configuration to a second configuration during the duration of the ride. The amusement park ride can include a loading area for loading passengers into the ride vehicle and a separate unloading area for disembarking passengers from the ride vehicle. After disembarking passengers from the ride vehicle, the ride vehicle travels to a transition area concealed from the public, where the ride transitions from the second configuration to the first configuration. In the transition area, maintenance can be performed on the ride vehicle and calibration can be performed on one or more of the ride vehicle systems (e.g., projector or audio systems).

IPC Classes  ?

  • A63G 4/00 - Accessories for roundabouts not restricted to one of groups or
  • A63G 1/10 - Roundabouts power-driven electrically driven
  • G06F 3/16 - Sound input; Sound output

86.

Water slide tube with braking while hydroplaning

      
Application Number 16371491
Grant Number 10723420
Status In Force
Filing Date 2019-04-01
First Publication Date 2020-07-28
Grant Date 2020-07-28
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Jaegers, Tyler M.
  • Foster, Aaron M.

Abstract

A water slide tube adapted for braking during hydroplaning in a catch or landing pool of a water slide. The water slide tube includes a tube body and a tube speed control assembly, which has a floor attached to a bottom surface of the tube body. The assembly includes one or more drag-inducing elements, provided in or on the floor, configured to produce drag when an outer surface of the floor travels over an upper surface of the catch or landing pool. The floor includes a sheet of flexible material joined along its peripheral edge to the tube body. The sheet of the floor includes a planar portion arranged to be tangential to the bottom surface of the tube body. The one or more drag-inducing elements may include at least one hole extending through the floor providing a passageway for water to an interior space of the tube body.

IPC Classes  ?

  • B63B 34/50 - Body-supporting buoyant devices, e.g. bathing boats or water cycles
  • A63G 21/18 - Water-chutes

87.

Aerial show system using unmanned aerial vehicle (UAV) energy to animate creative show element

      
Application Number 16536401
Grant Number 10723454
Status In Force
Filing Date 2019-08-09
First Publication Date 2020-07-28
Grant Date 2020-07-28
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Hovey, Pehr L
  • Zupan, Madeline R

Abstract

An aerial show system for leveraging downwash and other forces to use an unmanned aerial vehicle (UAV) as a creative element in a show. UAVs in the aerial show system each include a propulsion and lift mechanism, which generates downwash as it moves the UAV about a show's airspace. The aerial show system also includes one-to-many show effect devices adapted to make use of the downwash to activate or animate one or more movable components to generate a desired show effect, e.g., a spinning propeller or fan on an object carried or tethered beneath the UAV chassis/body. The movable component would otherwise be static or passive and relies on the potential and/or kinetic energy created by the UAV in airspace for actuation or animation.

IPC Classes  ?

  • G09F 21/12 - Mobile visual advertising by aeroplanes, airships, balloons, or kites the advertising matter being towed by the aircraft
  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
  • B64D 1/18 - Dropping or releasing powdered, liquid or gaseous matter, e.g. for fire-fighting by spraying, e.g. insecticides
  • A63J 99/00 - Subject matter not provided for in other groups of this subclass
  • B64D 5/00 - Aircraft transported by aircraft, e.g. for release or reberthing during flight
  • A63G 33/00 - Devices allowing competitions between several persons, not otherwise provided for

88.

Aerial show system with dynamic participation of unmanned aerial vehicles (UAVs) with distributed show systems

      
Application Number 16558517
Grant Number 10723455
Status In Force
Filing Date 2019-09-03
First Publication Date 2020-07-28
Grant Date 2020-07-28
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Hovey, Pehr L.
  • Zupan, Madeline R.
  • Dohi, Anthony Paul
  • Christensen, David Loyal
  • Snoddy, Jon Hayes
  • Pope, Morgan T.

Abstract

An aerial show system that includes unmanned aerial vehicles (UAVs), show systems onboard the UAVs, non-UAV or “ground” show systems, and a global ground control system. The control system is configured to actively track a UAV's operations during a show performance and to react to make the UAV truly a part of the larger show performance. The system achieves dynamic show participation of the UAV with the distributed show systems, which may include other UAVs as well as non-UAV show systems that are located on the ground but launch or provide effects in the airspace through which the UAV flies. For example, the control system may track a UAV carrying a show effect element to determine whether the UAV properly hits its cue or mark with respect to position and orientation in the show space and with respect to timing and, in response to location tracking, trigger show effects early, late, or on time.

IPC Classes  ?

  • B64D 1/02 - Dropping, ejecting, or releasing articles
  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
  • A63J 5/02 - Arrangements for making stage effects; Auxiliary stage appliances
  • B64D 1/18 - Dropping or releasing powdered, liquid or gaseous matter, e.g. for fire-fighting by spraying, e.g. insecticides

89.

STREAMABLE COMPRESSED GEOMETRY FOR LIVE BROADCAST

      
Application Number 16250924
Status Pending
Filing Date 2019-01-17
First Publication Date 2020-07-23
Owner Disney Enterprises, Inc. (USA)
Inventor Smithers, Andi

Abstract

A system and method for high-quality, three-dimensional live streaming of graphical content. Graphical geometry data generated through execution of multimedia software, e.g. a video game, is converted into a geometry streaming format that is independent of the operating system of the host system running the software. The converted data is broadcast over a network to one or more spectator systems, which are configured to execute the converted data to render locally the three-dimensional content. A spectator is therefore enabled to change a viewing angle within the three-dimensional content regardless of the viewing angle associated with the host system.
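
The abstract hinges on converting graphical geometry into a platform-independent streaming format that spectator systems can decode and render locally. A minimal sketch of such a round trip using an assumed little-endian layout (counts, float32 positions, uint32 indices); this is not the streaming format referenced in the patent.

```python
import struct

def encode_geometry(vertices, indices) -> bytes:
    """Serialize triangle geometry into a compact, OS-independent byte stream:
    little-endian counts, then float32 positions, then uint32 indices."""
    blob = struct.pack("<II", len(vertices), len(indices))
    blob += b"".join(struct.pack("<3f", *v) for v in vertices)
    blob += struct.pack(f"<{len(indices)}I", *indices)
    return blob

def decode_geometry(blob: bytes):
    """Inverse of encode_geometry, as a spectator client would run before rendering."""
    n_v, n_i = struct.unpack_from("<II", blob, 0)
    offset = 8
    vertices = [struct.unpack_from("<3f", blob, offset + 12 * i) for i in range(n_v)]
    offset += 12 * n_v
    indices = list(struct.unpack_from(f"<{n_i}I", blob, offset))
    return vertices, indices

# Round-trip a single triangle to show the format is self-describing.
tri = ([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], [0, 1, 2])
assert decode_geometry(encode_geometry(*tri)) == ([*map(tuple, tri[0])], tri[1])
```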

IPC Classes  ?

  • A63F 13/77 - Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
  • A63F 13/86 - Watching games played by other players
  • A63F 13/335 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
  • A63F 13/338 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using television networks

90.

AUTOMATED CONTENT COMPILATION

      
Application Number 16245011
Status Pending
Filing Date 2019-01-10
First Publication Date 2020-07-16
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Feldman, Vladislav
  • Strein, Michael J.
  • Stefanidis, Efthimis

Abstract

In one implementation, an automated content compilation system includes a computing platform having a hardware processor and a system memory storing a content integration software code. The hardware processor executes the content integration software code to receive commercials for presentation with a media content including primary content segments and predetermined advertising intervals, and to assemble commercial clusters corresponding respectively to the predetermined advertising intervals, each of the commercial clusters including a subset of the commercials. The content integration software code further compiles a content file including a data structure having the commercial clusters linked with the primary content segments, the data structure including one or more playlist(s) identifying a location of each of the commercials in the data structure, and provides the content file for playout of the media content as an integrated content stream including the primary content segments and the commercial clusters.
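
The compiled content file is essentially a data structure that links commercial clusters to advertising intervals and carries a playlist locating each commercial. A small illustrative sketch of that structure; the field names and the round-robin clustering are assumptions, not the patented assembly logic.

```python
from dataclasses import dataclass, field

@dataclass
class ContentFile:
    """Links primary content segments with commercial clusters, plus a playlist
    locating every commercial within the data structure."""
    segments: list[str]
    clusters: list[list[str]]            # one cluster per predetermined advertising interval
    playlist: dict[str, tuple[int, int]] = field(default_factory=dict)

def compile_content(segments, commercials, interval_count):
    # Round-robin the commercials into one cluster per advertising interval.
    clusters = [commercials[i::interval_count] for i in range(interval_count)]
    content = ContentFile(segments, clusters)
    for c_idx, cluster in enumerate(clusters):
        for pos, ad in enumerate(cluster):
            content.playlist[ad] = (c_idx, pos)   # where each commercial sits
    return content

cf = compile_content(["seg1", "seg2", "seg3"], ["ad1", "ad2", "ad3", "ad4"], interval_count=2)
print(cf.playlist)   # {'ad1': (0, 0), 'ad3': (0, 1), 'ad2': (1, 0), 'ad4': (1, 1)}
```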

IPC Classes  ?

  • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs
  • G06F 16/45 - Clustering; Classification
  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments
  • H04N 21/81 - Monomedia components thereof

91.

SYSTEMS AND METHODS FOR IMAGE COMPRESSION AT MULTIPLE, DIFFERENT BITRATES

      
Application Number 16249861
Status Pending
Filing Date 2019-01-16
First Publication Date 2020-07-16
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Schroers, Christopher
  • Doggett, Erika
  • Mandt, Stephan Marcel
  • Mcphillen, Jared
  • Labrozzi, Scott
  • Weber, Romann
  • Bamert, Mauro

Abstract

Systems and methods for predicting a target set of pixels are disclosed. In one embodiment, a method may include obtaining target content. The target content may include a target set of pixels to be predicted. The method may also include convolving the target set of pixels to generate an estimated set of pixels. The method may include matching a second set of pixels in the target content to the target set of pixels. The second set of pixels may be within a distance from the target set of pixels. The method may include refining the estimated set of pixels, using the second set of pixels in the target content, to generate a refined set of pixels.
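
The abstract's three steps are: convolve the target pixels into an estimate, match a nearby second set of pixels, and refine the estimate with that match. A toy NumPy sketch of those steps, with a box filter and a sum-of-absolute-differences block search standing in for whatever the system actually uses.

```python
import numpy as np

def box_blur(patch: np.ndarray) -> np.ndarray:
    """Cheap stand-in for the learned convolution: 3x3 box filter with edge padding."""
    padded = np.pad(patch, 1, mode="edge")
    return sum(padded[i:i + patch.shape[0], j:j + patch.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def best_match(image, top, left, h, w, search=4):
    """Find the nearby block most similar to the target block (simple SAD search)."""
    target = image[top:top + h, left:left + w]
    best, best_err = None, np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if (dy, dx) == (0, 0) or y < 0 or x < 0 or y + h > image.shape[0] or x + w > image.shape[1]:
                continue
            err = np.abs(image[y:y + h, x:x + w] - target).sum()
            if err < best_err:
                best, best_err = image[y:y + h, x:x + w], err
    return best

def predict_block(image, top, left, h, w, alpha=0.5):
    """Blend the convolved estimate with the matched neighbour block to refine it."""
    estimate = box_blur(image[top:top + h, left:left + w])
    neighbour = best_match(image, top, left, h, w)
    return estimate if neighbour is None else alpha * estimate + (1 - alpha) * neighbour

img = np.random.default_rng(0).random((32, 32))
print(predict_block(img, 8, 8, 4, 4).shape)  # (4, 4) refined block
```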

IPC Classes  ?

  • G06T 9/20 - Contour coding, e.g. using detection of edges

92.

SYSTEMS AND METHODS TO FACILITATE BI-DIRECTIONAL ARTIFICIAL INTELLIGENCE COMMUNICATIONS

      
Application Number 16829500
Status Pending
Filing Date 2020-03-25
First Publication Date 2020-07-16
Owner Disney Enterprises, Inc. (USA)
Inventor Goslin, Michael P.

Abstract

This disclosure presents systems and methods to facilitate artificial intelligence communications. One or more sensors may be configured to generate output signals conveying user behavior information and/or other information. The user behavior information may represent one or more communicative behaviors of a user in a real world. The one or more communicative behaviors may be interpreted based on communication structures of the one or more communicative behaviors to determine a meaning of a message conveyed by the one or more communicative behaviors. The communication structures may be categorized by structure type. The structure types may include one or more of a verbal type communication structure and/or a non-verbal type communication structure. Verbal type communication structure may refer to the conveyance of words via speech and/or non-speech communications. Non-verbal communication structure may refer to the conveyance of one or more of feelings, emotions, and/or impressions.

IPC Classes  ?

  • G06F 16/2452 - Query translation
  • G06N 5/02 - Knowledge representation
  • G10L 15/18 - Speech classification or search using natural language modelling
  • G10L 25/63 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use for comparison or discrimination for estimating an emotional state
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog

93.

Mounted displays that autorotate to match content on display

      
Application Number 16245901
Grant Number 10775838
Status In Force
Filing Date 2019-01-11
First Publication Date 2020-07-16
Grant Date 2020-09-15
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Patel, Mehul
  • Chapman, Steven M.
  • Popp, Joseph
  • Deuel, Matthew

Abstract

Some implementations of the disclosure are directed to automatically rotating displays to display media content based on metadata extracted from the media content that provides an indication of a target display orientation to display the media content. In one implementation, a method includes: detecting media content for display on a display, wherein the display is mounted on a rotatable display mount; extracting metadata from the detected media content, the extracted metadata providing an indication of a target display orientation to display the media content; using at least the extracted metadata, automatically causing the rotatable display mount to rotate the display to the target orientation; and displaying the media content on the rotated display.
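
The mechanism reduces to reading a target orientation from content metadata and rotating the mount to match before playback. A brief sketch under assumed metadata keys and orientation values; the actual metadata schema is not given in the abstract.

```python
def target_orientation(metadata: dict) -> int:
    """Read the target display orientation from content metadata.
    The 'orientation' key and the degree values here are assumptions for the sketch."""
    return {"landscape": 0, "portrait": 90}.get(metadata.get("orientation", "landscape"), 0)

def rotate_mount(current_deg: int, metadata: dict) -> int:
    """Return the mount rotation (degrees) needed before displaying the content."""
    desired = target_orientation(metadata)
    return (desired - current_deg) % 360

# A portrait-tagged clip arriving at a landscape-mounted display.
print(rotate_mount(0, {"orientation": "portrait"}))  # rotate the mount by 90 degrees
```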

IPC Classes  ?

  • G06F 1/16 - Constructional details or arrangements
  • G06F 16/783 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content

94.

SMART SPRAY PAINTING NOZZLE AND CALIBRATION METHOD FOR USE WITH MOBILE ROBOTS

      
Application Number 16246737
Status Pending
Filing Date 2019-01-14
First Publication Date 2020-07-16
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Beardsley, Paul A.
  • Wezel, Jan
  • Reusser, Dorothea

Abstract

A smart nozzle assembly includes a nozzle, a nozzle control mechanism, and a camera rigidly attached to the nozzle for use with a mobile robot in an autonomous spray painting system. The nozzle control mechanism is configured to control flowrate, control the shape of the spray pattern, mix two or more colors, and clean dried paint at the nozzle tip. The nozzle assembly further includes a processor for running software to manage or initiate the nozzle control mechanism's functionality and to provide the nozzle calibration. The calibration method for the nozzle uses a novel algorithm that measures the spray pattern, the distribution of paint within the spray pattern, and the relative position of the nozzle and camera. The distribution of paint within the spray pattern is measured in terms of physical quantity of delivered paint per unit area.

IPC Classes  ?

  • B05B 12/08 - Arrangements for controlling delivery; Arrangements for controlling the spray area responsive to condition of liquid or other fluent material discharged, of ambient medium or of target

95.

Systems and methods to provide views of a virtual space

      
Application Number 16241791
Grant Number 10839607
Status In Force
Filing Date 2019-01-07
First Publication Date 2020-07-09
Grant Date 2020-11-17
Owner Disney Enterprises, Inc. (USA)
Inventor Baumbach, Elliott

Abstract

Systems and methods configured to provide views of a virtual space are presented herein. A display device may include a display screen, one or more sensors, and/or other components. The one or more sensors may be configured to generate output signals conveying gaze information, including one or more of a viewpoint, a gaze direction, and/or other information. An instance of a virtual space may be executed to determine a view of the virtual space based on the gaze information and/or other information. The view of the virtual space may correspond to a field of view within the virtual space determined based on the gaze information. The instance of the virtual space may be presented on the display screen according to the determined view of the virtual space.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/02 - Viewing or reading apparatus
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

96.

PIXEL ERROR DETECTION SYSTEM

      
Application Number 16243650
Status Pending
Filing Date 2019-01-09
First Publication Date 2020-07-09
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Doggett, Erika
  • Wolak, Anna
  • Tsatsoulis, Penelope Daphne
  • Mccarthy, Nicholas
  • Mandt, Stephan

Abstract

A process receives, with a processor, video content. Further, the process splices, with the processor, the video content into a plurality of video frames. In addition, the process splices, with the processor, at least one of the plurality of video frames into a plurality of image patches. Moreover, the process performs, with a neural network, an image reconstruction of at least one of the plurality of image patches to generate a reconstructed image patch. The process also compares, with the processor, the reconstructed image patch with the at least one of the plurality of image patches. Finally, the process determines, with the processor, a pixel error within the at least one of the plurality of image patches based on a discrepancy between the reconstructed image patch and the at least one of the plurality of image patches.
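
The pipeline is: splice a frame into patches, reconstruct each patch, and flag pixels where the reconstruction diverges. The sketch below substitutes a simple smoothing filter for the neural network, which is enough to show the discrepancy test; patch size and threshold are assumptions.

```python
import numpy as np

def reconstruct(patch: np.ndarray) -> np.ndarray:
    """Stand-in for the neural-network reconstruction: a smoothed copy of the patch.
    A trained model would similarly fail to reproduce anomalous (dead/stuck) pixels."""
    padded = np.pad(patch, 1, mode="edge")
    return sum(padded[i:i + patch.shape[0], j:j + patch.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def pixel_errors(frame: np.ndarray, patch_size: int = 8, threshold: float = 0.5):
    """Splice a frame into patches, reconstruct each, and flag large discrepancies."""
    flagged = []
    h, w = frame.shape
    for top in range(0, h - patch_size + 1, patch_size):
        for left in range(0, w - patch_size + 1, patch_size):
            patch = frame[top:top + patch_size, left:left + patch_size]
            diff = np.abs(patch - reconstruct(patch))
            ys, xs = np.where(diff > threshold)
            flagged.extend((int(top + y), int(left + x)) for y, x in zip(ys, xs))
    return flagged

frame = np.zeros((32, 32))
frame[5, 7] = 1.0                      # a single stuck-bright pixel
print(pixel_errors(frame))             # -> [(5, 7)]
```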

IPC Classes  ?

  • G06T 7/00 - Image analysis
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

97.

System for generating augmented reality content from a perspective view of an unmanned aerial vehicle

      
Application Number 16513169
Grant Number 10706634
Status In Force
Filing Date 2019-07-16
First Publication Date 2020-07-07
Grant Date 2020-07-07
Owner Disney Enterprises, Inc. (USA)
Inventor
  • Baumbach, Elliott
  • Nocon, Nathan
  • Gibson, Hunter

Abstract

A system has an augmented reality device and an unmanned aerial vehicle. The augmented reality device captures, from a perspective view of the augmented reality device, real-world imagery of a user within an augmented reality environment. Further, the augmented reality device generates virtual imagery to overlay the real-world imagery captured by the augmented reality image capture device from the perspective view of the augmented reality device. Finally, the augmented reality device determines a position and an orientation of an augmented reality device accessory within a common coordinate space. Moreover, the unmanned aerial vehicle captures, from a perspective view of the unmanned aerial vehicle, real-world imagery of the user within the augmented reality environment. The unmanned aerial vehicle determines a position of the unmanned aerial vehicle within the common coordinate space. Further, the unmanned aerial vehicle generates virtual imagery to overlay the real-world imagery captured by the unmanned aerial vehicle.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

98.

Selective Reduction of Pixel Intensity to Enhance Energy Efficiency During Display of an Image

      
Application Number 16717879
Status Pending
Filing Date 2019-12-17
First Publication Date 2020-07-02
Owner Disney Enterprises, Inc. (USA)
Inventor Mendes De Souza, Alex

Abstract

According to one implementation, a system for enhancing energy efficiency during display of an image through selective reduction of pixel intensity includes a computing platform having a hardware processor and a memory storing a software code. The hardware processor is configured to execute the software code to receive a first image including multiple pixels and having a first display power consumption when displayed on a display, and to change the intensity of each of a predetermined subset of the pixels of the first image to a predetermined intensity to generate a second image. The second image has a second display power consumption when displayed on the display, the predetermined intensity being such that the second display power consumption is lower than the first display power consumption.
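
The operation itself is simple: set a predetermined subset of pixels to a fixed lower intensity so the second image draws less display power. A short sketch with a toy per-pixel power model; the mask pattern and the power model are assumptions, not the claimed selection logic.

```python
import numpy as np

def reduce_subset_intensity(image: np.ndarray, mask: np.ndarray, level: int = 0) -> np.ndarray:
    """Set a predetermined subset of pixels (given by a boolean mask) to a fixed
    lower intensity, producing a second image that needs less display power."""
    out = image.copy()
    out[mask] = level
    return out

def display_power_estimate(image: np.ndarray) -> float:
    """Toy power model: assume per-pixel power scales with intensity (roughly true for OLED)."""
    return float(image.sum())

rng = np.random.default_rng(0)
first = rng.integers(0, 256, size=(4, 4))
mask = np.zeros_like(first, dtype=bool)
mask[::2, ::2] = True                     # dim a fixed grid of pixels as the predetermined subset
second = reduce_subset_intensity(first, mask)
print(display_power_estimate(first), ">", display_power_estimate(second))
```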

IPC Classes  ?

  • G09G 5/10 - Intensity circuits
  • G06F 1/3234 - Power saving characterised by the action undertaken
  • G09G 5/02 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed

99.

SIMULATION EXPERIENCE WITH PHYSICAL OBJECTS

      
Application Number 16817217
Status Pending
Filing Date 2020-03-12
First Publication Date 2020-07-02
Owner DISNEY ENTERPRISES, INC. (USA)
Inventor
  • Arana, Mark
  • Havey, Benjamin F.
  • Drake, Edward
  • Chen, Alexander C.

Abstract

A method for generating a simulation with a simulation device is disclosed. The method includes receiving from a simulation adapter, a characteristics profile associated with a physical object connected to the simulation adapter; generating a simulation experience based on the characteristics profile of the physical object; receiving movement data from the simulation adapter corresponding to movement of the physical object; modifying a physical object representation in the simulation experience based on the movement data; and generating a simulation event including the modified physical object representation in the simulation experience based on the movement data.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
  • A63F 13/25 - Output arrangements for video game devices
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • A63F 13/80 - Special adaptations for executing a specific game genre or game mode

100.

Automated dynamic adaptive controls

      
Application Number 16147437
Grant Number 10695682
Status In Force
Filing Date 2018-09-28
First Publication Date 2020-06-30
Grant Date 2020-06-30
Owner DISNEY ENTERPRISES INC. (USA)
Inventor
  • Trowbridge, Robert Scott
  • Kalama, Asa K.
  • Huebner, Robert E.

Abstract

Systems and methods for automated dynamic adaptive control are disclosed herein. The system can include a simulation vehicle that can transit at least one participant through an entertainment experience from a starting position to a terminating position. The simulation vehicle can include a plurality of user controls. The system can include a processor that can: provide content to the at least one participant; and receive user inputs via the plurality of controls of the simulation vehicle. The processor can: affect the entertainment experience based on the received user inputs; identify an intervention based on a determined discrepancy between received user inputs and expected user inputs; and modify an effect of the user inputs on the entertainment experience according to the identified intervention.
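
The adaptive step compares received user inputs with expected inputs, identifies an intervention when they diverge, and modifies how the inputs affect the experience. A hedged sketch of that comparison and a simple blending response; the control names, tolerance, and blending rule are illustrative.

```python
def detect_intervention(received: dict, expected: dict, tolerance: float = 0.25) -> bool:
    """Flag an intervention when received control inputs diverge from expected inputs."""
    return any(abs(received.get(k, 0.0) - v) > tolerance for k, v in expected.items())

def apply_controls(received: dict, expected: dict, blend: float = 0.5) -> dict:
    """If an intervention is identified, blend the rider's inputs toward the expected
    inputs so the experience stays on track; otherwise pass them through unchanged."""
    if not detect_intervention(received, expected):
        return received
    return {k: blend * received.get(k, 0.0) + (1 - blend) * v for k, v in expected.items()}

expected = {"throttle": 0.8, "steering": 0.0}
print(apply_controls({"throttle": 0.1, "steering": 0.9}, expected))
```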

IPC Classes  ?

  • A63G 31/02 - Amusement arrangements with moving substructures
  • A63G 31/16 - Amusement arrangements creating illusions of travel