SAS Institute Inc.

United States of America


1-100 of 737 results for SAS Institute Inc.
Query: Excluding Subsidiaries
Aggregations
IP Type
        Patent 665
        Trademark 72
Jurisdiction
        United States 670
        Canada 32
        World 27
        Europe 8
Date
New (last 4 weeks) 1
2023 March (MTD) 1
2023 February 1
2023 January 3
2022 December 5
IPC Class
G06F 17/30 - Information retrieval; Database structures therefor 151
G06N 20/00 - Machine learning 68
H04L 29/08 - Transmission control procedure, e.g. data link level control procedure 68
G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU] 67
G06F 17/18 - Complex mathematical operations for evaluating statistical data 63
NICE Class
09 - Scientific and electric apparatus and instruments 45
16 - Paper, cardboard and goods made from these materials 27
42 - Scientific, technological and industrial services, research and design 24
38 - Telecommunications services 10
41 - Education, entertainment, sporting and cultural services 10
Status
Pending 15
Registered / In Force 722

1.

Process to Geographically Associate Potential Water Quality Stressors to Monitoring Stations

      
Application Number 17945428
Status Pending
Filing Date 2022-09-15
First Publication Date 2023-03-23
Owner SAS Institute Inc. (USA)
Inventor
  • Griffith, Philip David
  • Hodge, Andie
  • Lyall, Amir Naveed
  • Thomas, Kirby Ann
  • Valisekkagari, Srinivas Reddy
  • Wendt, Ryan Todd

Abstract

A computing device obtains data indicating a topography for an area comprising water and receives an indication of an identified data object representing a stressor to the area or a first monitoring station configurable to monitor the stressor. The computing device also determines a location for the identified data object in the topography and selects one or more related data objects to be related to the identified data object by determining a classification indicating whether the identified data object operates in water and selecting the one or more related data objects based on the location and the classification. The computing device also generates one or more controls for monitoring the area based on the selected one or more related data objects.
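
For intuition, the selection step can be sketched as a distance-and-classification filter. The haversine helper, the 10 km radius, the in_water flag, and the sample stressor/station records below are illustrative assumptions, not details taken from the filing.

    import math

    def haversine_km(a, b):
        """Great-circle distance in kilometers between two (lat, lon) points."""
        lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
        return 2 * 6371.0 * math.asin(math.sqrt(h))

    def select_related(identified, candidates, radius_km=10.0):
        """Pick candidates near the identified object, honoring its in-water classification."""
        related = []
        for c in candidates:
            close_enough = haversine_km(identified["location"], c["location"]) <= radius_km
            # If the stressor operates in water, only in-water monitoring stations qualify.
            compatible = (not identified["in_water"]) or c["in_water"]
            if close_enough and compatible:
                related.append(c)
        return related

    stressor = {"location": (35.78, -78.64), "in_water": True}
    stations = [
        {"name": "station-A", "location": (35.80, -78.60), "in_water": True},
        {"name": "station-B", "location": (35.90, -78.90), "in_water": False},
    ]
    print([s["name"] for s in select_related(stressor, stations)])   # -> ['station-A']

The selected stations would then drive whatever monitoring controls the device generates.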

IPC Classes

2.

Experiment Design Variants Evaluation Table GUI

      
Application Number 17883065
Status Pending
Filing Date 2022-08-08
First Publication Date 2023-02-09
Owner SAS Institute Inc. (USA)
Inventor
  • Lekivetz, Ryan Adam
  • Morgan, Joseph Albert
  • King, Caleb Bridges
  • Jones, Bradley Allen
  • Bailey, Mark Wallace
  • Rhyne, Jacob Davis

Abstract

An apparatus includes a processor to: generate variants of an experiment design based on varied parameters; for each variant, estimate terms based on the model, and derive an optimality value; present a table of the variants including a column for each varied parameter and a column for the optimality value, a row for each variant, and a bar graph for each column depicting a distribution of the values therein; present function controls operable to select a function to perform on row(s) of the table in response to selection of a bar of a bar graph of a column; in response to selection of a function, change the current function to the selected function; and in response to a selection of a bar of a bar graph of a column, perform the current function on row(s) based on instances of the value associated with selected bar.

IPC Classes

  • G06F 30/20 - Design optimisation, verification or simulation

3.

Quality Prediction Using Process Data

      
Application Number 17944291
Status Pending
Filing Date 2022-09-14
First Publication Date 2023-01-26
Owner SAS Institute Inc. (USA)
Inventor
  • Kakde, Deovrat Vijay
  • Wang, Haoyu
  • Mcguirk, Anya Mary

Abstract

A computing device accesses a machine learning model trained on training data of first bonding operations (e.g., a ball and/or stitch bond). The first bonding operations comprise operations to bond a first set of multiple wires to a first set of surfaces. The machine learning model is trained by supervised learning. The device receives input data indicating process data generated from measurements of second bonding operations. The second bonding operations comprise operations to bond a second set of multiple wires to a second set of surfaces. The device weights the input data according to the machine learning model. The device generates an anomaly predictor indicating a risk for an anomaly occurrence in the second bonding operations based on weighting the input data according to the machine learning model. The device outputs the anomaly predictor to control the second bonding operations.
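
As a rough illustration of the weighting-and-scoring flow, the sketch below trains a supervised model on synthetic process measurements and uses its predicted probability as the anomaly risk that drives a control decision. The logistic-regression choice, the synthetic features, and the 0.5 threshold are assumptions for the example, not the patented method.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical process measurements from the first (training) bonding operations,
    # e.g. bond force, ultrasonic power, bond time, with labeled anomaly outcomes.
    X_train = rng.normal(size=(500, 3))
    y_train = (X_train[:, 0] + 0.5 * X_train[:, 1]
               + rng.normal(scale=0.3, size=500) > 1.2).astype(int)

    # Supervised training; the fitted coefficients act as the learned weights on the inputs.
    model = LogisticRegression().fit(X_train, y_train)

    # Process data measured during the second bonding operations.
    X_new = rng.normal(size=(5, 3))
    anomaly_risk = model.predict_proba(X_new)[:, 1]

    # Output the predictor to control the second operations, e.g. flag risky bonds.
    for i, risk in enumerate(anomaly_risk):
        action = "pause and inspect" if risk > 0.5 else "continue"
        print(f"bond {i}: risk={risk:.2f} -> {action}")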

IPC Classes

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • H01L 21/66 - Testing or measuring during manufacture or treatment
  • G06N 20/00 - Machine learning

4.

QUALITY PREDICTION USING PROCESS DATA

      
Application Number US2022013319
Publication Number 2023/003595
Status In Force
Filing Date 2022-01-21
Publication Date 2023-01-26
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Kakde, Deovrat Vijay
  • Wang, Haoyu
  • Mcguirk, Anya Mary

Abstract

A computing device (2002) accesses a machine learning model (2050) trained on training data (2032) of first bonding operations (1308, 2040A) (e.g., a ball and/or stitch bond). The first bonding operations comprise operations to bond a first set of wires (1504) to a first set of surfaces (1506, 1508). The machine learning model is trained by supervised learning. The device receives input data (2070) indicating process data (2074) generated from measurements of second bonding operations (2040B). The second bonding operations comprise operations to bond a second set of wires to a second set of surfaces. The device weights the input data according to the machine learning model. The device generates an anomaly predictor (2052) indicating a risk for an anomaly occurrence in the second bonding operations based on weighting the input data according to the machine learning model. The device outputs the anomaly predictor to control the second bonding operations.

IPC Classes

  • G06E 1/00 - Devices for processing exclusively digital data

5.

Automated streaming data model generation with parallel processing capability

      
Application Number 17879893
Grant Number 11550643
Status In Force
Filing Date 2022-08-03
First Publication Date 2023-01-10
Grant Date 2023-01-10
Owner SAS Institute Inc. (USA)
Inventor
  • Enck, Steven William
  • Cavalier, Charles Michael
  • Gauby, Sarah Jeanette
  • Kolodzieski, Scott Joseph

Abstract

An event stream processing (ESP) model is read that describes computational processes. (A) An event block object is received. (B) A new measurement value, a timestamp value, and a sensor identifier are extracted. (C) An in-memory data store is updated with the new measurement value, the timestamp value, and the sensor identifier. (A) through (C) are repeated until an output update time is reached. When the output update time is reached, data stored in the in-memory data store is processed and updated using data enrichment windows to define enriched data values that are output. The data enrichment windows include a gate window before each window that uses values computed by more than one window. The gate window sends a trigger to a next window when each value of the more than one window has been computed. The enrichment windows are included in the ESP model.
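
The gate-window behavior can be pictured as a small synchronization object that forwards values downstream only once every upstream window has reported. The class and method names below are invented for illustration and are not taken from the ESP model described in the filing.

    class GateWindow:
        """Releases a trigger to the next window only after every upstream value arrives."""
        def __init__(self, upstream_names, next_window):
            self.expected = set(upstream_names)
            self.received = {}
            self.next_window = next_window

        def on_value(self, name, value):
            self.received[name] = value
            if self.expected.issubset(self.received):
                # All upstream computations are done: forward the combined values once.
                self.next_window.process(dict(self.received))
                self.received.clear()

    class EnrichmentWindow:
        def process(self, values):
            print("enriching with", values)

    gate = GateWindow(["rolling_mean", "rolling_max"], EnrichmentWindow())
    gate.on_value("rolling_mean", 4.2)   # nothing fires yet
    gate.on_value("rolling_max", 9.9)    # both present -> downstream window fires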

IPC Classes

  • G06F 9/54 - Interprogram communication
  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt

6.

TWO-LEVEL PARALLELIZATION OF GOODNESS-OF-FIT TESTS FOR SPATIAL PROCESS MODELS

      
Application Number 17535745
Status Pending
Filing Date 2021-11-26
First Publication Date 2022-12-29
Owner SAS Institute Inc. (USA)
Inventor Mohan, Pradeep

Abstract

An apparatus includes processor(s) to: receive a request to test goodness-of-fit of a spatial process model; generate a KD tree from observed spatial point dataset including locations within a region at which instances of an event occurred; derive, from the observed spatial point dataset, multiple quadrats into which the region is divided; receive, from multiple processors, current levels of availability of processing resources including quantities of currently available execution threads; select, based on the quantity of currently available execution threads, a subset of the multiple processors to perform multiple iterations of a portion of the test in parallel; provide, to each processor of the subset, the KD tree, the spatial process model, and the multiple quadrats; receive, from each processor of the subset, per-quadrat data portions indicative of results of an iteration; derive a goodness-of-fit statistic from the per-quadrat data portions; and transmit an indication of goodness-of-fit to another device.
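
A single-process simplification of the per-quadrat goodness-of-fit check: count events per quadrat and compare the counts to a fitted null model with a chi-square statistic. The homogeneous-Poisson null, the 4x4 grid, and the synthetic points are assumptions, and the KD tree and the two-level distribution of quadrat work across processors are omitted here.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Observed spatial point dataset: event locations within a unit-square region.
    points = rng.uniform(size=(400, 2))

    # Divide the region into a 4x4 grid of quadrats and count events per quadrat.
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=4, range=[[0, 1], [0, 1]])
    observed = counts.ravel()

    # Under a fitted homogeneous Poisson model the expected count is equal in every quadrat.
    expected = np.full(observed.shape, points.shape[0] / observed.size)

    # Each quadrat's contribution is independent, which is what makes the per-quadrat work
    # easy to spread across execution threads in a two-level parallel scheme.
    chi2, p_value = stats.chisquare(observed, expected)
    print(f"chi-square = {chi2:.2f}, p-value = {p_value:.3f}")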

IPC Classes

  • G06F 30/20 - Design optimisation, verification or simulation
  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]

7.

Feature Storage Manager

      
Application Number 17847361
Status Pending
Filing Date 2022-06-23
First Publication Date 2022-12-29
Owner SAS Institute Inc. (USA)
Inventor
  • Kaczynski, Piotr
  • Maksymiuk, Aneta
  • Skalski, Artur Lukasz
  • Stobieniecka, Wioletta Paulina
  • Dwivedi, Dwijendra Nath

Abstract

A computing system obtains a first preconfigured feature set. The first preconfigured feature set defines: a first feature definition defining an input variable, and first computer instructions for locating first data. The first data is available for retrieval because it is stored, or set up to arrive, in the feature storage according to the first preconfigured feature set. The computing system receives a request for a data set for the input variable. The computing system generates an availability status indicating whether the requested data set is available for retrieval according to the first preconfigured feature set. Based on the availability status, the computing system generates the requested data set by: retrieving historical data for the first preconfigured feature set; retrieving a data definition associated with the historical data; and generating the requested data based on the historical data and the data definition.

IPC Classes

8.

User interface creation system

      
Application Number 17721427
Grant Number 11537366
Status In Force
Filing Date 2022-04-15
First Publication Date 2022-12-27
Grant Date 2022-12-27
Owner SAS Institute Inc. (USA)
Inventor
  • Jirak, Karen Christine
  • Matthews, II, Edward Fredrick
  • Yang, James Chunan

Abstract

A computing device creates a user interface application. A user interface (UI) tag is read in a UI application. The UI tag is executed to identify a UI template tag. The identified UI template tag is executed to define a top-level container initializer for the UI application and to define a plurality of widget initializers for inclusion in a top-level container rendered using the top-level container initializer. The top-level container is rendered in a display using the top-level container initializer. Each widget of a plurality of widgets in the rendered top-level container is rendered using the defined plurality of widget initializers to create a UI.

IPC Classes

  • G06F 8/38 - Creation or generation of source code for implementing user interfaces
  • G06F 8/36 - Software reuse

9.

Bias mitigating machine learning training system

      
Application Number 17837444
Grant Number 11531845
Status In Force
Filing Date 2022-06-10
First Publication Date 2022-12-20
Grant Date 2022-12-20
Owner SAS Institute Inc. (USA)
Inventor
  • Hunt, Xin Jiang
  • Wu, Xinmin
  • Abbey, Ralph Walter

Abstract

A computing device trains a fair machine learning model. A prediction model is trained to predict a target value. For a number of iterations, a weight vector is computed using the bound value based on fairness constraints defined for a fairness measure type; a weight value is assigned to each observation vector based on the target value and a sensitive attribute value; the prediction model is trained with each weighted observation vector to predict the target value; and a conditional moments vector is computed based on the fairness constraints and the target and sensitive attribute values. Conditional moments difference values are computed. When the conditional moments difference values indicate to adjust the bound value, the bound value is updated and the process is repeated with the bound value replaced with the updated bound value until the conditional moments difference values indicate no further adjustment of the bound value is needed.
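
The iterate-reweight-retrain loop can be caricatured as below: measure a fairness gap, reweight observations using both the label and the sensitive attribute, and retrain until the gap is small. The demographic-parity gap stands in for the conditional-moments machinery, and the synthetic data, step size, and 0.02 tolerance are all illustrative assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000
    s = rng.integers(0, 2, size=n)              # sensitive attribute (group 0 / group 1)
    x = rng.normal(size=(n, 2)) + s[:, None]    # features correlated with the group
    y = (x[:, 0] + rng.normal(scale=0.5, size=n) > 1.0).astype(int)

    weights = np.ones(n)
    step = 0.5                                   # crude stand-in for the bound value
    for _ in range(20):
        model = LogisticRegression().fit(x, y, sample_weight=weights)
        pred = model.predict(x)
        # Demographic-parity gap: difference in positive-prediction rates between groups.
        gap = pred[s == 1].mean() - pred[s == 0].mean()
        if abs(gap) < 0.02:
            break
        hi, lo = (1, 0) if gap > 0 else (0, 1)   # hi = currently over-favored group
        # Weight each observation using both its target value and its sensitive attribute.
        weights[(s == lo) & (y == 1)] *= 1 + step * abs(gap)
        weights[(s == hi) & (y == 0)] *= 1 + step * abs(gap)
        weights *= n / weights.sum()             # keep the total weight stable

    print(f"demographic-parity gap after reweighting: {gap:.3f}")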

IPC Classes

  • G06N 20/00 - Machine learning
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

10.

TECHNIQUES FOR IMAGE CONTENT EXTRACTION

      
Application Number 17889801
Status Pending
Filing Date 2022-08-17
First Publication Date 2022-12-08
Owner SAS Institute Inc. (USA)
Inventor
  • Wheaton, David James
  • Cooke, III, Stuart Dakari
  • Nadolski, William Robert

Abstract

Embodiments are directed to techniques for image content extraction. Some embodiments include extracting contextually structured data from document images, such as by automatically identifying document layout, document data, document metadata, and/or correlations therebetween in a document image, for instance. Some embodiments utilize breakpoints to enable the system to match different documents with internal variations to a common template. Several embodiments include extracting contextually structured data from table images, such as gridded and non-gridded tables. Many embodiments are directed to generating and utilizing a document template database for automatically extracting document image contents into a contextually structured format. Several embodiments are directed to automatically identifying and associating document metadata with corresponding document data in a document image to generate a machine-facilitated annotation of the document image. In some embodiments, the machine-facilitated annotation may be used to generate a template for the template database.

IPC Classes

  • G06T 7/00 - Image analysis
  • G06F 16/81 - Indexing, e.g. XML tags; Data structures therefor; Storage structures
  • G06F 16/93 - Document management systems
  • G06F 40/284 - Lexical analysis, e.g. tokenisation or collocates
  • G06F 40/186 - Templates
  • G06F 40/169 - Annotation, e.g. comment data or footnotes
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06V 10/40 - Extraction of image or video features

11.

Automated control of a manufacturing process

      
Application Number 17854264
Grant Number 11531907
Status In Force
Filing Date 2022-06-30
First Publication Date 2022-11-24
Grant Date 2022-12-20
Owner SAS Institute Inc. (USA)
Inventor
  • Oroojlooyjadid, Afshin
  • Nazari, Mohammadreza
  • Hajinezhad, Davood
  • Dizche, Amirhassan Fallah
  • Silva, Jorge Manuel Gomes Da
  • Walker, Jonathan Lee
  • Desai, Hardi
  • Blanchard, Robert
  • Valsaraj, Varunraj
  • Zhang, Ruiwen
  • Wang, Weichen
  • Liu, Ye
  • Azizsoltani, Hamoon
  • Mookiah, Prathaban

Abstract

A computing device trains a machine state predictive model. A generative adversarial network with an autoencoder is trained using a first plurality of observation vectors. Each observation vector of the first plurality of observation vectors includes state variable values for state variables and an action variable value for an action variable. The state variables define a machine state, wherein the action variable defines a next action taken in response to the machine state. The first plurality of observation vectors successively defines sequential machine states to manufacture a product. A second plurality of observation vectors is generated using the trained generative adversarial network with the autoencoder. A machine state machine learning model is trained to predict a subsequent machine state using the first plurality of observation vectors and the generated second plurality of observation vectors. A description of the machine state machine learning model is output.

IPC Classes

  • G06N 5/02 - Knowledge representation; Symbolic representation

12.

Quality prediction using process data

      
Application Number 17581113
Grant Number 11501116
Status In Force
Filing Date 2022-01-21
First Publication Date 2022-11-15
Grant Date 2022-11-15
Owner SAS Institute Inc. (USA)
Inventor
  • Kakde, Deovrat Vijay
  • Wang, Haoyu
  • Mcguirk, Anya Mary

Abstract

A computing device accesses a machine learning model trained on training data of first bonding operations (e.g., a ball and/or stitch bond). The first bonding operations comprise operations to bond a first set of multiple wires to a first set of surfaces. The machine learning model is trained by supervised learning. The device receives input data indicating process data generated from measurements of second bonding operations. The second bonding operations comprise operations to bond a second set of multiple wires to a second set of surfaces. The device weights the input data according to the machine learning model. The device generates an anomaly predictor indicating a risk for an anomaly occurrence in the second bonding operations based on weighting the input data according to the machine learning model. The device outputs the anomaly predictor to control the second bonding operations.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • H01L 21/66 - Testing or measuring during manufacture or treatment
  • G06N 20/00 - Machine learning

13.

Leveraging text profiles to select and configure models for use with textual datasets

      
Application Number 17858634
Grant Number 11501547
Status In Force
Filing Date 2022-07-06
First Publication Date 2022-11-15
Grant Date 2022-11-15
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Jade, Teresa S.
  • Li, Xiao
  • Zuo, Chunqi
  • Kovach, Paul Jeffrey

Abstract

Text profiles can be leveraged to select and configure models according to some examples described herein. In one example, a system can analyze a reference textual dataset and a target textual dataset using text-mining techniques to generate a first text profile and a second text profile, respectively. The first text profile can contain first metrics characterizing the reference textual dataset and the second text profile can contain second metrics characterizing the target textual dataset. The system can determine a similarity value by comparing the first text profile to the second text profile. The system can also receive a user selection of a model that is to be applied to the target textual dataset. The system can then generate an insight relating to an anticipated accuracy of the model on the target textual dataset based on the similarity value. The system can output the insight to the user.
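
A toy version of the text-profile comparison: compute a few corpus-level metrics for a reference and a target dataset and score their similarity. The particular metrics and the cosine comparison are assumptions made for this sketch; the filing's profiles and insight generation are richer than this.

    import math
    import re

    def text_profile(text):
        """Tiny text profile: a few metrics characterizing a textual dataset."""
        tokens = re.findall(r"[a-z']+", text.lower())
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        return {
            "avg_sentence_len": len(tokens) / max(len(sentences), 1),
            "type_token_ratio": len(set(tokens)) / max(len(tokens), 1),
            "avg_word_len": sum(map(len, tokens)) / max(len(tokens), 1),
        }

    def profile_similarity(p1, p2):
        """Cosine similarity between the two profiles' metric vectors."""
        v1 = [p1[k] for k in sorted(p1)]
        v2 = [p2[k] for k in sorted(p2)]
        dot = sum(a * b for a, b in zip(v1, v2))
        norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
        return dot / norm

    reference = "The quarterly report shows revenue growth. Margins improved across regions."
    target = "Revenue grew again this quarter. Regional margins were stronger than expected."
    sim = profile_similarity(text_profile(reference), text_profile(target))
    print(f"profile similarity: {sim:.3f}")
    # A high similarity suggests a model tuned on the reference text should transfer well.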

IPC Classes

  • G06F 16/335 - Filtering based on additional data, e.g. user or group profiles
  • G06V 30/19 - Recognition using electronic means
  • G06F 40/10 - Text processing

14.

Graphical user interface for visualizing contributing factors to a machine-learning model's output

      
Application Number 17747139
Grant Number 11501084
Status In Force
Filing Date 2022-05-18
First Publication Date 2022-11-15
Grant Date 2022-11-15
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Soleimani, Reza
  • Leeman-Munk, Samuel Paul
  • Cox, James Allen
  • Styles, David Blake

Abstract

In one example, a system can execute a first machine-learning model to determine an overall classification for a textual dataset. The system can also determine classification scores indicating the level of influence that each token in the textual dataset had on the overall classification. The system can select a first subset of the tokens based on their classification scores. The system can also execute a second machine-learning model to determine probabilities that the textual dataset falls into various categories. The system can determine category scores indicating the level of influence that each token had on a most-likely category determination. The system can select a second subset of the tokens based on their category scores. The system can then generate a first visualization depicting the first subset of tokens color-coded to indicate their classification scores and a second visualization depicting the second subset of tokens color-coded to indicate their category scores.

IPC Classes

15.

Interactive Graphical User Interface for Monitoring Computer Models

      
Application Number 17860501
Status Pending
Filing Date 2022-07-08
First Publication Date 2022-11-10
Owner SAS Institute Inc. (USA)
Inventor
  • Roberts, Terisa
  • Katiyar, Vipul Manoj
  • Malani, Amol Kishor

Abstract

A computing system establishes a hierarchy for monitoring model(s). The hierarchy comprises an association between each of multiple measures of a measure level of the hierarchy and intermediate level(s) of the hierarchy. An intermediate level comprises one or more of a measurement category or analysis type. The hierarchy comprises an association between the intermediate level(s) and at least one model. The system monitors the model(s) by generating health measurements. Each of the health measurements corresponds to one of the multiple measures. Each of the health measurements indicates a performance of a monitored model according to a measurement category or analysis type associated in the hierarchy with the respective measure of the multiple measures. The system generates a visualization in a graphical user interface. The visualization comprises a graphical representation of an indication of a health measurement for each of measure(s), and associations in the hierarchy.

IPC Classes

  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 11/34 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation
  • G06T 11/20 - Drawing from basic elements, e.g. lines or circles
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

16.

Flexible program functions usable for customizing execution of a sequential Monte Carlo process in relation to a state space model

      
Application Number 17730476
Grant Number 11501041
Status In Force
Filing Date 2022-04-27
First Publication Date 2022-11-03
Grant Date 2022-11-15
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Chen, Xilong
  • Zhao, Yang
  • Kabisa, Sylvie T.
  • Elsheimer, David Bruce

Abstract

One example described herein involves a system receiving task data and distribution criteria for a state space model from a client device. The task data can indicate a type of sequential Monte Carlo (SMC) task to be implemented. The distribution criteria can include an initial distribution, a transition distribution, and a measurement distribution for the state space model. The system can generate a set of program functions based on the task data and the distribution criteria. The system can then execute an SMC module to generate a distribution and a corresponding summary, where the SMC module is configured to call the set of program functions during execution of an SMC process and apply the results returned from the set of program functions in one or more subsequent steps of the SMC process. The system can then transmit an electronic communication to the client device indicating the distribution and its corresponding summary.
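
The "program functions" idea can be illustrated with a bootstrap particle filter that only touches the state space model through three user-supplied callbacks for the initial, transition, and measurement distributions. The Gaussian model, particle count, and function names are assumptions for the sketch, not the filing's actual interface.

    import numpy as np

    rng = np.random.default_rng(0)

    # User-supplied functions describing the state space model (all illustrative).
    def init_dist(n):                        # initial distribution
        return rng.normal(0.0, 1.0, size=n)

    def transition_dist(states):             # transition distribution
        return 0.9 * states + rng.normal(0.0, 0.5, size=states.shape)

    def measurement_loglik(states, y):       # measurement distribution (log-likelihood)
        return -0.5 * ((y - states) / 0.4) ** 2

    def run_smc(observations, n_particles=1000):
        """Bootstrap particle filter driven entirely by the three callbacks above."""
        particles = init_dist(n_particles)
        summaries = []
        for y in observations:
            particles = transition_dist(particles)
            logw = measurement_loglik(particles, y)
            w = np.exp(logw - logw.max())
            w /= w.sum()
            # Resample, then summarize the filtered distribution for this time step.
            particles = rng.choice(particles, size=n_particles, p=w)
            summaries.append(particles.mean())
        return summaries

    obs = [0.2, 0.5, 0.9, 1.1, 0.8]
    print([f"{m:.2f}" for m in run_smc(obs)])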

IPC Classes

  • G06F 30/23 - Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
  • G06F 111/10 - Numerical modelling
  • G06F 30/27 - Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model

17.

Speech segmentation based on combination of pause detection and speaker diarization

      
Application Number 17851264
Grant Number 11538481
Status In Force
Filing Date 2022-06-28
First Publication Date 2022-10-20
Grant Date 2022-12-27
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Li, Xiaolong
  • Henderson, Samuel Norris
  • Cheng, Xiaozhuo
  • Yang, Xu

Abstract

An apparatus includes at least one processor to, in response to a request to perform speech-to-text conversion: perform a pause detection technique including analyzing speech audio to identify pauses, and analyzing lengths of the pauses to identify likely sentence pauses; perform a speaker diarization technique including dividing the speech audio into fragments, analyzing vocal characteristics of speech sounds of each fragment to identify a speaker of a set of speakers, and identifying instances of a change in speakers between each temporally consecutive pair of fragments to identify likely speaker changes; and perform speech-to-text operations including dividing the speech audio into segments based on at least the likely sentence pauses and likely speaker changes, using at least an acoustic model with each segment to identify likely speech sounds in the speech audio, and generating a transcript of the speech audio based at least on the likely speech sounds.
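
A bare-bones illustration of the combination step: merge likely sentence-pause times with speaker-change times into segment boundaries. The timestamps and the 0.25-second merge gap are invented, and the acoustic model and transcript generation are left out.

    def segment_speech(duration, sentence_pauses, speaker_changes, min_gap=0.25):
        """Merge pause-based and diarization-based boundaries into speech segments."""
        boundaries = sorted(set(sentence_pauses) | set(speaker_changes))
        merged = []
        for b in boundaries:
            # Drop a boundary that falls too close to the previous one.
            if not merged or b - merged[-1] >= min_gap:
                merged.append(b)
        cuts = [0.0] + merged + [duration]
        return list(zip(cuts[:-1], cuts[1:]))

    # Times in seconds, e.g. from pause detection and speaker diarization respectively.
    pauses = [4.8, 9.1, 15.0]
    changes = [9.2, 20.4]
    for start, end in segment_speech(25.0, pauses, changes):
        print(f"segment {start:.1f}s - {end:.1f}s")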

IPC Classes

  • G10L 15/04 - Segmentation; Word boundary detection
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/26 - Speech to text systems
  • G10L 25/78 - Detection of presence or absence of voice signals
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks
  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit

18.

DYNAMIC PER-NODE PRE-PULLING IN DISTRIBUTED COMPUTING

      
Application Number 17560656
Status Pending
Filing Date 2021-12-23
First Publication Date 2022-10-13
Owner SAS Institute Inc. (USA)
Inventor Steadman, Jody Bridges

Abstract

An apparatus includes a processor to: receive an indication of ability of a node device to provide a resource for executing application routines, at least one identifier of at least one image including an executable routine stored within a cache of the node device, and an indication of at least one revision level of the at least one image; analyze the ability to provide the resource; in response to being able to support execution of the application routine, identify a first image in a repository; compare identifiers to determine whether there is a second image including a matching executable routine; in response to a match, compare revision levels; and in response to the revision level of the most recent version of the first image being more recent, retrieve the most recent version of the first image from the repository, and store it within the node device.
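
The pre-pull decision reduces to a comparison of cached revision levels against the repository, gated on whether the node can support the workload. The dictionaries and CPU check below are illustrative stand-ins for the messages and resource indications the abstract describes.

    def plan_pre_pull(node_cache, repository, free_cpu, required_cpu=1.0):
        """Return the images the node should pre-pull, with their repository revisions."""
        if free_cpu < required_cpu:          # node cannot support the application routine
            return []
        to_pull = []
        for name, repo_revision in repository.items():
            cached_revision = node_cache.get(name)
            # Pull when the image is missing or the repository holds a newer revision.
            if cached_revision is None or repo_revision > cached_revision:
                to_pull.append((name, repo_revision))
        return to_pull

    node_cache = {"scoring-image": 3, "etl-image": 7}
    repository = {"scoring-image": 5, "etl-image": 7, "report-image": 1}
    print(plan_pre_pull(node_cache, repository, free_cpu=2.0))
    # -> [('scoring-image', 5), ('report-image', 1)]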

IPC Classes

  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
  • G06F 16/535 - Filtering based on additional data, e.g. user or group profiles

19.

MACHINE-LEARNING TECHNIQUES FOR AUTOMATICALLY IDENTIFYING TOPS OF GEOLOGICAL LAYERS IN SUBTERRANEAN FORMATIONS

      
Application Number US2021051596
Publication Number 2022/216311
Status In Force
Filing Date 2021-09-22
Publication Date 2022-10-13
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Peredriy, Sergiy
  • Holdaway, Keith Richard

Abstract

Tops of geological layers can be automatically identified using machine-learning techniques as described herein. In one example, a system can receive well log records associated with wellbores drilled through geological layers. The system can generate well clusters by applying a clustering process to the well log records. The system can then obtain a respective set of training data associated with a well cluster, train a machine-learning model based on the respective set of training data, select a target well-log record associated with a target wellbore of the well cluster, and provide the target well-log record as input to the trained machine-learning model. Based on an output from the trained machine-learning model, the system can determine the geological tops of the geological layers in a region surrounding the target wellbore. The system may then transmit an electronic signal indicating the geological tops of the geological layers associated with the target wellbore.

IPC Classes

  • G06F 16/587 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
  • G06F 16/909 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
  • G01V 1/40 - Seismology; Seismic or acoustic prospecting or detecting specially adapted for well-logging
  • G06F 16/387 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location

20.

User interfaces for converting node-link data into audio outputs

      
Application Number 17717661
Grant Number 11460973
Status In Force
Filing Date 2022-04-11
First Publication Date 2022-10-04
Grant Date 2022-10-04
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Mealin, Sean Patrick
  • Summers, II, Claude Edward
  • Soltys, II, Mitchel Stanley
  • Marshall, Jr., Ralph Johnson
  • Sookne, Jesse Daniel
  • Smith, Brice Joseph
  • Kraus, Gregory David
  • Bolender, Eric Colin
  • Langston, Julianna Elizabeth
  • Robinson, Lisa Beth Morton

Abstract

Node-link data can be converted into audio outputs. For example, a system can generate a graphical user interface (GUI) depicting a node-link diagram having nodes and links. The GUI can include a virtual reference point in the node-link diagram and a virtual control element that is rotatable around the virtual reference point by a user to contact one or more of the nodes in the node-link diagram. The system can receive user input for rotating the virtual control element around the virtual reference point, which can generate a contact between the virtual control element and a particular node of the node-link diagram. In response to detecting the contact, the system can determine a sound characteristic configured to indicate an attribute associated with the particular node. The system can then generate a sound having the sound characteristic, for example to assist the user in exploring the node-link diagram.

IPC Classes

  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/14 - Digital output to display device

21.

AUTOMATED MACHINE LEARNING TEST SYSTEM

      
Application Number 17840745
Status Pending
Filing Date 2022-06-15
First Publication Date 2022-09-29
Owner SAS Institute Inc. (USA)
Inventor
  • Gardner, Steven Joseph
  • Dunbar, Connie Stout
  • Elsheimer, David Bruce
  • Dunbar, Gregory Scott
  • Griffin, Joshua David
  • Gao, Yan

Abstract

A computing device selects new test configurations for testing software. (A) First test configurations are generated using a random seed value. (B) Software under test is executed with the first test configurations to generate a test result for each. (C) Second test configurations are generated from the first test configurations and the test results generated for each. (D) The software under test is executed with the second test configurations to generate the test result for each. (E) When a restart is triggered based on a distance metric value computed between the second test configurations, a next random seed value is selected as the random seed value and (A) through (E) are repeated. (F) When the restart is not triggered, (C) through (F) are repeated until a stop criterion is satisfied. (G) When the stop criterion is satisfied, the test result is output for each test configuration.
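
A toy search loop with the same shape as steps (A) through (G): configurations are generated from a random seed, evolved using their test results, restarted with a new seed when a spread-based distance metric says they have collapsed, and stopped otherwise. The evaluate and generate functions and every threshold here are invented for illustration.

    import random

    def evaluate(config):
        """Stand-in for running the software under test; smaller is better."""
        return (config["threads"] - 6) ** 2 + (config["batch"] - 40) ** 2

    def generate(parent, rng):
        """Derive a new test configuration from a parent by small perturbations."""
        return {"threads": max(1, parent["threads"] + rng.choice([-1, 0, 1])),
                "batch": max(1, parent["batch"] + rng.choice([-8, 0, 8]))}

    def config_spread(configs):
        """Distance metric across configurations; a tiny spread triggers a restart."""
        threads = [c["threads"] for c in configs]
        batch = [c["batch"] for c in configs]
        return (max(threads) - min(threads)) + (max(batch) - min(batch))

    seed, results = 0, []
    for _ in range(5):                                   # bounded number of restarts
        rng = random.Random(seed)
        configs = [{"threads": rng.randint(1, 16), "batch": rng.randint(8, 128)}
                   for _ in range(8)]
        for _ in range(20):
            results.extend((c, evaluate(c)) for c in configs)
            best_parents = sorted(configs, key=evaluate)[:4]
            configs = [generate(p, rng) for p in best_parents for _ in range(2)]
            if config_spread(configs) < 3:               # configurations collapsed: restart
                break
        else:
            break                                        # stop criterion: no restart needed
        seed += 1                                        # next random seed value

    best = min(results, key=lambda r: r[1])
    print("best configuration:", best[0], "score:", best[1])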

IPC Classes

  • G06F 11/36 - Preventing errors by testing or debugging of software

22.

SPEECH-TO-ANALYTICS FRAMEWORK WITH SUPPORT FOR LARGE N-GRAM CORPORA

      
Application Number CN2021082572
Publication Number 2022/198474
Status In Force
Filing Date 2021-03-24
Publication Date 2022-09-29
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Yang, Xu
  • Li, Xiaolong
  • Wilsey, Biljana Belamaric
  • Liu, Haipeng
  • Peterson, Jared

Abstract

An apparatus includes processor(s) to: generate a set of candidate n-grams based on probability distributions from an acoustic model for candidate graphemes of a next word most likely spoken following at least one preceding word spoken within speech audio; provide the set of candidate n-grams to multiple devices; provide, to each node device, an indication of which candidate n-grams are to be searched for within the n-gram corpus by each node device to enable searches for multiple candidate n-grams to be performed, independently and at least partially in parallel, across the node devices; receive, from each node device, an indication of a probability of occurrence of at least one candidate n-gram within the speech audio; based on the received probabilities of occurrence, identify the next word most likely spoken within the speech audio; and add the next word most likely spoken to a transcript of the speech audio.

IPC Classes

  • G10L 15/32 - Multiple recognisers used in sequence or in parallel; Score combination systems therefor, e.g. voting systems
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
  • G10L 15/04 - Segmentation; Word boundary detection
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/187 - Phonemic context, e.g. pronunciation rules, phonotactical constraints or phoneme n-grams
  • G10L 15/197 - Probabilistic grammars, e.g. word n-grams

23.

Directed acyclic graph machine learning system

      
Application Number 17522062
Grant Number 11443198
Status In Force
Filing Date 2021-11-09
First Publication Date 2022-09-13
Grant Date 2022-09-13
Owner SAS Institute, Inc. (USA)
Inventor
  • Chen, Xilong
  • Huang, Tao
  • Chvosta, Jan

Abstract

A computing device learns a directed acyclic graph (DAG). An SSCP matrix is computed from variable values defined for observation vectors. A topological order vector is initialized that defines a topological order for the variables. A loss value is computed using the topological order vector and the SSCP matrix. (A) A neighbor determination method is selected. (B) A next topological order vector is determined relative to the initialized topological order vector using the neighbor determination method. (C) A loss value is computed using the next topological order vector and the SSCP matrix. (D) (B) and (C) are repeated until each topological order vector is determined in (B) based on the neighbor determination method. A best topological vector is determined from each next topological order vector based on having a minimum value for the computed loss value. An adjacency matrix is computed using the best topological vector and the SSCP matrix.
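
The order-search portion can be illustrated as follows: a topological order is scored by regressing each variable on its predecessors using only the SSCP matrix, and adjacent swaps serve as the neighbor determination method. The simulated three-variable chain, the residual-variance loss (which implicitly assumes comparable error variances), and the starting order are assumptions, and the final adjacency-matrix computation is omitted.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000

    # Simulated data from a known chain DAG x0 -> x1 -> x2 (illustrative only).
    x0 = rng.normal(size=n)
    x1 = x0 + rng.normal(size=n)
    x2 = x1 + rng.normal(size=n)
    X = np.column_stack([x0, x1, x2])
    X = X - X.mean(axis=0)

    sscp = X.T @ X                          # sums-of-squares-and-crossproducts matrix

    def order_loss(order, sscp, n):
        """Sum of residual variances when each variable is regressed on its predecessors."""
        loss = 0.0
        for i, v in enumerate(order):
            preds = list(order[:i])
            s_vv = sscp[v, v]
            if preds:
                s_pp = sscp[np.ix_(preds, preds)]
                s_pv = sscp[np.ix_(preds, [v])]
                s_vv -= float(s_pv.T @ np.linalg.solve(s_pp, s_pv))
            loss += s_vv / n
        return loss

    # Local search over topological orders with adjacent swaps as the neighbor method.
    order = [2, 0, 1]                        # deliberately wrong starting order
    best = order_loss(order, sscp, n)
    improved = True
    while improved:
        improved = False
        for i in range(len(order) - 1):
            cand = order[:]
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            loss = order_loss(cand, sscp, n)
            if loss < best:
                order, best, improved = cand, loss, True

    print("best topological order:", order, "loss:", round(best, 3))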

IPC Classes

  • G06N 5/02 - Knowledge representation; Symbolic representation
  • G06F 17/16 - Matrix or vector computation

24.

Machine-learning techniques for automatically identifying tops of geological layers in subterranean formations

      
Application Number 17481839
Grant Number 11435499
Status In Force
Filing Date 2021-09-22
First Publication Date 2022-09-06
Grant Date 2022-09-06
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Peredriy, Sergiy
  • Holdaway, Keith Richard

Abstract

Tops of geological layers can be automatically identified using machine-learning techniques as described herein. In one example, a system can receive well log records associated with wellbores drilled through geological layers. The system can generate well clusters by applying a clustering process to the well log records. The system can then obtain a respective set of training data associated with a well cluster, train a machine-learning model based on the respective set of training data, select a target well-log record associated with a target wellbore of the well cluster, and provide the target well-log record as input to the trained machine-learning model. Based on an output from the trained machine-learning model, the system can determine the geological tops of the geological layers in a region surrounding the target wellbore. The system may then transmit an electronic signal indicating the geological tops of the geological layers associated with the target wellbore.

IPC Classes

  • E21B 49/00 - Testing the nature of borehole walls; Formation testing; Methods or apparatus for obtaining samples of soil or well fluids, specially adapted to earth drilling or wells
  • G01V 99/00 - Subject matter not provided for in other groups of this subclass
  • G06N 20/00 - Machine learning

25.

Tabular data generation for machine learning model training system

      
Application Number 17559735
Grant Number 11436438
Status In Force
Filing Date 2021-12-22
First Publication Date 2022-09-06
Grant Date 2022-09-06
Owner SAS Institute Inc. (USA)
Inventor
  • Zhang, Ruiwen
  • Wang, Weichen
  • Gomes Da Silva, Jorge Manuel
  • Liu, Ye
  • Azizsoltani, Hamoon
  • Mookiah, Prathaban

Abstract

(A) Conditional vectors are defined. (B) Latent observation vectors are generated using a predefined noise distribution function. (C) A forward propagation of a generator model is executed with the conditional vectors and the latent observation vectors as input to generate an output vector. (D) A forward propagation of a decoder model of a trained autoencoder model is executed with the generated output vector as input to generate a plurality of decoded vectors. (E) Transformed observation vectors are selected from transformed data based on the defined plurality of conditional vectors. (F) A forward propagation of a discriminator model is executed with the transformed observation vectors, the conditional vectors, and the decoded vectors as input to predict whether each transformed observation vector and each decoded vector is real or fake. (G) The discriminator and generator models are updated and (A) through (G) are repeated until training is complete.

IPC Classes

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06N 3/04 - Architecture, e.g. interconnection topology

26.

Bias mitigating machine learning training system

      
Application Number 17557298
Grant Number 11436444
Status In Force
Filing Date 2021-12-21
First Publication Date 2022-09-06
Grant Date 2022-09-06
Owner SAS Institute Inc. (USA)
Inventor
  • Wu, Xinmin
  • Hunt, Xin Jiang

Abstract

A computing device trains a fair machine learning model. A prediction model is trained to predict a target value. For a number of iterations, a weight vector is computed using the bound value based on fairness constraints defined for a fairness measure type; a weight value is assigned to each observation vector based on the target value and a sensitive attribute value; the prediction model is trained with each weighted observation vector to predict the target value; and a conditional moments vector is computed based on the fairness constraints and the target and sensitive attribute values. Conditional moments difference values are computed. When the conditional moments difference values indicate to adjust the bound value, the bound value is updated and the process is repeated with the bound value replaced with the updated bound value until the conditional moments difference values indicate no further adjustment of the bound value is needed.

IPC Classes

  • G06N 20/00 - Machine learning
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

27.

Leveraging text profiles to select and configure models for use with textual datasets

      
Application Number 17565824
Grant Number 11423680
Status In Force
Filing Date 2021-12-30
First Publication Date 2022-08-23
Grant Date 2022-08-23
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Jade, Teresa S.
  • Li, Xiao
  • Zuo, Chunqi
  • Kovach, Paul Jeffrey

Abstract

Text profiles can be leveraged to select and configure models according to some examples described herein. In one example, a system can analyze a reference textual dataset and a target textual dataset using text-mining techniques to generate a first text profile and a second text profile, respectively. The first text profile can contain first metrics characterizing the reference textual dataset and the second text profile can contain second metrics characterizing the target textual dataset. The system can determine a similarity value by comparing the first text profile to the second text profile. The system can also receive a user selection of a model that is to be applied to the target textual dataset. The system can then generate an insight relating to an anticipated accuracy of the model on the target textual dataset based on the similarity value. The system can output the insight to the user.

IPC Classes

  • G06F 16/335 - Filtering based on additional data, e.g. user or group profiles
  • G06V 30/19 - Recognition using electronic means
  • G06F 40/10 - Text processing

28.

Automated job flow generation to provide object views in container-supported many task computing

      
Application Number 17733196
Grant Number 11544110
Status In Force
Filing Date 2022-04-29
First Publication Date 2022-08-18
Grant Date 2023-01-03
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Zhang, Chaowang “ricky”

Abstract

An apparatus includes a processor to receive a request to provide a view of an object associated with a job flow, and in response to determining that the object is associated with a task type requiring access to a particular resource not accessible to a first interpretation routine: store, within a job queue, a job flow generation request message to cause generation of a job flow definition that defines another job flow for generating the requested view; within a task container in which a second interpretation routine that does have access to the particular resource is executed, generate the job flow definition; store, within a task queue, a job flow generation completion message that includes a copy of the job flow definition; use the job flow definition to perform the other job flow to generate the requested view; and transmit the requested view to the requesting device.

IPC Classes

  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt

29.

Tabular data generation with attention for machine learning model training system

      
Application Number 17560474
Grant Number 11416712
Status In Force
Filing Date 2021-12-23
First Publication Date 2022-08-16
Grant Date 2022-08-16
Owner SAS Institute, Inc. (USA)
Inventor
  • Dizche, Amirhassan Fallah
  • Liu, Ye
  • Hunt, Xin Jiang
  • Gomes Da Silva, Jorge Manuel

Abstract

A computing device generates synthetic tabular data. Until a convergence parameter value indicates that training of an attention generator model is complete, conditional vectors are defined; latent vectors are generated using a predefined noise distribution function; a forward propagation of an attention generator model that includes an attention model integrated with a conditional generator model is executed to generate output vectors; transformed observation vectors are selected; a forward propagation of a discriminator model is executed with the transformed observation vectors, the conditional vectors, and the output vectors to predict whether each transformed observation vector and each output vector is real or fake; a discriminator model loss value is computed based on the predictions; the discriminator model is updated using the discriminator model loss value; an attention generator model loss value is computed based on the predictions; and the attention generator model is updated using the attention generator model loss value.

IPC Classes

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06N 3/04 - Architecture, e.g. interconnection topology

30.

Automated virtual machine resource management in container-supported many task computing

      
Application Number 17733090
Grant Number 11544109
Status In Force
Filing Date 2022-04-29
First Publication Date 2022-08-11
Grant Date 2023-01-03
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Zhang, Chaowang “ricky”

Abstract

An apparatus includes a processor to: receive a request to perform a job flow; within a performance container, based on the data dependencies among a set of tasks of the job flow, derive an order of performance of the set of tasks that includes a subset able to be performed in parallel, and derive a quantity of task containers to enable the parallel performance of the subset; based on the derived quantity of task containers, derive a quantity of virtual machines (VMs) to enable the parallel performance of the subset; provide, to a VM allocation routine, an indication of a need for provision of the quantity of VMs; and store, within a task queue, multiple task routine execution request messages to enable parallel execution of task routines within the quantity of task containers to cause the parallel performance of the subset.

IPC Classes

  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt

31.

Automated trending input recognition and assimilation in forecast modeling

      
Application Number 17554281
Grant Number 11409966
Status In Force
Filing Date 2021-12-17
First Publication Date 2022-08-09
Grant Date 2022-08-09
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Phand, Anand Arun
  • Guhaneogi, Sudeshna
  • Veeraraghavan, Narender Ceechamangalam
  • Chauhan, Ravinder Singh
  • Bhat, Shikha
  • Khandwe, Kaustubh Yashvant
  • Sinha, Shalini
  • Roy, Vineet
  • Asadullina, Alina Olegovna
  • Plekhanov, Vitaly Igorevich
  • Lavrenova, Elizaveta Alekseevna
  • Bodunov, Dmitry Sergeevich
  • Kubaeva, Assol Raufjonovna
  • Ondrik, Stephen Joseph
  • Schlüter, Steffen-Horst
  • Martino, Joseph Michael
  • Zhao, John Zhiqiang
  • Bhalerao, Pravinkumar
  • Larina, Valentina

Abstract

An apparatus includes a processor to: analyze a data set to identify a candidate topic not in a set of topics; determine whether the prominence of the candidate topic within the data set meets a threshold; in response to meeting the threshold, retrieve a rate of increase in frequency of the candidate topic in online searches; in response to meeting a threshold rate of increase, retrieve the keyword most frequently used in online searches for the candidate topic, use the keyword to retrieve a supplemental data set, and analyze input data extracted from the supplemental data set to determine whether the candidate topic can change the accuracy of a forecast model; and in response to determining that the candidate topic can change the accuracy, add the candidate topic to the set of topics and replace the forecast model with a forecast model trained for the set of topics augmented with the candidate topic.

IPC Classes

32.

Speech-to-analytics framework with support for large n-gram corpora

      
Application Number 17370441
Grant Number 11404053
Status In Force
Filing Date 2021-07-08
First Publication Date 2022-08-02
Grant Date 2022-08-02
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Cheng, Xiaozhuo
  • Yang, Xu
  • Li, Xiaolong
  • Wilsey, Biljana Belamaric
  • Liu, Haipeng
  • Peterson, Jared

Abstract

An apparatus includes processor(s) to: generate a set of candidate n-grams based on probability distributions from an acoustic model for candidate graphemes of a next word most likely spoken following at least one preceding word spoken within speech audio; provide the set of candidate n-grams to multiple devices; provide, to each node device, an indication of which candidate n-grams are to be searched for within the n-gram corpus by each node device to enable searches for multiple candidate n-grams to be performed, independently and at least partially in parallel, across the node devices; receive, from each node device, an indication of a probability of occurrence of at least one candidate n-gram within the speech audio; based on the received probabilities of occurrence, identify the next word most likely spoken within the speech audio; and add the next word most likely spoken to a transcript of the speech audio.

IPC Classes

  • G06N 3/02 - Neural networks
  • G06N 7/00 - Computing arrangements based on specific mathematical models
  • G10L 15/04 - Segmentation; Word boundary detection
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/18 - Speech classification or search using natural language modelling
  • G10L 15/197 - Probabilistic grammars, e.g. word n-grams
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications

33.

Recommendation system with implicit feedback

      
Application Number 17715214
Grant Number 11544767
Status In Force
Filing Date 2022-04-07
First Publication Date 2022-07-28
Grant Date 2023-01-03
Owner SAS Institute Inc. (USA)
Inventor
  • Liao, Xuejun
  • Koch, Patrick Nathan

Abstract

A computing device determines a recommendation. A confidence matrix is computed using a predefined weight value. (A) A first parameter matrix is updated using the confidence matrix, a predefined response matrix, a first step-size parameter value, and a first direction matrix. The predefined response matrix includes a predefined response value by each user to each item and at least one matrix value for which a user has not provided a response to an item. (B) A second parameter matrix is updated using the confidence matrix, the predefined response matrix, a second step-size parameter value, and a second direction matrix. (C) An objective function value is updated based on the first and second parameter matrices. (D) The first and second parameter matrices are trained by repeating (A) through (C). The first and second parameter matrices are output for use in predicting a recommended item for a requesting user.
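
For intuition, here is a standard confidence-weighted matrix factorization for implicit feedback, fitted by alternating least squares rather than the step-size and direction-matrix updates the abstract describes. The toy response matrix, the weight alpha, the rank, and the regularization value are all assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Implicit-feedback response matrix R: 1 where a user responded to an item, else 0.
    R = (rng.random((6, 5)) > 0.6).astype(float)
    alpha = 40.0                      # predefined weight value
    C = 1.0 + alpha * R               # confidence matrix: observed responses weigh more

    k, lam = 3, 0.1
    U = rng.normal(scale=0.1, size=(R.shape[0], k))   # user parameter matrix
    V = rng.normal(scale=0.1, size=(R.shape[1], k))   # item parameter matrix

    def als_step(fixed, C_rows, R_rows, lam):
        """Weighted least-squares update for one side of the factorization."""
        out = np.empty((C_rows.shape[0], fixed.shape[1]))
        for i in range(C_rows.shape[0]):
            Cu = np.diag(C_rows[i])
            A = fixed.T @ Cu @ fixed + lam * np.eye(fixed.shape[1])
            b = fixed.T @ Cu @ R_rows[i]
            out[i] = np.linalg.solve(A, b)
        return out

    for _ in range(10):               # alternate until the parameter matrices settle
        U = als_step(V, C, R, lam)
        V = als_step(U, C.T, R.T, lam)

    scores = U @ V.T
    user = 0
    unseen = np.where(R[user] == 0)[0]
    if unseen.size:
        print("recommend item", unseen[np.argmax(scores[user, unseen])], "to user", user)
    else:
        print("user", user, "has already responded to every item")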

IPC Classes

  • G06Q 30/06 - Buying, selling or leasing transactions
  • G06F 17/16 - Matrix or vector computation
  • G06Q 30/02 - Marketing; Price estimation or determination; Fundraising

34.

Federated area coherency across multiple devices in many-task computing

      
Application Number 17682783
Grant Number 11474863
Status In Force
Filing Date 2022-02-28
First Publication Date 2022-06-23
Grant Date 2022-10-18
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Zhang, Chaowang “ricky”

Abstract

An apparatus includes a processor to: derive an order of performance of a set of tasks of a job flow; based on the order of performance, store, within a task queue, a first task routine execution request message to cause a first task to be performed; within a first task container, and in response to storage of the first task routine execution request message, execute instructions of a first task routine of a set of task routines, store a mid-flow data set output of the first task within a federated area, and store a first task completion message within the task queue after completion of storage of the mid-flow data set; and in response to the storage of the first task completion message, and based on the order of performance, store, within the task queue, a second task routine execution request message to cause a second task to be performed.

IPC Classes

  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt

35.

AUTOMATED MACHINE LEARNING TEST SYSTEM

      
Application Number 17523607
Status Pending
Filing Date 2021-11-10
First Publication Date 2022-06-23
Owner SAS Institute Inc. (USA)
Inventor
  • Gao, Yan
  • Griffin, Joshua David
  • Lin, Yu-Min
  • Pederson, Bengt Wisen
  • Tharrington, Jr., Ricky Dee
  • Tan, Pei-Yi
  • Wright, Raymond Eugene

Abstract

A computing device selects new test configurations for testing software. Software under test is executed with first test configurations to generate a test result for each test configuration. Each test configuration includes a value for each test parameter where each test parameter is an input to the software under test. A predictive model is trained using each test configuration of the first test configurations in association with the test result generated for each test configuration based on an objective function value. The predictive model is executed with second test configurations to predict the test result for each test configuration of the second test configurations. Test configurations are selected from the second test configurations based on the predicted test results to define third test configurations. The software under test is executed with the defined third test configurations to generate the test result for each test configuration of the third test configurations.

IPC Classes

  • G06N 20/20 - Ensemble learning
  • G06N 7/00 - Computing arrangements based on specific mathematical models

36.

Handling bulk requests for resources

      
Application Number 17670104
Grant Number 11366699
Status In Force
Filing Date 2022-02-11
First Publication Date 2022-06-21
Grant Date 2022-06-21
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Gulcu, Altan
  • Yao, Xiaodong

Abstract

Some examples described herein relate to handling bulk requests for resources. In one example, a system can determine a bulk request parameter-value associated with a bulk request. The system can then predict a baseline benefit value, which can be a benefit value when the bulk request parameter-value is used as a lower boundary for a unit parameter-value. The system can also determine a lower boundary constraint on the unit parameter-value independently of the bulk request parameter-value. The system can then execute an iterative process using the baseline benefit value and the lower boundary constraint. Based on a result of the iterative process, the system can determine whether, and by how much, the bulk request parameter-value should be adjusted. The system may adjust the bulk request parameter-value accordingly or output a recommendation to do so.

IPC Classes

  • G06F 9/46 - Multiprogramming arrangements
  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
  • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines

37.

Causal inference and policy optimization system based on deep learning models

      
Application Number 17507376
Grant Number 11354566
Status In Force
Filing Date 2021-10-21
First Publication Date 2022-06-07
Grant Date 2022-06-07
Owner SAS Institute Inc. (USA)
Inventor
  • Chen, Xilong
  • Cairns, Douglas Allan
  • Chvosta, Jan
  • Elsheimer, David Bruce
  • Zhao, Yang
  • Chang, Ming-Chun
  • Walton, Gunce Eryuruk
  • Lamm, Michael Thomas

Abstract

A value is computed for the predefined parameter of interest using the computed influence function value.

IPC Classes

  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G16H 50/50 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
  • G06N 5/04 - Inference or reasoning models
  • G06N 3/00 - Computing arrangements based on biological models

38.

Coordinated performance controller failover in many task computing

      
Application Number 17563697
Grant Number 11513850
Status In Force
Filing Date 2021-12-28
First Publication Date 2022-05-26
Grant Date 2022-11-29
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Gong, Qing

Abstract

An apparatus includes a processor to: within a first container, and prior to its uninstantiation, execute a first instance of a routine to cause the processor to monitor for and detect a job performance request in a job queue; and within a second container, execute a second instance of the routine to cause the processor to search the job queue for a job performance request, and in response to a combination of the uninstantiation of the first container, the storage of the job performance request in the job queue, and there being no indication of completion of the job flow in the job queue, store an indication in the job queue that performance of the job flow has commenced, derive an order of performance of the set of tasks of the job flow, and store a first task execution request in a task queue.

IPC Classes  ?

  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]

39.

Risk Monitoring Approach for a Micro Service

      
Application Number 17527889
Status Pending
Filing Date 2021-11-16
First Publication Date 2022-05-19
Owner SAS Institute Inc. (USA)
Inventor
  • Roberts, Terisa
  • Katiyar, Vipul Manoj
  • Malani, Amol Kishor

Abstract

A computer-implemented system and method includes a visualization in a graphical user interface providing a circular hierarchical modeling view for monitoring the health of a model at each circular hierarchy level of a plurality of circular hierarchy levels. The visualization includes a first presentation of health rules of the model at each hierarchy level of the circular hierarchy, the health rules comprising a measure of a category, an analysis type, and an analysis model, and a second presentation of one or more health indicators at each hierarchy level. The health indicators comprise one or more colors in the visualization representing an indication of a health goal or a health status of the model.

IPC Classes  ?

  • G06T 11/20 - Drawing from basic elements, e.g. lines or circles
  • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
  • G06T 11/00 - 2D [Two Dimensional] image generation

40.

Automatic spatial regression system

      
Application Number 17524406
Grant Number 11328225
Status In Force
Filing Date 2021-11-11
First Publication Date 2022-05-10
Grant Date 2022-05-10
Owner SAS Institute Inc. (USA)
Inventor
  • Wu, Guohui
  • Chvosta, Jan
  • Xu, Wan
  • Walton, Gunce Eryuruk
  • Chen, Xilong

Abstract

A computing device selects a trained spatial regression model. A spatial weights matrix defined for observation vectors is selected, where each element of the spatial weights matrix indicates an amount of influence between respective pairs of observation vectors. Each observation vector is spatially referenced. A spatial regression model is selected from spatial regression models, initialized, and trained using the observation vectors and the spatial weights matrix to fit a response variable using regressor variables. Each observation vector includes a response value for the response variable and a regressor value for each regressor variable of the regressor variables. A fit criterion value is computed for the spatial regression model and the spatial regression model selection, initialization, and training are repeated until each spatial regression model is selected. A best spatial regression model is selected and output as the spatial regression model having an extremum value of the fit criterion value.

IPC Classes  ?

41.

Recommendation system

      
Application Number 17386853
Grant Number 11379743
Status In Force
Filing Date 2021-07-28
First Publication Date 2022-05-05
Grant Date 2022-07-05
Owner SAS Institute Inc. (USA)
Inventor
  • Liao, Xuejun
  • Koch, Patrick Nathan
  • Huang, Shunping
  • Xu, Yan

Abstract

A computing device determines a recommendation. (A) A first parameter matrix is updated using a first direction matrix and a first step-size parameter value that is greater than one. The first parameter matrix includes a row dimension equal to a number of users of a plurality of users included in a ratings matrix and the ratings matrix includes a missing matrix value. (B) A second parameter matrix is updated using a second direction matrix and a second step-size parameter value that is greater than one. The second parameter matrix includes a column dimension equal to a number of items of a plurality of items included in the ratings matrix. (C) An objective function value is updated based on the first parameter matrix and the second parameter matrix. (D) (A) through (C) are repeated until the first parameter matrix and the second parameter matrix satisfy a convergence test.
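
As a rough illustration of the alternating update scheme, the sketch below factorizes a ratings matrix with missing values using plain gradient steps on the two parameter matrices. The small fixed learning rate, squared-error objective, and convergence test are illustrative stand-ins for the direction matrices and step-size parameters greater than one described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ratings matrix with missing entries (np.nan marks a missing matrix value).
R = np.array([[5.0, 3.0, np.nan, 1.0],
              [4.0, np.nan, np.nan, 1.0],
              [1.0, 1.0, np.nan, 5.0],
              [np.nan, 1.0, 5.0, 4.0]])
mask = ~np.isnan(R)
R_filled = np.where(mask, R, 0.0)

n_users, n_items = R.shape
k = 2  # latent dimension
U = rng.normal(scale=0.1, size=(n_users, k))   # first parameter matrix (rows = users)
V = rng.normal(scale=0.1, size=(n_items, k))   # second parameter matrix (columns = items)

lr, tol, prev_obj = 0.01, 1e-6, np.inf
for _ in range(5000):
    E = mask * (U @ V.T - R_filled)       # error on observed entries only
    U -= lr * (E @ V)                     # update first parameter matrix
    V -= lr * (E.T @ U)                   # update second parameter matrix
    obj = 0.5 * np.sum(E ** 2)            # objective function value
    if abs(prev_obj - obj) < tol:         # simple convergence test
        break
    prev_obj = obj

print(np.round(U @ V.T, 2))  # completed ratings, including predictions for missing values
```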

IPC Classes  ?

42.

Computer-vision techniques for time-series recognition and analysis

      
Application Number 17518274
Grant Number 11321954
Status In Force
Filing Date 2021-11-03
First Publication Date 2022-05-03
Grant Date 2022-05-03
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Lee, Taiyeong
  • Leonard, Michael James

Abstract

Some examples herein describe time-series recognition and analysis techniques with computer vision. In one example, a system can access an image depicting data lines representing time series datasets. The system can execute a clustering process to assign pixels in the image to pixel clusters. The system can generate image masks based on attributes of the pixel clusters, and identify a respective set of line segments defining the respective data line associated with each image mask. The system can determine pixel sets associated with the time series datasets based on the respective set of line segments associated with each image mask, and provide one or more pixel sets as input for a computing operation that processes the pixel sets and returns a processing result. The system may then display the processing result on a display device or perform another task based on the processing result.

IPC Classes  ?

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06V 30/41 - Analysis of document content
  • G06T 7/90 - Determination of colour characteristics
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06T 11/20 - Drawing from basic elements, e.g. lines or circles
  • G06T 7/13 - Edge detection
  • G06T 7/162 - Segmentation; Edge detection involving graph-based methods
  • G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
  • G06V 10/48 - Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
  • G06V 10/72 - Data preparation, e.g. statistical preprocessing of image or video features
  • G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks

43.

Diagnostic techniques for monitoring physical devices and resolving operational events

      
Application Number 17501218
Grant Number 11322976
Status In Force
Filing Date 2021-10-14
First Publication Date 2022-05-03
Grant Date 2022-05-03
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Anderson, Thomas Dale
  • Sharma, Priyadarshini
  • Konya, Mark Joseph
  • Caton, James M.

Abstract

Operational events associated with a target physical device can be detected for mitigation by implementing some aspects described herein. For example, a system can apply a sliding window to received sensor measurements at successive time intervals to generate a set of data windows. The system can determine a set of eigenvectors associated with the set of data windows by performing principal component analysis on a set of data points in the set of data windows. The system can determine a set of angle changes between pairs of eigenvectors. The system can generate a measurement profile by executing an integral transform on the set of angle changes. One or more trained machine-learning models are configured to detect an operational event associated with the target physical device based on the measurement profile and generate an output indicating the operational event.
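
A compact sketch of the windowed eigenvector-angle idea follows: sliding windows, principal component analysis per window, angle changes between consecutive leading eigenvectors, and a discrete Fourier transform standing in for the integral transform. The simulated two-channel signal and all parameter choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated sensor measurements: two correlated channels whose correlation
# structure flips halfway through, mimicking an operational event.
n = 400
t = np.arange(n)
x = np.sin(0.1 * t) + 0.1 * rng.normal(size=n)
y = np.where(t < 200, x, -x) + 0.1 * rng.normal(size=n)
data = np.column_stack([x, y])

window, step = 50, 10
eigvecs = []
for start in range(0, n - window + 1, step):        # sliding window of data points
    w = data[start:start + window]
    cov = np.cov(w, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)                 # principal component analysis
    eigvecs.append(vecs[:, -1])                      # leading eigenvector of the window

# Angle change between eigenvectors of consecutive windows
# (abs() makes the angle insensitive to the arbitrary sign of an eigenvector).
angles = []
for a, b in zip(eigvecs, eigvecs[1:]):
    c = abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))
    angles.append(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))

# Integral transform of the angle-change series (here: magnitude of the DFT)
# produces the measurement profile a downstream model could score.
profile = np.abs(np.fft.rfft(angles))
print(np.round(profile[:5], 2))
```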

IPC Classes  ?

  • H02J 13/00 - Circuit arrangements for providing remote indication of network conditions, e.g. an instantaneous record of the open or closed condition of each circuitbreaker in the network; Circuit arrangements for providing remote control of switching means in a power distribution network, e.g. switching in and out of current consumers by using a pulse code signal carried by the network
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

44.

Incremental singular value decomposition in support of machine learning

      
Application Number 17504634
Grant Number 11314844
Status In Force
Filing Date 2021-10-19
First Publication Date 2022-04-26
Grant Date 2022-04-26
Owner SAS Institute Inc. (USA)
Inventor
  • Jiang, Hansi
  • Chaudhuri, Arin

Abstract

A singular value decomposition (SVD) is computed of a first matrix to define a left matrix, a diagonal matrix, and a right matrix. The left matrix, the diagonal matrix, and the right matrix are updated using an arrowhead matrix structure defined from the diagonal matrix and by adding a next observation vector to a last row of the first matrix. The updated left matrix, the updated diagonal matrix, and the updated right matrix are updated using a diagonal-plus-rank-one (DPR1) matrix structure defined from the updated diagonal matrix and by removing an observation vector from a first row of the first matrix. Eigenpairs of the DPR1 matrix are computed based on whether a value computed from the updated left matrix is positive or negative. The left matrix updated in (C), the diagonal matrix updated in (C), and the right matrix updated in (C) are output.

IPC Classes  ?

45.

Automatically generating rules for event detection systems

      
Application Number 17225238
Grant Number 11354583
Status In Force
Filing Date 2021-04-08
First Publication Date 2022-04-21
Grant Date 2022-06-07
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Azizsoltani, Hamoon
  • Mookiah, Prathaban
  • Wang, Weichen
  • O'Connell, Thomas J.

Abstract

Logical rules can be automatically generated for use with event detection systems according to some aspects of the present disclosure. For example, a computing device can extract a group of logical rules from trained decision trees and apply a test data set to the group of logical rules to determine count values corresponding to the logical rules. The computing device can then determine performance metric values based on the count values, select a subset of logical rules from among the group of logical rules based on the performance metric values, and provide at least one logical rule in the subset for use with an event detection system. The event detection system can be configured to detect an event in relation to a target data set that was not used to train the decision trees.

IPC Classes  ?

  • G06N 5/02 - Knowledge representation; Symbolic representation
  • G06N 5/00 - Computing arrangements using knowledge-based models
  • G06F 11/34 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation
  • G06F 11/36 - Preventing errors by testing or debugging of software
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

46.

Neural network training system

      
Application Number 17499972
Grant Number 11403527
Status In Force
Filing Date 2021-10-13
First Publication Date 2022-04-14
Grant Date 2022-08-02
Owner SAS Institute Inc. (USA)
Inventor
  • Wu, Xinmin
  • Wang, Yingjian
  • Hu, Xiangqian

Abstract

A computing device trains a neural network machine learning model. A forward propagation of a first neural network is executed. A backward propagation of the first neural network is executed from a last layer to a last convolution layer to compute a gradient vector. A discriminative localization map is computed for each observation vector with the computed gradient vector using a discriminative localization map function. An activation threshold value is selected for each observation vector from at least two different values based on a prediction error of the first neural network. A biased feature map is computed for each observation vector based on the activation threshold value selected for each observation vector. A masked observation vector is computed for each observation vector using the biased feature map. A forward and a backward propagation of a second neural network is executed a predefined number of iterations using the masked observation vector.

IPC Classes  ?

  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries

47.

Method to increase discovery pipeline hit rates and lab to field translation

      
Application Number 17483093
Grant Number 11373121
Status In Force
Filing Date 2021-09-23
First Publication Date 2022-04-14
Grant Date 2022-06-28
Owner SAS Institute Inc. (USA)
Inventor
  • Gottula, John Wesley
  • Mutell, Bryan Matthew
  • Henderson, II, Michael Lee

Abstract

The computing device transforms lab data and field data into a first format suitable for execution with a supervised machine learning model to determine an input variable importance for a first set of input variables in predicting a field outcome. Based on the determination, the computing device generates one or more logical rules of decision metrics, selects the one or more input variables that yields a higher input variable importance, and generates one or more pass-fail indicators. The computing device combines the one or more pass-fail indicators and generates one or more prediction factor rules. The computing device transforms the field data and the one or more prediction factor rules into a second format suitable for execution with a model to determine a treatment effect for the one or more prediction factor rules. The computing device selects the prediction factor rule that maximizes the treatment effect.

IPC Classes  ?

48.

Implicit status in many task computing

      
Application Number 17558237
Grant Number 11455190
Status In Force
Filing Date 2021-12-21
First Publication Date 2022-04-14
Grant Date 2022-09-27
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Gong, Qing

Abstract

An apparatus includes a processor to: within a performance container, execute a performance routine to derive an order of performance of tasks of a job flow based on dependencies, begin performing the tasks, and store, within a job queue, a job performance status indication including task performance statuses; identify a set of sub flows within the job flow based on branches in the job flow; correlate each of the task performance statuses to a corresponding sub flow performance status; reduce the job performance status indication size by, for each sub flow in which all tasks have been completed, replace the corresponding task performance statuses with the corresponding sub flow performance status of completed, and for each sub flow with no task performed, replace the corresponding task performance statuses with the corresponding sub flow performance status of not executed; and transmit the job performance status indication to the requesting device.

IPC Classes  ?

  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
  • H05B 6/06 - Control, e.g. of temperature, of power
  • H05B 6/12 - Cooking devices

49.

Dataset overlap query system

      
Application Number 17529483
Grant Number 11301473
Status In Force
Filing Date 2021-11-18
First Publication Date 2022-04-12
Grant Date 2022-04-12
Owner SAS Institute Inc. (USA)
Inventor Swain, Pradeep Kumar

Abstract

A computing device responds to a membership overlap query. A list of unique member identifiers included in a plurality of datasets is created. A list of datasets of the plurality of datasets is defined for each unique member identifier. Each dataset included in the list of datasets includes a unique member associated with a respective unique member identifier. A unique list of datasets is defined from each list of datasets. A number of occurrences of each unique list of datasets is determined. A number of datasets included in each unique list of datasets is determined. Intersection data is created that includes a dataset list of each unique list of datasets in association with the number of occurrences of each respective, unique list of datasets and with the number of datasets included in each respective, unique list of datasets. An overlap response is determined using the created intersection data.
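
The bookkeeping behind the intersection data can be sketched in a few lines of standard-library Python; the three hypothetical datasets and the final A-and-B overlap query are illustrative only.

```python
from collections import Counter, defaultdict

# Hypothetical member identifiers per dataset.
datasets = {
    "A": {"m1", "m2", "m3"},
    "B": {"m2", "m3", "m4"},
    "C": {"m3", "m5"},
}

# List of datasets containing each unique member identifier.
membership = defaultdict(list)
for name, members in datasets.items():
    for m in members:
        membership[m].append(name)

# Unique dataset lists, number of occurrences of each, and dataset count per list.
intersection = Counter(tuple(sorted(v)) for v in membership.values())
for dataset_list, occurrences in intersection.items():
    print(dataset_list, "members:", occurrences, "datasets:", len(dataset_list))

# Example overlap response: how many members appear in both A and B (in any combination)?
overlap_ab = sum(n for lst, n in intersection.items() if {"A", "B"} <= set(lst))
print("members shared by A and B:", overlap_ab)
```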

IPC Classes  ?

50.

Distributed interaction feature generation system

      
Application Number 17504738
Grant Number 11281689
Status In Force
Filing Date 2021-10-19
First Publication Date 2022-03-22
Grant Date 2022-03-22
Owner SAS Institute Inc. (USA)
Inventor
  • Gebremariam, Biruk
  • He, Taiping

Abstract

A computing system creates interaction features from variable values in a transformed dataset that includes a variable value computed for each variable of transformed variables computed from a prior execution of a transformation flow applied to an input dataset. An interaction transformation flow definition indicates a subset of the transformed variables, a synthesis definition, and interaction transformation operations to apply to the transformed variables. The synthesis definition describes how the subset of the transformed variables are combined to compute a value input to the interaction transformation operations. A plurality of variable combinations of the subset is defined. A computation is defined for each combination and interaction transformation operation. An operation data value is computed for each computation from the transformed dataset. An observation vector is read from the transformed dataset and a current interaction variable value is synthesized for each combination. A result value is computed for each combination.

IPC Classes  ?

  • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
  • G06F 16/25 - Integrating or interfacing systems involving database management systems
  • G06F 17/18 - Complex mathematical operations for evaluating statistical data
  • G06N 20/00 - Machine learning

51.

Dual use of audio noise level in speech-to-text framework

      
Application Number 17498966
Grant Number 11335350
Status In Force
Filing Date 2021-10-12
First Publication Date 2022-01-27
Grant Date 2022-05-17
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Li, Xiaolong
  • Cheng, Xiaozhuo
  • Yang, Xu

Abstract

An apparatus includes processor(s) to: perform pre-processing operations including derive an audio noise level of speech audio of a speech data set, derive a first relative weighting for first and second segmentation techniques for identifying likely sentence pauses in the speech audio based on the audio noise level, and select likely sentence pauses for a converged set of likely sentence pauses from likely sentence pauses identified by the first and/or second segmentation techniques based on the first relative weighting; and perform speech-to-text processing operations including divide the speech data set into data segments representing speech segments of the speech audio based on the converged set of likely sentence pauses, and derive a second relative weighting based on the audio noise level for selecting words indicated by an acoustic model or by a language model as being most likely spoken in the speech audio for inclusion in a transcript.

IPC Classes  ?

  • G10L 15/26 - Speech to text systems
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/04 - Segmentation; Word boundary detection
  • G10L 25/78 - Detection of presence or absence of voice signals
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks
  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit

52.

Dual use of acoustic model in speech-to-text framework

      
Application Number 17498811
Grant Number 11373655
Status In Force
Filing Date 2021-10-12
First Publication Date 2022-01-27
Grant Date 2022-06-28
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Li, Xiaolong
  • Cheng, Xiaozhuo
  • Yang, Xu

Abstract

An apparatus includes processor(s) to: perform preprocessing operations of a segmentation technique including divide a speech data set into data chunks representing chunks of speech audio, use an acoustic model with each data chunk to identify pauses in the speech audio, and analyze a length of time of each identified pause to identify a candidate set of likely sentence pauses in the speech audio; and perform speech-to-text operations including divide the speech data set into data segments that each represent a segment of the speech audio based on the candidate set of likely sentence pauses, use the acoustic model with each data segment to identify likely speech sounds in the speech audio, analyze the identified likely speech sounds to identify candidate sets of words likely spoken in the speech audio, and generate a transcript of the speech data set based at least on the candidate sets of words likely spoken.

IPC Classes  ?

  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 25/78 - Detection of presence or absence of voice signals
  • G10L 15/26 - Speech to text systems
  • G10L 15/04 - Segmentation; Word boundary detection
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks
  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit

53.

Distributed classification computing system

      
Application Number 17368941
Grant Number 11227223
Status In Force
Filing Date 2021-07-07
First Publication Date 2022-01-18
Grant Date 2022-01-18
Owner SAS Institute Inc. (USA)
Inventor Wang, Yingjian

Abstract

A computing system trains a classification model using distributed training data. In response to receipt of a first request, a training data subset is accessed and sent to each higher index worker computing device, the training data subset sent by each lower index worker computing device is received, and a first kernel matrix block and a second kernel matrix block are computed using a kernel function and the accessed or received training data subsets. (A) In response to receipt of a second request from the controller device, a first vector is computed using the first and second kernel matrix blocks, a latent function vector and an objective function value are computed, and the objective function value is sent to the controller device. (A) is repeated until the controller device determines training of a classification model is complete. Model parameters for the trained classification model are output.

IPC Classes  ?

54.

Graphical user interface and debugger system for selecting and testing alterations to source code for software applications

      
Application Number 17458881
Grant Number 11200151
Status In Force
Filing Date 2021-08-27
First Publication Date 2021-12-14
Grant Date 2021-12-14
Owner SAS INSTITUTE INC. (USA)
Inventor Cates, Claire Smith

Abstract

Testing for software applications can be implemented according to some aspects described herein. For example, a system can receive override data, including a location of a logical statement in source code and an override command, that is associated with a software application. The system can generate debugging data based on the override data, the debugging data including a breakpoint associated with the location and a debugger command corresponding to the override command. The system can then provide the debugging data as input to debugging software, the debugging software being configured to monitor execution of the software application during a software test. The debugging software can determine that the breakpoint has been reached and responsively execute the debugger command for testing a target portion of source code for the software application.

IPC Classes  ?

  • G06F 11/36 - Preventing errors by testing or debugging of software

55.

Semi-supervised classification system

      
Application Number 17342825
Grant Number 11200514
Status In Force
Filing Date 2021-06-09
First Publication Date 2021-12-14
Grant Date 2021-12-14
Owner SAS Institute Inc. (USA)
Inventor
  • Chen, Xu
  • Wu, Xinmin

Abstract

Unclassified observations are classified. Similarity values are computed for each unclassified observation and for each target variable value. A confidence value is computed for each unclassified observation using the similarity values. A high-confidence threshold value and a low-confidence threshold value are computed from the confidence values. For each observation, when the confidence value is greater than the high-confidence threshold value, the observation is added to a training dataset and, when the confidence value is greater than the low-confidence threshold value and less than the high-confidence threshold value, the observation is added to the training dataset based on a comparison between a random value drawn from a uniform distribution and an inclusion percentage value. A classification model is trained with the training dataset and classified observations. The trained classification model is executed with the unclassified observations to determine a label assignment.
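
A minimal sketch of the confidence-thresholding step follows, assuming a Gaussian-kernel similarity to labelled class means; the quantile-based thresholds and the 30% inclusion percentage are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Small classified set (two labels) and a pool of unclassified observations.
X_lab = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [3.2, 2.9]])
y_lab = np.array([0, 0, 1, 1])
X_unl = rng.uniform(-1, 4, size=(50, 2))

# Similarity of each unclassified observation to each target variable value
# (here: a Gaussian kernel against the labelled class means).
means = np.array([X_lab[y_lab == c].mean(axis=0) for c in (0, 1)])
d2 = ((X_unl[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
sim = np.exp(-d2)

# Confidence per observation and data-driven thresholds.
conf = sim.max(axis=1) / sim.sum(axis=1)
hi_thr, lo_thr = np.quantile(conf, 0.8), np.quantile(conf, 0.5)
inclusion_pct = 0.3

labels = sim.argmax(axis=1)
keep = (conf > hi_thr) | ((conf > lo_thr) & (rng.uniform(size=conf.size) < inclusion_pct))

# Augmented training dataset: original classified data plus selected pseudo-labels.
X_train = np.vstack([X_lab, X_unl[keep]])
y_train = np.concatenate([y_lab, labels[keep]])
print("training set grew from", len(X_lab), "to", len(X_train), "observations")
```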

IPC Classes  ?

56.

Neural network training system

      
Application Number 17198737
Grant Number 11195084
Status In Force
Filing Date 2021-03-11
First Publication Date 2021-12-07
Grant Date 2021-12-07
Owner SAS Institute Inc. (USA)
Inventor
  • Wu, Xinmin
  • Wang, Yingjian
  • Hu, Xiangqian

Abstract

A computing device trains a neural network machine learning model. A forward propagation of a first neural network is executed. A backward propagation of the first neural network is executed from a last layer to a last convolution layer of a plurality of convolutional layers to compute a gradient vector for first weight values of the last convolution layer using observation vectors. A discriminative localization map is computed for each observation vector with the gradient vector using a discriminative localization map function. A forward and a backward propagation of a second neural network is executed to compute a second weight value for each neuron of the second neural network using the discriminative localization map computed for each observation vector. A predefined number of iterations of the forward and the backward propagation of the second neural network is repeated.

IPC Classes  ?

57.

Machine learning classification system

      
Application Number 17386706
Grant Number 11379685
Status In Force
Filing Date 2021-07-28
First Publication Date 2021-11-18
Grant Date 2022-07-05
Owner SAS Institute Inc. (USA)
Inventor Chen, Xu

Abstract

A computing device classifies unclassified observations. A first batch of unclassified observation vectors and a first batch of classified observation vectors are selected. A prior regularization error value and a decoder reconstruction error value are computed. A first batch of noise observation vectors is generated. An evidence lower bound (ELBO) value is computed. A gradient of an encoder neural network model is computed, and the ELBO value is updated. A decoder neural network model and an encoder neural network model are updated. The decoder neural network model is trained. The target variable value is determined for each observation vector of the unclassified observation vectors based on an output of the trained decoder neural network model. The target variable value is output.

IPC Classes  ?

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06N 3/08 - Learning methods

58.

Hyperparameter tuning system results viewer

      
Application Number 17099846
Grant Number 11151480
Status In Force
Filing Date 2020-11-17
First Publication Date 2021-10-19
Grant Date 2021-10-19
Owner SAS Institute Inc. (USA)
Inventor
  • Golovidov, Oleg Borisovich
  • Wujek, Brett Alan
  • Koch, Patrick Nathan
  • Singh, Rajendra Prasad

Abstract

A visualization is presented while tuning a machine learning model. A model tuning process writes tuning data to a history table. The model tuning process is repeatedly training and scoring a model type with different sets of values of hyperparameters defined based on the model type. An objective function value is computed for each set of values of the hyperparameters. Data stored in the history table is accessed and used to identify the hyperparameters. (A) A page template is selected from page templates that describe graphical objects presented in the display. (B) The page template is updated with the accessed data. (C) The display is updated using the updated page template. (D) At the end of a refresh time period, new data stored in the history table by the model tuning process is accessed. (E) (B) through (D) are repeated with the accessed data replaced with the accessed new data.

IPC Classes  ?

  • G06N 20/10 - Machine learning using kernel methods, e.g. support vector machines [SVM]
  • G06F 3/0483 - Interaction with page-structured environments, e.g. book metaphor
  • G06F 16/958 - Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
  • G06F 16/904 - Browsing; Visualisation therefor

59.

Predicting and managing requests for computing resources or other resources

      
Application Number 17088403
Grant Number 11556791
Status In Force
Filing Date 2020-11-03
First Publication Date 2021-10-07
Grant Date 2023-01-17
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Prabhudesai, Kedar Shriram
  • Valsaraj, Varunraj
  • Yi, Jinxin
  • Woo, Daniel Keongson
  • Baldridge, Jr., Roger Lee

Abstract

Requests for computing resources and other resources can be predicted and managed. For example, a system can determine a baseline prediction indicating a number of requests for an object over a future time-period. The system can then execute a first model to generate a first set of values based on seasonality in the baseline prediction, a second model to generate a second set of values based on short-term trends in the baseline prediction, and a third model to generate a third set of values based on the baseline prediction. The system can select a most accurate model from among the three models and generate an output prediction by applying the set of values output by the most accurate model to the baseline prediction. Based on the output prediction, the system can cause an adjustment to be made to a provisioning process for the object.
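
The model-selection step can be sketched as below, assuming a mean baseline and three toy adjustment models (weekly seasonal factors, a short-term trend factor, and an identity pass-through) compared by mean absolute error on a holdout window; none of this reflects the actual models used.

```python
import numpy as np

rng = np.random.default_rng(4)

# Historical request counts with weekly seasonality and a mild upward trend.
t = np.arange(112)
history = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 3, t.size)
train, holdout = history[:-14], history[-14:]

baseline = np.full(14, train.mean())                      # baseline prediction

# Candidate adjustment models producing multiplicative factors for the baseline.
day = np.arange(train.size) % 7
seasonal_factors = np.array([train[day == d].mean() for d in range(7)]) / train.mean()
seasonal = np.tile(seasonal_factors, 2)                   # model 1: seasonality
trend = np.full(14, train[-7:].mean() / train.mean())     # model 2: short-term trend
identity = np.ones(14)                                    # model 3: baseline as-is

candidates = {"seasonal": seasonal, "trend": trend, "identity": identity}
errors = {name: np.mean(np.abs(baseline * f - holdout)) for name, f in candidates.items()}
best = min(errors, key=errors.get)

output_prediction = baseline * candidates[best]           # apply best model to baseline
print("most accurate model:", best, "MAE:", round(errors[best], 2))
```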

IPC Classes  ?

60.

Range overlap query response system for graph data

      
Application Number 17225228
Grant Number 11132364
Status In Force
Filing Date 2021-04-08
First Publication Date 2021-09-28
Grant Date 2021-09-28
Owner SAS Institute Inc. (USA)
Inventor
  • Galati, Matthew Victor
  • Reese, Brandon Michael

Abstract

A computing system determines a response to a query. A bin start value and a bin stop value is defined for each bin based on an input bin option. End nodes are split based on the bin start value and the bin stop value of each bin to define a second plurality of end nodes. Each start node of a plurality of start nodes that is connected to each end node of the second plurality of end nodes is identified based on the respective link attributes of a plurality of link attributes. Overlapping start nodes of the plurality of start nodes that overlap at an end node of the second plurality of end nodes are identified based on a predefined overlap query graph that defines a connectivity to identify between a start node and the end node. The identified overlapping start nodes are output as a response to the predefined overlap query graph.

IPC Classes  ?

61.

Graphical user interface for searching on a network pattern

      
Application Number 17122349
Grant Number 11231830
Status In Force
Filing Date 2020-12-15
First Publication Date 2021-09-23
Grant Date 2022-01-25
Owner SAS Institute Inc. (USA)
Inventor
  • Morris, James Byron
  • Ablitt, Nicholas Akbar
  • Chari, Manoj Keshavmurthi

Abstract

A computing system displays an initial graph with icons. Each icon graphically represents data associated with a respective entity. The first icon is connected in the initial graph to other icon(s). The system receives an indication of a graphical network pattern. The graphical network pattern is defined by a user selection of a second icon in the initial graph and: a user selection of a third icon in the initial graph; or a user selection of a graphical representation in the initial graph of a relationship between the second icon and the third icon. The system sends computer instructions indicating a network pattern query for searching an electronic database for electronic record(s) corresponding to a queried network pattern. The system receives a dataset indicating located electronic record(s) corresponding to the queried network pattern. The system generates output data indicating an output graph for a graphical representation of the located record(s).

IPC Classes  ?

  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 16/245 - Query processing
  • G06F 16/901 - Indexing; Data structures therefor; Storage structures
  • G06F 16/248 - Presentation of query results
  • G06F 16/2458 - Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
  • G06F 16/28 - Databases characterised by their database models, e.g. relational or object models
  • G06F 9/451 - Execution arrangements for user interfaces
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus

62.

Dynamic model selection in speech-to-text processing

      
Application Number 17205871
Grant Number 11145309
Status In Force
Filing Date 2021-03-18
First Publication Date 2021-09-23
Grant Date 2021-10-12
Owner SAS INSTITUTE INC. (USA)
Inventor Yang, Xu

Abstract

An apparatus includes processor(s) to: use an acoustic model to generate a first set of probabilities of speech sounds uttered within speech audio; derive at least a first candidate word most likely spoken in the speech audio using the first set; analyze the first set to derive a degree of uncertainty therefor; compare the degree of uncertainty to a threshold; in response to at least the degree of uncertainty being less than the threshold, select the first candidate word as a next word most likely spoken in the speech audio; in response to at least the degree of uncertainty being greater than the threshold, select, as the next word most likely spoken in the speech audio, a second candidate word indicated as being most likely spoken based on a second set of probabilities generated by a language model; and add the next word most likely spoken to a transcript.
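
The uncertainty-gated fallback from acoustic model to language model can be sketched with entropy as the degree-of-uncertainty measure; the three-word vocabulary, the probabilities, and the threshold value are purely hypothetical.

```python
import numpy as np

def entropy(p):
    # Degree of uncertainty in a probability distribution over candidate words.
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    return -np.sum(p * np.log(p + 1e-12))

def next_word(acoustic_probs, language_probs, vocab, threshold=1.0):
    # First candidate: the word the acoustic model scores as most likely.
    acoustic_pick = vocab[int(np.argmax(acoustic_probs))]
    if entropy(acoustic_probs) < threshold:
        return acoustic_pick                      # acoustic model is confident enough
    # Otherwise fall back to the word the language model scores as most likely.
    return vocab[int(np.argmax(language_probs))]

vocab = ["their", "there", "they're"]
transcript = []
# Hypothetical per-word probabilities from the two models.
transcript.append(next_word([0.90, 0.05, 0.05], [0.4, 0.4, 0.2], vocab))  # confident acoustic
transcript.append(next_word([0.36, 0.33, 0.31], [0.1, 0.8, 0.1], vocab))  # uncertain -> language model
print(transcript)
```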

IPC Classes  ?

  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 15/26 - Speech to text systems
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks
  • G10L 15/04 - Segmentation; Word boundary detection
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G10L 25/78 - Detection of presence or absence of voice signals
  • G06N 3/08 - Learning methods

63.

User interfaces for converting geospatial data into audio outputs

      
Application Number 17001195
Grant Number 11257396
Status In Force
Filing Date 2020-08-24
First Publication Date 2021-09-23
Grant Date 2022-02-22
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Summers, II, Claude Edward
  • Mealin, Sean Patrick
  • Langston, Julianna Elizabeth
  • Kraus, Gregory David
  • Williamson, Jonathan Tyler
  • Robinson, Lisa Beth Morton
  • Sookne, Jesse Daniel
  • Smith, Brice Joseph

Abstract

Geospatial data can be converted into audio outputs. For example, a system can receive a dataset indicating geospatial locations of objects within a region. Based on the dataset, the system can generate a virtual map representing the region and including virtual points representing the objects. The virtual points can be spatially positioned at locations in the virtual map corresponding to the geospatial locations of the objects in the region. The system can receive a user input via a user input device for interacting with a particular virtual point among the virtual points in the virtual map. The system can determine one or more sound characteristics for a sound based on receiving the user input. The system can then transmit an audio signal to an audio device for causing the audio device to generate the sound having the one or more sound characteristics, which may assist with exploring the virtual map.

IPC Classes  ?

  • G09B 21/00 - Teaching, or communicating with, the blind, deaf or mute
  • G06F 3/16 - Sound input; Sound output
  • A61H 3/06 - Walking aids for blind persons

64.

Speech audio pre-processing segmentation

      
Application Number 17138445
Grant Number 11138979
Status In Force
Filing Date 2020-12-30
First Publication Date 2021-09-23
Grant Date 2021-10-05
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Cheng, Xiaozhuo
  • Yang, Xu
  • Li, Xiaolong

Abstract

An apparatus includes processor(s) to: divide a speech data set into multiple data chunks that each represent a chunk of speech audio; derive a threshold amplitude based on at least one peak amplitude of the speech audio; designate each data chunk with a peak amplitude below the threshold amplitude a pause data chunk; within a set of temporally consecutive data chunks of the multiple data chunks, identify a longest subset of temporally consecutive pause data chunks; within the set of temporally consecutive data chunks, designate the longest subset of temporally consecutive pause data chunks as a likely sentence pause of a candidate set of likely sentence pauses; based on at least the candidate set, divide the speech data set into multiple data segments that each represent a speech segment of the speech audio; and perform speech-to-text conversion, to identify a sentence spoken in each speech segment.
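
A minimal sketch of the amplitude-threshold pause detector follows; the synthetic audio, chunk size, and the 10%-of-peak threshold are assumptions made only to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "speech audio": two loud bursts separated by a near-silent pause.
sr = 1000
audio = np.concatenate([
    0.8 * np.sin(2 * np.pi * 200 * np.arange(sr) / sr),   # first sentence
    0.01 * rng.normal(size=sr // 2),                       # pause
    0.7 * np.sin(2 * np.pi * 300 * np.arange(sr) / sr),   # second sentence
])

chunk = 50  # samples per data chunk
chunks = audio[: len(audio) // chunk * chunk].reshape(-1, chunk)
peaks = np.abs(chunks).max(axis=1)

# Threshold amplitude derived from the overall peak amplitude of the audio.
threshold = 0.1 * peaks.max()
is_pause = peaks < threshold

# Longest run of consecutive pause chunks marks the likely sentence pause.
best_start = best_len = run_len = 0
for i, p in enumerate(is_pause):
    run_len = run_len + 1 if p else 0
    if run_len > best_len:
        best_start, best_len = i - run_len + 1, run_len

split = (best_start + best_len // 2) * chunk  # split in the middle of the pause
segments = [audio[:split], audio[split:]]
print("split sample:", split, "segment lengths:", [len(s) for s in segments])
```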

IPC Classes  ?

  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit
  • G10L 15/26 - Speech to text systems
  • G10L 15/04 - Segmentation; Word boundary detection
  • G10L 25/78 - Detection of presence or absence of voice signals
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks

65.

Distributable event prediction and machine learning recognition system

      
Application Number 17178798
Grant Number 11151463
Status In Force
Filing Date 2021-02-18
First Publication Date 2021-09-16
Grant Date 2021-10-19
Owner SAS Institute Inc. (USA)
Inventor
  • Chen, Xu
  • Da Silva, Jorge Manuel Gomes
  • Wujek, Brett Alan

Abstract

Data is classified using semi-supervised data. Sparse coefficients are computed using a decomposition of a Laplacian matrix. (B) Updated parameter values are computed for a dimensionality reduction method using the sparse coefficients, the Laplacian matrix, and a plurality of observation vectors. The updated parameter values include a robust estimator of a decomposition matrix determined from the decomposition of the Laplacian matrix. (B) is repeated until a convergence parameter value indicates the updated parameter values for the dimensionality reduction method have converged. A classification matrix is defined using the sparse coefficients and the robust estimator of the decomposition of the Laplacian matrix. The target variable value is determined for each observation vector based on the classification matrix. The target variable value is output for each observation vector of the plurality of unclassified observation vectors and is defined to represent a label for a respective unclassified observation vector.

IPC Classes  ?

66.

High dimensional to low dimensional data transformation and visualization system

      
Application Number 17182424
Grant Number 11120072
Status In Force
Filing Date 2021-02-23
First Publication Date 2021-09-14
Grant Date 2021-09-14
Owner SAS Institute Inc. (USA)
Inventor
  • Shen, Kai
  • Wang, Haoyu
  • Chaudhuri, Arin

Abstract

A computer transforms high-dimensional data into low-dimensional data. (A) A distance matrix is computed from observation vectors. (B) A kernel matrix is computed from the distance matrix using a bandwidth value. (C) The kernel matrix is decomposed using an eigen decomposition to define eigenvalues. (D) A predefined number of largest eigenvalues are selected from the eigenvalues. (E) The selected largest eigenvalues are summed. (F) A next bandwidth value is computed based on the summed eigenvalues. (A) through (F) are repeated with the next bandwidth value until a stop criterion is satisfied. Each observation vector of the observation vectors is transformed into a second space using a kernel principal component analysis with the next bandwidth value and the kernel matrix. The second space has a dimension defined by the predefined number of first eigenvalues. Each transformed observation vector is output.
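
The bandwidth-iteration loop can be sketched as follows. The distance matrix, Gaussian kernel, eigendecomposition, and top-eigenvalue sum follow the abstract; the specific bandwidth update rule, the captured-variance target, and the omission of kernel centering are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 5))               # observation vectors

def kernel_matrix(X, bandwidth):
    # (A)-(B): distance matrix and Gaussian kernel matrix (centering omitted for brevity).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

k = 2                                        # predefined number of largest eigenvalues
bandwidth = 1.0
for _ in range(20):
    K = kernel_matrix(X, bandwidth)
    vals = np.linalg.eigvalsh(K)             # (C): eigen decomposition (ascending order)
    top = vals[-k:]                          # (D): largest eigenvalues
    total = top.sum()                        # (E): summed eigenvalues
    # (F): illustrative update that nudges the bandwidth toward a target
    # captured-variance ratio; the actual update rule is not specified here.
    ratio = total / vals.sum()
    new_bandwidth = bandwidth * (1.0 + 0.5 * (0.6 - ratio))
    if abs(new_bandwidth - bandwidth) < 1e-4:   # stop criterion
        break
    bandwidth = new_bandwidth

# Transform observations into the low-dimensional space (kernel PCA projection).
K = kernel_matrix(X, bandwidth)
vals, vecs = np.linalg.eigh(K)
proj = vecs[:, -k:] * np.sqrt(np.maximum(vals[-k:], 0.0))
print("bandwidth:", round(bandwidth, 3), "low-dimensional shape:", proj.shape)
```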

IPC Classes  ?

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06F 16/56 - Information retrieval; Database structures therefor; File system structures therefor of still image data having vectorial format
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06F 17/16 - Matrix or vector computation
  • G06F 16/55 - Clustering; Classification
  • G06N 20/10 - Machine learning using kernel methods, e.g. support vector machines [SVM]

67.

Reducing consumption of computing resources in performing computerized sequence-mining on large data sets

      
Application Number 17209752
Grant Number 11120032
Status In Force
Filing Date 2021-03-23
First Publication Date 2021-09-14
Grant Date 2021-09-14
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Chen, Xilong
  • Wu, Xunlei
  • Chvosta, Jan

Abstract

Computing resources consumed in performing computerized sequence-mining can be reduced by implementing some examples of the present disclosure. In one example, a system can determine weights for data entries in a data set and then select a group of data entries from the data set based on the weights. Next, the system can determine a group of k-length sequences present in the selected group of data entries by applying a shuffling algorithm. The system can then determine frequencies corresponding to the group of k-length sequences and select candidate sequences from among the group of k-length sequences based on the frequencies thereof. Next, the system can determine support values corresponding to the candidate sequences and then select output sequences from among the candidate sequences based on the support values thereof. The system may then transmit an output signal indicating the selected output sequences to an electronic device.
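
A small sketch of the weighted candidate-then-support pipeline is shown below, using simple weighted sampling and contiguous k-grams in place of the shuffling algorithm; the event log, weights, and cut-off values are all hypothetical.

```python
from collections import Counter
import random

random.seed(7)

# Hypothetical data entries: each entry is an ordered sequence of events.
entries = [
    ["login", "browse", "cart", "checkout"],
    ["login", "browse", "browse", "logout"],
    ["login", "cart", "checkout"],
    ["browse", "cart", "checkout"],
]

# Weight each entry (here simply by its length) and sample entries by weight.
weights = [len(e) for e in entries]
sampled = random.choices(entries, weights=weights, k=3)

k = 2
# k-length subsequences present in the sampled entries (contiguous, for simplicity).
def k_grams(seq, length):
    return [tuple(seq[i:i + length]) for i in range(len(seq) - length + 1)]

freq = Counter(g for e in sampled for g in k_grams(e, k))

# Candidate sequences: those whose frequency clears a minimum count.
candidates = [g for g, c in freq.items() if c >= 2]

# Support: fraction of all entries (not just the sampled ones) containing the candidate.
def support(pattern):
    return sum(pattern in k_grams(e, k) for e in entries) / len(entries)

output = [(g, support(g)) for g in candidates if support(g) >= 0.5]
print(output)
```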

IPC Classes  ?

  • G06F 16/2458 - Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
  • G06F 16/2453 - Query optimisation

68.

Location network analysis tool for predicting contamination change

      
Application Number 17174549
Grant Number 11109194
Status In Force
Filing Date 2021-02-12
First Publication Date 2021-08-31
Grant Date 2021-08-31
Owner SAS Institute Inc. (USA)
Inventor
  • Pinheiro, Carlos Andre Reis
  • Galati, Matthew Victor
  • Summerville, Natalia

Abstract

A computing system receives geolocation information indicating aggregated locations of mobile devices configured to move in a geographic area. The geolocation information comprises measured location(s) for a given mobile device of the mobile devices. The system generates a time series representing mobility network graphs over a first time period. The time series is generated by, for each subperiod in the time series, generating data representing estimated movement of member(s) of a population between locations within the geographic area. The estimated movement is estimated based on the geolocation information and a total population for the geographic area. The system generates metric(s) derived from the time series. The system determines contamination information indicating a respective contamination status for locations for each subperiod of the time series. The system generates a computer model to predict changes in the contamination information in a second time period subsequent to the first time period.

IPC Classes  ?

  • H04W 4/029 - Location-based management or tracking services
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

69.

Computerized pipelines for transforming input data into data structures compatible with models

      
Application Number 17173308
Grant Number 11106694
Status In Force
Filing Date 2021-02-11
First Publication Date 2021-08-26
Grant Date 2021-08-31
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Cox, James Allen
  • Rausch, Nancy Anne

Abstract

Computerized pipelines can transform input data into data structures compatible with models in some examples. In one such example, a system can obtain a first table that includes first data referencing a set of subjects. The system can then execute a sequence of processing operations on the first data in a particular order defined by a data-processing pipeline to modify an analysis table to include features associated with the set of subjects. Executing each respective processing operation in the sequence to generate the modified analysis table may involve: deriving a respective set of features from the first data by executing a respective feature-extraction operation on the first data; and adding the respective set of features to the analysis table. The system may then execute a predictive model on the modified analysis table for generating a predicted value based on the modified analysis table.

IPC Classes  ?

  • G06F 16/25 - Integrating or interfacing systems involving database management systems
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 3/08 - Learning methods

70.

Exchange of data objects between task routines via shared memory space

      
Application Number 17308355
Grant Number 11204809
Status In Force
Filing Date 2021-05-05
First Publication Date 2021-08-19
Grant Date 2021-12-21
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Gong, Qing
  • Dutta, Partha
  • Arfaoui, Kais

Abstract

An apparatus includes a processor to: based on data dependencies specified in a job flow definition, identify first and second tasks of the corresponding job flow to be performed sequentially, wherein the first task outputs a data object used as an input to the second; store, within a task queue, at least one message conveying at least an identifier of the first task, and an indication that the data object is to be exchanged through a shared memory space; within a task container, in response to storage of the at least one message within the task queue, sequentially execute first and second task routines to sequentially perform the first and second tasks, respectively, and instantiate the shared memory space to be accessible to the first and second task routines during their executions; and upon completion of the job flow, transmit an indication of completion to another device via a network.

IPC Classes  ?

  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
  • G06F 9/46 - Multiprogramming arrangements
  • G06F 9/54 - Interprogram communication
  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
  • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines

71.

Distributable feature analysis and tree model training system

      
Application Number 17093826
Grant Number 11093864
Status In Force
Filing Date 2020-11-10
First Publication Date 2021-08-17
Grant Date 2021-08-17
Owner SAS Institute Inc. (USA)
Inventor Reese, Brandon Michael

Abstract

A computing system computes a variable relevance using a trained tree model. (A) A next child node is selected. (B) A number of observations associated with the next child node is computed. (C) A population ratio value is computed. (D) A next leaf node is selected. (E) First observations are identified. (F) A first impurity value is computed for the first observations. (G) Second observations are identified when the first observations are associated with the descending child nodes. (H) A second impurity value is computed for the second observations. (I) A gain contribution is computed. (J) A node gain value is updated. (K) (D) through (J) are repeated. (L) A variable gain value is updated for a variable associated with the split test. (M) (A) through (L) are repeated. (N) A set of relevant variables is selected based on the variable gain value.

IPC Classes  ?

72.

Multi-objective distributed hyperparameter tuning system

      
Application Number 17081118
Grant Number 11093833
Status In Force
Filing Date 2020-10-27
First Publication Date 2021-08-17
Grant Date 2021-08-17
Owner SAS Institute Inc. (USA)
Inventor
  • Gardner, Steven Joseph
  • Griffin, Joshua David
  • Xu, Yan
  • Koch, Patrick Nathan
  • Wujek, Brett Alan
  • Golovidov, Oleg Borisovich

Abstract

Tuned hyperparameter values are determined for training a machine learning model. When a selected hyperparameter configuration does not satisfy a linear constraint, if a projection of the selected hyperparameter configuration is included in a first cache that stores previously computed projections is determined. When the projection is included in the first cache, the projection is extracted from the first cache using the selected hyperparameter configuration, and the selected hyperparameter configuration is replaced with the extracted projection in the plurality of hyperparameter configurations. When the projection is not included in the first cache, a projection computation for the selected hyperparameter configuration is assigned to a session. A computed projection is received from the session for the selected hyperparameter configuration. The computed projection and the selected hyperparameter configuration are stored to the first cache, and the selected hyperparameter configuration is replaced with the computed projection.
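
The caching idea can be sketched with a single linear constraint, for which the projection onto the feasible half-space has a closed form; the constraint coefficients and the dictionary cache keyed on the rounded configuration are illustrative assumptions.

```python
import numpy as np

# Linear constraint on the hyperparameter configuration: a @ x <= b.
a = np.array([1.0, 2.0])
b = 10.0

projection_cache = {}   # first cache keyed by the hyperparameter configuration

def project(config):
    """Project a hyperparameter configuration onto the feasible half-space."""
    key = tuple(np.round(config, 10))
    if key in projection_cache:                 # reuse a previously computed projection
        return projection_cache[key]
    x = np.asarray(config, dtype=float)
    violation = a @ x - b
    if violation > 0:                           # constraint not satisfied: project
        x = x - (violation / (a @ a)) * a       # closest point with a @ x == b
    projection_cache[key] = x                   # store for later configurations
    return x

print(project([8.0, 3.0]))   # violates the constraint, gets projected
print(project([8.0, 3.0]))   # second call is served from the cache
print(project([1.0, 1.0]))   # already feasible, returned unchanged
```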

IPC Classes  ?

  • G06N 3/12 - Computing arrangements based on biological models using genetic models
  • G06N 5/00 - Computing arrangements using knowledge-based models
  • G06N 20/00 - Machine learning

73.

Machine learning classification system

      
Application Number 17224708
Grant Number 11087215
Status In Force
Filing Date 2021-04-07
First Publication Date 2021-08-10
Grant Date 2021-08-10
Owner SAS Institute Inc. (USA)
Inventor Chen, Xu

Abstract

A computing device classifies unclassified observations. A first batch of noise observations is generated. (A) A first batch of unclassified observations is selected. (B) A first batch of classified observations is selected. (C) A discriminator neural network model trained to classify unclassified observations and noise observations is updated with observations that include the first batch of unclassified observations, the first batch of classified observations, and the first batch of noise observations. (D) A discriminator loss value is computed that includes an adversarial loss term computed using a predefined transition matrix. (E) A second batch of unclassified observations is selected. (F) A second batch of noise observations is generated. (G) A generator neural network model trained to generate a fake observation vector for the second batch of noise observations is updated with the second batch of unclassified observations and the second batch of noise observations. (H) (A) to (G) is repeated.

IPC Classes  ?

74.

Universal attention-based reinforcement learning model for control systems

      
Application Number 17177694
Grant Number 11080602
Status In Force
Filing Date 2021-02-17
First Publication Date 2021-08-03
Grant Date 2021-08-03
Owner SAS Institute Inc. (USA)
Inventor
  • Oroojlooyjadid, Afshin
  • Nazari, Mohammadreza
  • Hajinezhad, Davood
  • Silva, Jorge Manuel Gomes Da

Abstract

A computing system trains a reinforcement learning model comprising multiple different attention model components. The reinforcement learning model trains on training data of a first environment (e.g., a first traffic intersection). The reinforcement learning model trains by training a state attention computer model on the training data that weighs each of respective inputs of a respective state. The reinforcement learning model trains by training an action attention computer model that determines a probability of switching from a first action to a second action of the first set of the multiple candidate actions (e.g., changing traffic colors of traffic lights). Alternatively, or additionally, a computing system generates an indication of a selected outcome according to the reinforcement learning model and sends a selection output to the second environment (e.g., a second traffic intersection with more lanes than the first traffic intersection) to implement the selected action in the second environment.

IPC Classes  ?

  • G06N 20/00 - Machine learning
  • G06N 3/08 - Learning methods
  • G08G 1/09 - Arrangements for giving variable traffic instructions
  • G06F 17/18 - Complex mathematical operations for evaluating statistical data

75.

Machine learning classification system

      
Application Number 17202413
Grant Number 11074412
Status In Force
Filing Date 2021-03-16
First Publication Date 2021-07-27
Grant Date 2021-07-27
Owner SAS Institute Inc. (USA)
Inventor
  • Leeman-Munk, Samuel Paul
  • Cox, James Allen
  • Styles, David Blake
  • Crowell, Richard Welland

Abstract

A system trains a classification model. Text windows are defined from tokens based on a window size. A network model including a transformer network is trained with the text windows to define classification information. A first accuracy value is computed. (A) The window size is reduced using a predefined reduction factor value. (B) Second text windows are defined based on the reduced window size. (C) The network model is retrained with the second text windows to define classification information. (D) A second accuracy value is computed. (E) An accuracy reduction value is computed from the second accuracy value relative to the first accuracy value. When the computed accuracy reduction value is greater than or equal to an accuracy reduction tolerance value, (A) through (E) are repeated until the accuracy reduction value is less than the accuracy reduction tolerance value.
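
A sketch of the shrink-until-degraded loop follows; the `evaluate` callback and the toy accuracy curve stand in for retraining and scoring the transformer-based network model, and the specific tolerance and reduction factor are arbitrary.

```python
def tune_window_size(initial_window, reduction_factor, tolerance, evaluate):
    """Shrink the text-window size until accuracy degrades past the tolerance.

    `evaluate(window_size)` is a caller-supplied function that retrains the
    classification model on windows of that size and returns its accuracy.
    """
    window = initial_window
    baseline = evaluate(window)                            # first accuracy value
    best = window
    while window > 1:
        window = max(1, int(window * reduction_factor))    # (A) reduce window size
        accuracy = evaluate(window)                        # (B)-(D) retrain and score
        reduction = baseline - accuracy                    # (E) accuracy reduction value
        if reduction >= tolerance:                         # degraded too much: stop
            break
        best = window                                      # still within tolerance
    return best

# Toy accuracy curve standing in for retraining a transformer classifier:
# accuracy slowly drops as the window shrinks.
toy_accuracy = lambda w: 0.90 - 0.02 * (128 / w - 1)
print(tune_window_size(128, reduction_factor=0.5, tolerance=0.05, evaluate=toy_accuracy))
```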

IPC Classes  ?

76.

Per task routine distributed resolver

      
Application Number 17225023
Grant Number 11169788
Status In Force
Filing Date 2021-04-07
First Publication Date 2021-07-22
Grant Date 2021-11-09
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Gong, Qing
  • Dutta, Partha
  • Arfaoui, Kais

Abstract

An apparatus includes a processor to: use an identifier of a requesting device or operator thereof to identify federated area(s) to which access is authorized; based on data dependencies among a set of tasks of a job flow, derive an order of performance specifying the first task to be performed; store, within a task queue, a task routine execution request message including an identifier associated with the first task, and federated area identifier(s) of the identified federated area(s); within a resolver container, in response to storage of the task routine execution request message, use the identifier associated with the first task and identifier(s) of the federated area(s) to identify one in which a first task routine is stored; within a task container, execute the first task routine to perform the first task; and upon completion of the job flow, transmit an indication of completion to the requesting device.

IPC Classes  ?

77.

Nonlinear optimization system

      
Application Number 17106488
Grant Number 11062219
Status In Force
Filing Date 2020-11-30
First Publication Date 2021-07-13
Grant Date 2021-07-13
Owner SAS Institute Inc. (USA)
Inventor
  • Griffin, Joshua David
  • Omheni, Riadh
  • Xu, Yan

Abstract

A computer solves a nonlinear optimization problem. An optimality check is performed for a current solution to an objective function that is a nonlinear equation with constraint functions on decision variables. When the performed optimality check indicates that the current solution is not an optimal solution, a barrier parameter value is updated, and a Lagrange multiplier value is updated for each constraint function based on a result of a complementarity slackness test. The current solution to the objective function is updated using a search direction vector determined by solving a primal-dual linear system that includes a dual variable for each constraint function and a step length value determined for each decision variable and for each dual variable. The operations are repeated until the optimality check indicates that the current solution is the optimal solution or a predefined number of iterations has been performed.
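
In a standard primal-dual interior-point formulation consistent with this description (shown as general background, not the patented method), the barrier subproblem and the perturbed conditions behind the complementarity slackness test and the search direction are:

```latex
% Barrier subproblem with barrier parameter \mu and slacks s for the constraints:
\min_{x,\,s}\; f(x) - \mu \sum_i \ln s_i
\quad\text{subject to}\quad c_i(x) - s_i = 0,\; s_i > 0.

% Perturbed KKT conditions; a Newton step on this system gives the primal-dual
% search direction, S\lambda = \mu e is the complementarity slackness test that
% drives the multiplier and barrier-parameter updates, and step lengths keep
% s and \lambda strictly positive:
\nabla f(x) - \nabla c(x)^{\mathsf T}\lambda = 0, \qquad
S\lambda = \mu e, \qquad
c(x) - s = 0, \qquad S = \operatorname{diag}(s).
```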

IPC Classes  ?

78.

Techniques for automated software testing

      
Application Number 16985311
Grant Number 11063849
Status In Force
Filing Date 2020-08-05
First Publication Date 2021-07-13
Grant Date 2021-07-13
Owner SAS Institute Inc. (USA)
Inventor
  • Clegg, Andrew Bynum
  • Struble, Christopher Chase
  • Hackett, Ronald Andrew

Abstract

Various embodiments are generally directed to techniques for automated software testing, such as by verifying operations are complete based on user interface and/or network traffic indications, for instance. Some embodiments are particularly directed to utilizing a network sniffer to detect specific network traffic to verify completion of network requests and/or responses associated with an operation included in a workflow for performance by a software under test (SUT). In many embodiments, the detection of specific network traffic may be used to accurately time operation durations and/or efficiently perform workflows to evaluate the SUT.
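
A hedged sketch of the general idea only (not SAS's test tooling): wait for the first network packet on a port of interest and use its arrival to verify the operation completed and to time it. The port, timeout, and use of scapy are illustrative assumptions, and packet sniffing typically requires elevated privileges.

```python
import time
from scapy.all import sniff

def time_until_response(port=8443, timeout=30):
    """Return (elapsed seconds, packet) for the first packet seen on the port, or None on timeout."""
    start = time.monotonic()
    pkts = sniff(filter=f"tcp port {port}", count=1, timeout=timeout)
    elapsed = time.monotonic() - start
    return (elapsed, pkts[0]) if pkts else None

# Example: trigger the UI operation under test, then verify and time its network response.
# result = time_until_response(port=8443)
```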

IPC Classes  ?

  • G06F 15/173 - Interprocessor communication using an interconnection network, e.g. matrix, shuffle, pyramid, star or snowflake
  • H04L 12/26 - Monitoring arrangements; Testing arrangements
  • G06F 11/34 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure
  • H04L 29/06 - Communication control; Communication processing characterised by a protocol

79.

Optimizing manufacturing processes using one or more machine learning models

      
Application Number 17064280
Grant Number 11055639
Status In Force
Filing Date 2020-10-06
First Publication Date 2021-07-06
Grant Date 2021-07-06
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Cay, Pelin
  • Karmakar, Nabaruna
  • Summerville, Natalia
  • Valsaraj, Varunraj
  • Cooper, Antony Nicholas
  • Gardner, Steven Joseph
  • Griffin, Joshua David

Abstract

Manufacturing processes can be optimized using machine learning models. For example, a system can execute an optimization model to identify a recommended set of values for configurable settings of a manufacturing process associated with an object. The optimization model can determine the recommended set of values by implementing an iterative process using an objective function. Each iteration of the iterative process can include selecting a current set of candidate values for the configurable settings from within a current region of a search space defined by the optimization model; providing the current set of candidate values as input to a trained machine learning model that can predict a value for a target characteristic of the object or the manufacturing process based on the current set of candidate values; and identifying a next region of the search space to use in a next iteration of the iterative process based on the value.
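
An illustrative sketch of the iterative region-based search (not SAS's optimizer): sample candidate settings in the current region, score them with the trained surrogate model, and re-centre and shrink the region around the best candidate. The surrogate below is a toy function standing in for the trained machine learning model.

```python
import numpy as np

def optimize_settings(surrogate, bounds, iters=20, per_iter=64, shrink=0.7, seed=0):
    rng = np.random.default_rng(seed)
    orig = np.array(bounds, dtype=float)                      # one (low, high) pair per setting
    lo, hi = orig[:, 0].copy(), orig[:, 1].copy()
    best_x, best_y = None, np.inf
    for _ in range(iters):
        cand = rng.uniform(lo, hi, size=(per_iter, len(lo)))  # candidate settings in the region
        preds = surrogate(cand)                               # predicted target characteristic
        i = int(np.argmin(preds))
        if preds[i] < best_y:
            best_x, best_y = cand[i], preds[i]
        half = (hi - lo) * shrink / 2                         # next, smaller search region
        lo = np.maximum(orig[:, 0], best_x - half)
        hi = np.minimum(orig[:, 1], best_x + half)
    return best_x, best_y

best_settings, predicted = optimize_settings(lambda c: ((c - 3.0) ** 2).sum(axis=1),
                                             bounds=[(0, 10), (0, 10)])
```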

IPC Classes  ?

  • G06N 20/00 - Machine learning
  • G06N 3/08 - Learning methods
  • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
  • G06F 9/54 - Interprogram communication
  • G06N 3/02 - Neural networks
  • G06N 20/20 - Ensemble learning
  • G06N 20/10 - Machine learning using kernel methods, e.g. support vector machines [SVM]
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]

80.

Techniques for extracting contextually structured data from document images

      
Application Number 17089962
Grant Number 11087077
Status In Force
Filing Date 2020-11-05
First Publication Date 2021-07-01
Grant Date 2021-08-10
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Wheaton, David James
  • Nadolski, William Robert
  • Goodykoontz, Heather Michelle

Abstract

Embodiments are generally directed to techniques for extracting contextually structured data from document images, such as by automatically identifying document layout, document data, and/or document metadata in a document image, for instance. Many embodiments are particularly directed to generating and utilizing a document template database for automatically extracting document image contents into a contextually structured format. For example, the document template database may include a plurality of templates for identifying/explaining key data elements in various document image formats that can be used to extract contextually structured data from incoming document images with a matching document image format. Several embodiments are particularly directed to automatically identifying and associating document metadata with corresponding document data in a document image, such as for generating a machine-facilitated annotation of the document image. In some embodiments, the machine-facilitated annotation of a document may be used to generate a template for the template database.

IPC Classes  ?

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06F 40/169 - Annotation, e.g. comment data or footnotes
  • G06F 16/93 - Document management systems
  • G06F 40/284 - Lexical analysis, e.g. tokenisation or collocates
  • G06F 40/186 - Templates
  • G06K 9/46 - Extraction of features or characteristics of the image
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

81.

Automated message-based job flow resource management in container-supported many task computing

      
Application Number 17139364
Grant Number 11144293
Status In Force
Filing Date 2020-12-31
First Publication Date 2021-07-01
Grant Date 2021-10-12
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl

Abstract

An apparatus includes at least one processor to retrieve a job flow definition defining a job flow as a set of tasks and dependencies thereamong, store a job performance request message to perform the job flow within a job queue, and in response to the storage of the job performance request message, execute instructions of a performance routine within a storage container to: based on the dependencies, derive an order of performance of the set of tasks that specifies a first task to perform; store, within a task queue, a first task routine execution request message requesting execution of a first task routine; and provide, to a resource allocation routine, an indication of a need for a first task container in which to execute the first task routine to perform the first task, wherein execution of the resource allocation routine causes dynamic allocation of containers based on availability of resources.

IPC Classes  ?

82.

Speech audio pre-processing segmentation

      
Application Number 17138521
Grant Number 11049502
Status In Force
Filing Date 2020-12-30
First Publication Date 2021-06-29
Grant Date 2021-06-29
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Cheng, Xiaozhuo
  • Yang, Xu
  • Li, Xiaolong

Abstract

An apparatus includes processor(s) to: divide a speech data set into multiple data chunks that each represent a chunk of speech audio; configure a neural network to implement an acoustic model that includes a CTC output; provide each data chunk to the neural network and monitor the CTC output for a string of blank symbols; designate each string of blank symbols from the CTC output that is at least as long as a predetermined blank threshold length as a likely sentence pause of a candidate set of likely sentence pauses; based on at least the candidate set, divide the speech data set into multiple data segments that each represent a speech segment of the speech audio; and perform speech-to-text conversion, to identify a sentence spoken in a selected language in each speech segment.
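
A small sketch of the pause-detection step, assuming the CTC output has been collapsed to a per-frame symbol sequence; the blank token, threshold, and split rule are illustrative.

```python
def blank_runs(ctc_symbols, blank="<b>", min_len=20):
    """Runs of blank symbols at least min_len long are candidate sentence pauses."""
    runs, start = [], None
    for i, s in enumerate(list(ctc_symbols) + [None]):   # sentinel flushes a trailing run
        if s == blank and start is None:
            start = i
        elif s != blank and start is not None:
            if i - start >= min_len:
                runs.append((start, i))
            start = None
    return runs

def split_segments(ctc_symbols, pauses):
    """Cut the sequence at the middle of each likely sentence pause."""
    cuts = [0] + [(a + b) // 2 for a, b in pauses] + [len(ctc_symbols)]
    return [(cuts[i], cuts[i + 1]) for i in range(len(cuts) - 1)]

frames = ["a"] * 5 + ["<b>"] * 30 + ["b"] * 4 + ["<b>"] * 3
pauses = blank_runs(frames, min_len=20)        # [(5, 35)]
segments = split_segments(frames, pauses)      # [(0, 20), (20, 42)]
```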

IPC Classes  ?

  • G10L 15/26 - Speech to text systems
  • G10L 15/16 - Speech classification or search using artificial neural networks
  • G10L 15/04 - Segmentation; Word boundary detection
  • G10L 25/78 - Detection of presence or absence of voice signals
  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G10L 25/30 - Speech or voice analysis techniques not restricted to a single one of groups characterised by the analysis technique using neural networks

83.

Data monitoring system

      
Application Number 17167633
Grant Number 11036981
Status In Force
Filing Date 2021-02-04
First Publication Date 2021-06-15
Grant Date 2021-06-15
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Liao, Yuwei
  • Mcguirk, Anya Mary
  • Biggs, Byron Davis
  • Chaudhuri, Arin
  • Langlois, Allen Joseph
  • Deters, Vincent L.

Abstract

A computing system determines if an event has occurred. A first window is defined that includes a subset of a plurality of observation vectors modeled as an output of an autoregressive causal system. A magnitude adjustment vector is computed from a mean computed for a matrix of magnitude values that includes a column for each window of a plurality of windows. The first window is stored in a next column of the matrix of magnitude values. Each cell of the matrix of magnitude values includes an estimated power spectrum value for a respective window and a respective frequency. A second matrix of magnitude values is updated using the magnitude adjustment vector. Each cell of the second matrix of magnitude values includes an adjusted power spectrum value for the respective window and the respective frequency. A peak is detected from the next column of the second matrix of magnitude values.
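
A rough sketch of the bookkeeping described (illustrative only, not the patented detector): keep one column of power-spectrum magnitudes per window, use the mean across windows as the magnitude adjustment, and look for peaks in the newest adjusted column.

```python
import numpy as np

def add_window(mag_matrix, window):
    spectrum = np.abs(np.fft.rfft(window)) ** 2              # estimated power spectrum for the window
    mag_matrix = (spectrum[:, None] if mag_matrix is None
                  else np.column_stack([mag_matrix, spectrum]))
    adjustment = mag_matrix.mean(axis=1, keepdims=True)      # magnitude adjustment vector
    adjusted = mag_matrix - adjustment                        # second (adjusted) matrix of magnitudes
    latest = adjusted[:, -1]
    peaks = np.where((latest[1:-1] > latest[:-2]) & (latest[1:-1] > latest[2:]))[0] + 1
    return mag_matrix, peaks

mags, peaks = None, []
for w in np.random.default_rng(0).normal(size=(5, 256)):     # five windows of observations
    mags, peaks = add_window(mags, w)
```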

IPC Classes  ?

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06N 7/00 - Computing arrangements based on specific mathematical models
  • G06F 17/16 - Matrix or vector computation

84.

Intelligent data curation

      
Application Number 17165226
Grant Number 11341414
Status In Force
Filing Date 2021-02-02
First Publication Date 2021-05-27
Grant Date 2022-05-24
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Rausch, Nancy Anne
  • Barney, Roger Jay
  • Trawinski, John P.

Abstract

An apparatus includes processor(s) to: receive a request for a data catalog; in response to the request specifying a structural feature, analyze metadata of multiple data sets for an indication that each includes it, and retrieve an indicated degree of certainty of detecting it for data sets that include it; in response to the request specifying a contextual aspect, analyze context data of the multiple data sets for an indication that each is subject to it, and retrieve an indicated degree of certainty concerning it for data sets that are subject to it; selectively include each data set in the data catalog based on whether the request specifies a structural feature and/or a contextual aspect and whether each data set meets what is specified; for each data set in the data catalog, generate a score indicative of the likelihood of meeting what is specified; and transmit the data catalog to the requesting device.

IPC Classes  ?

  • G06N 3/08 - Learning methods
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06F 16/25 - Integrating or interfacing systems involving database management systems
  • G06F 9/30 - Arrangements for executing machine instructions, e.g. instruction decode
  • H04L 67/1097 - Protocols in which an application is distributed across nodes in the network for distributed storage of data in networks, e.g. transport arrangements for network file system [NFS], storage area networks [SAN] or network attached storage [NAS]
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure

85.

DISTRIBUTED COLUMNAR DATA SET STORAGE AND RETRIEVAL

      
Document Number 03154474
Status In Force
Filing Date 2020-11-13
Open to Public Date 2021-05-27
Grant Date 2023-01-03
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bowman, Brian Payton
  • Keener, Gordon Lyle
  • Knight, Richard Todd

Abstract

An apparatus includes a processor to: instantiate collection threads, data buffers of a queue, and aggregation threads; within each collection thread, assemble a row group from a subset of the multiple rows, reorganize the data values from row-wise to columnar organization, and store the row group within a data buffer of the queue; operate the buffer queue as a FIFO buffer; within each aggregation thread, retrieve multiple row groups from multiple data buffers of the queue, assemble a data set part from the multiple row groups, and transmit, to storage device(s) via a network, the data set part; and in response to each instance of retrieval of a row group from a data buffer of the buffer queue for use within an aggregation thread, analyze a level of availability of at least storage space within the node device to determine whether to dynamically adjust the quantity of data buffers of the buffer queue.
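
A toy Python sketch of the producer/consumer shape only (the dynamic adjustment of the buffer count and the network transmission are omitted): collection threads push columnar row groups into a bounded FIFO queue, and an aggregation thread drains several row groups per data set part. All sizes and data are illustrative.

```python
import queue
import threading

buffer_q = queue.Queue(maxsize=8)                        # the FIFO buffer queue

def collection_thread(rows, group_size=100):
    for i in range(0, len(rows), group_size):
        row_group = list(zip(*rows[i:i + group_size]))   # row-wise values -> columnar organization
        buffer_q.put(row_group)                          # store the row group in a data buffer

def aggregation_thread(parts, groups_per_part=4):
    while True:
        batch = [buffer_q.get() for _ in range(groups_per_part)]
        if any(g is None for g in batch):                # sentinel: no more row groups
            break
        parts.append(batch)                              # assemble one data set part

rows = [(i, f"name{i}", i * 0.5) for i in range(1000)]
parts = []
producer = threading.Thread(target=collection_thread, args=(rows,))
consumer = threading.Thread(target=aggregation_thread, args=(parts,))
producer.start(); consumer.start()
producer.join()
for _ in range(4):
    buffer_q.put(None)                                   # flush the aggregation thread
consumer.join()
```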

IPC Classes  ?

  • G06F 12/02 - Addressing or allocation; Relocation
  • G06F 16/13 - File access structures, e.g. distributed indices
  • G06F 16/182 - Distributed file systems
  • G06F 16/22 - Indexing; Data structures therefor; Storage structures

86.

Automated concurrency and repetition with minimal syntax

      
Application Number 17105695
Grant Number 11113064
Status In Force
Filing Date 2020-11-27
First Publication Date 2021-05-27
Grant Date 2021-09-07
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Rouse, Jack Joseph
  • Pratt, Robert William
  • Erickson, Jared Carl
  • Chari, Manoj Keshavmurthi

Abstract

A processor core receives a request to execute application code including a trigger instruction and an instruction block that reads a row of data values from a data structure and outputs a data value from a function using the row as input. The data structure is divided into multiple portions and the trigger instruction indicates that multiple instances of the instruction block are to be executed concurrently. In response to the request and to identification of the instruction block and trigger instruction, the processor core generates multiple instances of a support block that causes independent repetitive execution of each instance of the instruction block until all rows of the corresponding portion of the data structure are used as input. The processor core assigns instances of the instruction and support blocks to multiple processor cores, and provides each instance of the instruction block with the corresponding portion of the data structure.
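
A rough Python analogue of the pattern (not the patented syntax handling): split the data structure into portions and run an instance of the "instruction block" repetitively over each portion on its own worker; the function names and thread-based workers are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def block(row):                                   # the instruction block: one row in, one value out
    return sum(row) / len(row)

def run_concurrent(table, n_portions=4):
    size = math.ceil(len(table) / n_portions)
    portions = [table[i:i + size] for i in range(0, len(table), size)]
    with ThreadPoolExecutor(max_workers=len(portions)) as pool:
        # each instance of the support block repeats the instruction block over its portion
        results = pool.map(lambda portion: [block(r) for r in portion], portions)
    return [value for part in results for value in part]

print(run_concurrent([(1, 2, 3), (4, 5, 6), (7, 8, 9), (10, 11, 12)], n_portions=2))
```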

IPC Classes  ?

  • G06F 9/38 - Concurrent instruction execution, e.g. pipeline, look ahead

87.

DISTRIBUTED COLUMNAR DATA SET STORAGE AND RETRIEVAL

      
Application Number US2020060379
Publication Number 2021/101798
Status In Force
Filing Date 2020-11-13
Publication Date 2021-05-27
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bowman, Brian Payton
  • Keener, Gordon Lyle
  • Knight, Richard Todd

Abstract

An apparatus includes a processor to: instantiate collection threads, data buffers of a queue, and aggregation threads; within each collection thread, assemble a row group from a subset of the multiple rows, reorganize the data values from row-wise to columnar organization, and store the row group within a data buffer of the queue; operate the buffer queue as a FIFO buffer; within each aggregation thread, retrieve multiple row groups from multiple data buffers of the queue, assemble a data set part from the multiple row groups, and transmit, to storage device(s) via a network, the data set part; and in response to each instance of retrieval of a row group from a data buffer of the buffer queue for use within an aggregation thread, analyze a level of availability of at least storage space within the node device to determine whether to dynamically adjust the quantity of data buffers of the buffer queue.

IPC Classes  ?

  • G06F 12/02 - Addressing or allocation; Relocation
  • G06F 16/22 - Indexing; Data structures therefor; Storage structures
  • G06F 16/182 - Distributed file systems
  • G06F 16/13 - File access structures, e.g. distributed indices

88.

Reducing resource consumption associated with executing a bootstrapping process on a computing device

      
Application Number 17129536
Grant Number 11016871
Status In Force
Filing Date 2020-12-21
First Publication Date 2021-05-25
Grant Date 2021-05-25
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Cannizzaro, Rocco Claudio
  • Macaro, Christian

Abstract

Resource consumption associated with executing a bootstrapping process on a computing device can be reduced. For example, a system can receive a dataset including observations. The system can then instantiate one or more thread objects configured to execute a bootstrapping process that involves multiple iterations. Each iteration can involve: determining a respective set of probabilities based on an observation distribution associated with the dataset, executing a function based on the respective set of probabilities to determine a respective metric value, and storing the respective metric value in memory. This iterative process may be faster and less computationally intensive than traditional bootstrapping approaches. After completing the iterative process, the system may access the memory to obtain the metric values, determine a distribution of metric values based on at least some of the metric values, and store the distribution of metric values in the memory for further use.
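
A minimal single-threaded sketch of the per-iteration procedure (the thread objects and memory layout from the abstract are omitted); the Dirichlet-weighted mean is an illustrative choice of observation probabilities and metric.

```python
import numpy as np

def bootstrap_metric(data, n_iter=1000, seed=7):
    rng = np.random.default_rng(seed)
    metrics = np.empty(n_iter)
    for i in range(n_iter):
        probs = rng.dirichlet(np.ones(len(data)))   # per-observation probabilities for this iteration
        metrics[i] = np.dot(probs, data)            # metric value from the weighted statistic
    return metrics                                  # the stored metric values form the distribution

dist = bootstrap_metric(np.random.default_rng(0).normal(size=500))
low, high = np.percentile(dist, [2.5, 97.5])        # e.g., a 95% interval from the distribution
```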

IPC Classes  ?

  • G06F 11/30 - Monitoring
  • G06F 11/34 - Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation
  • G06F 9/4401 - Bootstrapping
  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06F 9/38 - Concurrent instruction execution, e.g. pipeline, look ahead
  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt

89.

Atomic pool manager for a data pool using a memory slot for storing a data object

      
Application Number 16838441
Grant Number 11099899
Status In Force
Filing Date 2020-04-02
First Publication Date 2021-05-20
Grant Date 2021-08-24
Owner SAS Institute Inc. (USA)
Inventor Shorb, Charles S.

Abstract

A computing device receives, from a thread of a multi-thread application, a release message. Each of the threads indicates operation(s) on a memory associated with the application. The release message indicates that a data object used by the thread is released. The device indicates that a memory slot of a data pool is unlocked, permitting storage of an indication of a location of the data object in the memory. Each memory slot of the data pool is individually lockable, such that a locked memory slot of the data pool indicates that storing a location in the locked memory slot will not be permitted even though storing the location in an unlocked memory slot of the data pool will be permitted. The device stores, in the memory slot of the data pool, an indication of a location of the data object. The data object comprises the location of the memory slot.
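
A toy sketch of the per-slot locking idea only (not the patented manager): each memory slot has its own lock, so storing into a locked slot is refused while unlocked slots remain usable. Names and the example location are hypothetical.

```python
import threading

class SlotPool:
    def __init__(self, n_slots):
        self.slots = [None] * n_slots
        self.locks = [threading.Lock() for _ in range(n_slots)]

    def store_location(self, slot, location):
        """Store a data object's location in `slot` only if that slot is unlocked."""
        if self.locks[slot].acquire(blocking=False):
            try:
                self.slots[slot] = location
                return True
            finally:
                self.locks[slot].release()
        return False                                   # slot locked; caller tries another slot

pool = SlotPool(4)
pool.store_location(2, location=0x7F00CAFE)            # e.g., handling a release message
```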

IPC Classes  ?

  • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
  • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt

90.

Distributable event prediction and machine learning recognition system

      
Application Number 17093917
Grant Number 11010691
Status In Force
Filing Date 2020-11-10
First Publication Date 2021-05-18
Grant Date 2021-05-18
Owner SAS Institute Inc. (USA)
Inventor
  • Chen, Xu
  • Da Silva, Jorge Manuel Gomes
  • Wujek, Brett Alan

Abstract

Data is classified using semi-supervised data. A decomposition is performed to define a first decomposition matrix that includes first eigenvectors of a weight matrix, a second decomposition matrix that includes second eigenvectors of a transpose of the weight matrix, and a diagonal matrix that includes eigenvalues of the first eigenvectors. Eigenvectors are selected from the first eigenvectors to define a reduced decomposition matrix. A linear transformation matrix is computed as a function of the first decomposition matrix, the reduced decomposition matrix, the diagonal matrix, and a penalty matrix. When a rank of the linear transformation matrix is less than a number of rows of the penalty matrix, a classification matrix is computed by updating a gradient of a cost function. When the rank of the linear transformation matrix is equal to the number of rows of the penalty matrix, the classification matrix is computed using a dual formulation.

IPC Classes  ?

  • G06N 20/00 - Machine learning
  • G06F 16/28 - Databases characterised by their database models, e.g. relational or object models

91.

Automated message-based job flow resource management in container-supported many task computing

      
Application Number 17139503
Grant Number 11086608
Status In Force
Filing Date 2020-12-31
First Publication Date 2021-05-13
Grant Date 2021-08-10
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Stogner, Ronald Earl
  • Yang, Eric Jian
  • Gong, Qing

Abstract

An apparatus includes at least one processor to: within a kill container, execute a kill routine to monitor a task kill queue for storage of an execution status message indicating a level of a parameter of execution of a task routine to perform a task of a job flow, and in response to the level exceeding a threshold, store, within the task kill queue, a kill tasks request message; within a task container, in response to the kill tasks request message, cease execution of the task routine, and store, within a task queue, a task cancelation message; within a performance container, execute instructions of a performance routine to, in response to the task cancelation message, store, within a job queue, a job cancelation message; and in response to the job cancelation message, transmit an indication of cancelation of the job flow to a requesting device.

IPC Classes  ?

92.

Automated message-based job flow resource coordination in container-supported many task computing

      
Application Number 17139546
Grant Number 11137990
Status In Force
Filing Date 2020-12-31
First Publication Date 2021-05-13
Grant Date 2021-10-05
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Bequet, Henry Gabriel Victor
  • Yang, Eric Jian
  • Gong, Qing
  • Arfaoui, Kais
  • Stogner, Ronald Earl
  • Dutta, Partha

Abstract

An apparatus includes at least one processor to: parse a job flow definition for a job flow to identify an implicit expression of a data dependency arising from a data object output by a first task of the job flow and input to a second task thereof, wherein the data object output by the first task is required by the second task as an input; in response to identifying the implicit expression, derive an order of performance of the tasks of the job flow that includes performing the first task before the second task to ensure generation of the data object prior to performance of the second task; for each task, retrieve a corresponding task routine; execute the task routines in an order that follows the order of performance of the tasks; and transmit, to the requesting device via the network, an indication of successful performance of the job flow.
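
A sketch of deriving an order of performance from implicit data dependencies: if one task consumes a data object that another task produces, the producer must run first. A simple topological sort over the dependency graph gives such an order; the task and object names below are illustrative.

```python
from collections import deque

def performance_order(produces, consumes):
    """produces: {task: set of objects it outputs}; consumes: {task: set of objects it inputs}."""
    producer = {obj: t for t, objs in produces.items() for obj in objs}
    deps = {t: {producer[o] for o in objs if o in producer} for t, objs in consumes.items()}
    indeg = {t: 0 for t in produces}
    for t, ds in deps.items():
        indeg.setdefault(t, 0)
        indeg[t] += len(ds)
    ready = deque(t for t, n in indeg.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for u, ds in deps.items():
            if t in ds:
                indeg[u] -= 1
                if indeg[u] == 0:
                    ready.append(u)
    return order

# Task "second" reads "table_a", which "first" writes, so "first" is ordered before "second".
print(performance_order({"first": {"table_a"}, "second": {"report"}},
                        {"first": set(), "second": {"table_a"}}))
```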

IPC Classes  ?

93.

Distributable clustering model training system

      
Application Number 16950041
Grant Number 11055620
Status In Force
Filing Date 2020-11-17
First Publication Date 2021-05-13
Grant Date 2021-07-06
Owner SAS Institute Inc. (USA)
Inventor
  • Wang, Yingjian
  • Wright, Raymond Eugene

Abstract

A computing system trains a clustering model. (A) Beta distribution parameter values are computed for each cluster using a mass parameter value and a responsibility parameter vector of each observation vector. (B) Parameter values are computed for a normal-Wishart distribution for each observation vector included in a batch of a plurality of observation vectors. (C) Each responsibility parameter vector defined for each observation vector of the batch is updated using the beta distribution parameter values, the parameter values for the normal-Wishart distribution, and a respective observation vector of the selected batch of the plurality of observation vectors. (D) A convergence parameter value is computed. (E) (A) to (D) are repeated until the convergence parameter value indicates the responsibility parameter vector defined for each observation vector is converged. A cluster membership is determined for each observation vector using the responsibility parameter vector. The determined cluster membership is output for each observation vector.

IPC Classes  ?

94.

High dimensional to low dimensional data transformation and visualization system

      
Application Number 17069293
Grant Number 10984075
Status In Force
Filing Date 2020-10-13
First Publication Date 2021-04-20
Grant Date 2021-04-20
Owner SAS Institute Inc. (USA)
Inventor
  • Liang, Yu
  • Chaudhuri, Arin
  • Wang, Haoyu

Abstract

A computer transforms high-dimensional data into low-dimensional data. A distance is computed between a selected observation vector and each observation vector of a plurality of observation vectors, nearest neighbors are selected using the computed distances, and a first sigmoid function is applied to compute a distance similarity value between the selected observation vector and each of the selected nearest neighbors, where each of the computed distance similarity values is added to a first matrix. The process is repeated with each observation vector of the plurality of observation vectors as the selected observation vector. An optimization method is executed with an initial matrix, the first matrix, and a gradient of a second sigmoid function that computes a second distance similarity value between the selected observation vector and each of the nearest neighbors to transform each observation vector of the plurality of observation vectors into the low-dimensional space.
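
A sketch of the first stage only (building the neighbor-similarity matrix); the sigmoid parameters and k are illustrative assumptions, and the subsequent optimization into the low-dimensional space is omitted.

```python
import numpy as np

def similarity_matrix(X, k=10, scale=1.0):
    """For each observation, map distances to its k nearest neighbors through a sigmoid."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[1:k + 1]                         # k nearest, excluding self
        S[i, nbrs] = 1.0 / (1.0 + np.exp((d[i, nbrs] - d[i, nbrs].mean()) / scale))
    return S

S = similarity_matrix(np.random.default_rng(0).normal(size=(50, 5)), k=10)
```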

IPC Classes  ?

  • G06F 17/16 - Matrix or vector computation
  • G06F 16/28 - Databases characterised by their database models, e.g. relational or object models
  • G06N 20/00 - Machine learning
  • G06F 17/17 - Function evaluation by approximation methods, e.g. interpolation or extrapolation, smoothing or least mean square method
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

95.

Techniques for extracting contextually structured data from document images

      
Application Number 17083568
Grant Number 11049235
Status In Force
Filing Date 2020-10-29
First Publication Date 2021-04-15
Grant Date 2021-06-29
Owner SAS INSTITUTE INC. (USA)
Inventor
  • Wheaton, David James
  • Nadolski, William Robert
  • Goodykoontz, Heather Michelle

Abstract

Embodiments are generally directed to techniques for extracting contextually structured data from document images, such as by automatically identifying document layout, document data, and/or document metadata in a document image, for instance. Many embodiments are particularly directed to generating and utilizing a document template database for automatically extracting document image contents into a contextually structured format. For example, the document template database may include a plurality of templates for identifying/explaining key data elements in various document image formats that can be used to extract contextually structured data from incoming document images with a matching document image format. Several embodiments are particularly directed to automatically identifying and associating document metadata with corresponding document data in a document image, such as for generating a machine-facilitated annotation of the document image. In some embodiments, the machine-facilitated annotation of a document may be used to generate a template for the template database.

IPC Classes  ?

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06T 7/00 - Image analysis
  • G06F 16/81 - Indexing, e.g. XML tags; Data structures therefor; Storage structures
  • G06F 16/93 - Document management systems
  • G06F 40/284 - Lexical analysis, e.g. tokenisation or collocates
  • G06F 40/186 - Templates
  • G06F 40/169 - Annotation, e.g. comment data or footnotes
  • G06K 9/68 - Methods or arrangements for recognition using electronic means using sequential comparisons of the image signals with a plurality of reference, e.g. addressable memory
  • G06K 9/62 - Methods or arrangements for recognition using electronic means

96.

System for determining user intent from text

      
Application Number 17069128
Grant Number 10978053
Status In Force
Filing Date 2020-10-13
First Publication Date 2021-04-13
Grant Date 2021-04-13
Owner SAS Institute Inc. (USA)
Inventor
  • Smythe, Jared Michael Dean
  • Crowell, Richard Welland

Abstract

A system determines user intent from a received conversation element. A plurality of distinct intent labels are generated for the received conversation element. The generated plurality of distinct intent labels are divided into a plurality of interpretation partitions with overlapping semantic content. For each interpretation partition of the plurality of interpretation partitions, a set of maximal coherent subgroups are defined that do not disagree on labels for terms in each subgroup, a score is computed for each maximal coherent subgroup of the defined set of maximal coherent subgroups, and a maximal coherent subgroup is selected from the set of maximal coherent subgroups based on the computed score. Intent labels are aggregated from the selected maximal coherent subgroup of each interpretation partition of the plurality of interpretation partitions to define a multiple intent interpretation of the received conversation element. The defined multiple intent interpretation is output for the received conversation element.

IPC Classes  ?

  • G10L 15/18 - Speech classification or search using natural language modelling
  • G10L 13/02 - Methods for producing synthetic speech; Speech synthesisers
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog

97.

Portion of a computer screen with an icon

      
Application Number 29674055
Grant Number D0915458
Status In Force
Filing Date 2018-12-19
First Publication Date 2021-04-06
Grant Date 2021-04-06
Owner SAS Institute Inc. (USA)
Inventor
  • Chipley, Michael Ryan
  • Barlow, Steven Todd

98.

Portion of a computer screen with an animated icon

      
Application Number 29674059
Grant Number D0915459
Status In Force
Filing Date 2018-12-19
First Publication Date 2021-04-06
Grant Date 2021-04-06
Owner SAS Institute Inc. (USA)
Inventor
  • Chipley, Michael Ryan
  • Barlow, Steven Todd

99.

Distributed decision variable tuning system for machine learning

      
Application Number 17120340
Grant Number 10963802
Status In Force
Filing Date 2020-12-14
First Publication Date 2021-03-30
Grant Date 2021-03-30
Owner SAS Institute Inc. (USA)
Inventor
  • Gardner, Steven Joseph
  • Griffin, Joshua David
  • Xu, Yan
  • Gao, Yan

Abstract

A computing device selects decision variable values. A lower boundary value and an upper boundary value are defined for a decision variable. (A) A plurality of decision variable configurations is determined using a search method. The value for the decision variable is between the lower boundary value and the upper boundary value. (B) A decision variable configuration is selected. (C) A model of the model type is trained using the decision variable configuration. (D) The model is scored to compute an objective function value. (E) The computed objective function value and the selected decision variable configuration are stored. (F) (B) through (E) are repeated for a plurality of decision variable configurations. (G) The lower boundary value and the upper boundary value are updated using the stored objective function values and decision variable configurations. (A) through (F) are repeated with the lower boundary value and the upper boundary value updated in (G).
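
A hedged sketch of the boundary-update loop: sample configurations between the current bounds, train and score each (here a toy objective stands in for training and scoring a model), then tighten the bounds around the best configurations. The sampling scheme, shrink rule, and objective are illustrative assumptions.

```python
import numpy as np

def tune(objective, low, high, rounds=5, samples=20, keep=5, seed=1):
    rng = np.random.default_rng(seed)
    history = []
    for _ in range(rounds):
        configs = rng.uniform(low, high, size=(samples, len(low)))   # (A)-(B) configurations in bounds
        scores = np.array([objective(c) for c in configs])           # (C)-(E) train, score, store
        history.extend(zip(scores, configs))
        best = configs[np.argsort(scores)[:keep]]                    # best configurations this round
        low, high = best.min(axis=0), best.max(axis=0)               # (G) updated boundary values
    return min(history, key=lambda t: t[0])

best_score, best_cfg = tune(lambda c: float(((c - 0.3) ** 2).sum()),
                            low=np.zeros(2), high=np.ones(2))
```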

IPC Classes  ?

100.

Distributable event prediction and machine learning recognition system

      
Application Number 16904818
Grant Number 10956825
Status In Force
Filing Date 2020-06-18
First Publication Date 2021-03-23
Grant Date 2021-03-23
Owner SAS Institute Inc. (USA)
Inventor
  • Chen, Xu
  • Silva, Jorge Manuel Gomes Da
  • Wujek, Brett Alan

Abstract

Data is classified using semi-supervised data. A weight matrix is computed using a kernel function applied to observation vectors. A decomposition of the computed weight matrix is performed. A predefined number of eigenvectors is selected from the decomposed weight matrix to define a decomposition matrix. (A) A gradient value is computed as a function of the defined decomposition matrix, sparse coefficients, and a label vector. (B) A value of each coefficient of the sparse coefficients is updated based on the gradient value. (A) and (B) are repeated until a convergence parameter value indicates the sparse coefficients have converged. A classification matrix is defined using the converged sparse coefficients. A target variable value is determined and output for each observation vector based on the defined classification matrix to update the label vector; the target variable value represents the label for a respective unclassified observation vector.

IPC Classes  ?
