FARO Technologies, Inc.

United States of America

1-100 of 511 patent results for FARO Technologies, Inc.
Collection: United States - USPTO (excluding subsidiaries)

Date
  • New (last 4 weeks): 4
  • 2024 April (MTD): 2
  • 2024 March: 5
  • 2024 February: 5
  • 2024 January: 5

IPC Class
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques: 169
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging: 119
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements: 101
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object: 91
  • G01C 15/00 - Surveying instruments or accessories not provided for in groups G01C 1/00-G01C 13/00: 91

Status
  • Pending: 86
  • Registered / In Force: 425

1.

REALITY CAPTURE USING CLOUD BASED COMPUTER NETWORKS

      
Application Number 18364717
Status Pending
Filing Date 2023-08-03
First Publication Date 2024-04-18
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wohlfeld, Denis
  • Kappes, Steffen

Abstract

Reality capture using cloud based computer networks is provided. Techniques include receiving user input of an object to capture, the user input including a location, an accuracy category, and a size category of the object, and generating at least one option to capture the object, in response to user input. Techniques include responsive to a user selecting the at least one option to capture the object, configuring a plurality of drones with a first setting for capturing at least a first portion of the object, and configuring a scanner with a second setting for capturing at least a second portion of the object. Techniques include causing the plurality of drones to capture the first portion of the object, in response to the drones being initiated at the location and causing the scanner to capture the second portion of the object, in response to the scanner being initiated at the location.

IPC Classes

  • H04N 23/62 - Control of parameters via user interfaces
  • H04N 23/60 - Control of cameras or camera modules
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

2.

CONSTRUCTION SITE DEFECT AND HAZARD DETECTION USING ARTIFICIAL INTELLIGENCE

      
Application Number 18336872
Status Pending
Filing Date 2023-06-16
First Publication Date 2024-04-18
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wohlfeld, Denis
  • Schmitz, Evelyn

Abstract

A system and method for detecting construction site defects and hazards using artificial intelligence (AI) are provided. The system includes a movable base unit, a coordinate measurement scanner, a vision based sensor, and one or more processors. The one or more processors perform operations that include generating a two-dimensional (2D) map of the environment based at least in part on output from the coordinate measurement scanner, applying image recognition to video stream data from the vision based sensor to identify and label a defect or hazard in the video stream data, correlating a location of the defect or hazard in the video stream data with a location in the 2D map, and recording the location of the defect or hazard in the 2D map.

IPC Classes

  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06N 3/08 - Learning methods

3.

GAP FILLING FOR THREE-DIMENSIONAL DATA VISUALIZATION

      
Application Number 18447617
Status Pending
Filing Date 2023-08-10
First Publication Date 2024-03-28
Owner FARO Technologies, Inc. (USA)
Inventor
  • Campanella, Marco
  • Bank, Joachim

Abstract

Examples described herein provide a method that includes receiving three-dimensional (3D) data associated with an environment. The method further includes generating a graphical representation based at least in part on the 3D data. The method further includes filling in a gap in the graphical representation using downsampled frame buffer objects.
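For readers unfamiliar with the technique, hole filling from downsampled buffers can be sketched as a CPU analogue in NumPy: build a pyramid of progressively downsampled buffers that average only the valid pixels, then fill each empty full-resolution pixel from the first coarser level that covers it. This is only an illustrative sketch, not the implementation in the application; the pyramid depth and the (H, W, C) buffer layout are assumptions.

```python
import numpy as np

def downsample_valid(color, valid):
    """Halve resolution, averaging each 2x2 block over its valid pixels only."""
    h2, w2 = valid.shape[0] // 2, valid.shape[1] // 2
    c = color[:h2 * 2, :w2 * 2].reshape(h2, 2, w2, 2, -1).astype(float)
    v = valid[:h2 * 2, :w2 * 2].reshape(h2, 2, w2, 2).astype(float)
    summed = (c * v[..., None]).sum(axis=(1, 3))
    count = v.sum(axis=(1, 3))
    out_valid = count > 0
    out_color = np.zeros_like(summed)
    out_color[out_valid] = summed[out_valid] / count[out_valid, None]
    return out_color, out_valid

def fill_gaps(color, valid, levels=4):
    """Fill invalid pixels of a rendered (H, W, C) buffer from coarser copies."""
    pyramid = [(color.astype(float), valid.copy())]
    for _ in range(levels):
        pyramid.append(downsample_valid(*pyramid[-1]))
    filled, known = color.astype(float), valid.copy()
    for lvl_color, lvl_valid in pyramid[1:]:
        # Nearest-neighbour upsample of the coarser level back to full resolution.
        sy = np.minimum(np.arange(filled.shape[0]) * lvl_color.shape[0] // filled.shape[0],
                        lvl_color.shape[0] - 1)
        sx = np.minimum(np.arange(filled.shape[1]) * lvl_color.shape[1] // filled.shape[1],
                        lvl_color.shape[1] - 1)
        hole = ~known & lvl_valid[sy][:, sx]
        filled[hole] = lvl_color[sy][:, sx][hole]
        known |= hole
    return filled, known
```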

IPC Classes

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 3/40 - Scaling of a whole image or part thereof

4.

FEATURE EXTRACTION USING A POINT OF A COLLECTION OF POINTS

      
Application Number 18447019
Status Pending
Filing Date 2023-08-09
First Publication Date 2024-03-28
Owner FARO Technologies, Inc. (USA)
Inventor
  • Moura, Samuel
  • Rodrigues, Jaime Duarte
  • Espanha, Raphael
  • Zolfagharnasab, Hooshiar

Abstract

An example method for feature extraction includes receiving a selection of a point from a plurality of points, the plurality of points representing an object. The method further includes identifying a feature of interest for the object based at least in part on the point. The method further includes performing edge extraction on the feature of interest. The method further includes performing pre-processing on results of the edge extraction. The method further includes classifying the feature of interest based at least in part on results of the pre-processing. The method further includes constructing, based at least in part on results of the classifying, a geometric primitive or mathematical function that has a best fit to a set of points from the plurality of points associated with the feature of interest. The method further includes generating a graphical representation of the feature of interest using the geometric primitive or mathematical function.
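As a concrete illustration of the final fitting step only, a best-fit plane (one possible geometric primitive) can be computed for the points near the selected point with a centroid-plus-SVD least-squares fit. The neighbourhood radius and the choice of a plane as the primitive are assumptions for this sketch, not details from the application.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an (N, 3) array of points.
    Returns (centroid, unit normal); the plane is {x : dot(normal, x - centroid) == 0}."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[-1]

def fit_primitive_near(points, seed_point, radius=0.05):
    """Fit a plane to the points within `radius` of the user-selected point."""
    near = np.linalg.norm(points - seed_point, axis=1) < radius
    return fit_plane(points[near])

# Example: noisy samples of the z = 0 plane; the recovered normal is ~[0, 0, ±1].
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, 500),
                       rng.uniform(-1, 1, 500),
                       rng.normal(0, 1e-3, 500)])
centroid, normal = fit_primitive_near(pts, np.zeros(3), radius=0.5)
print(np.round(normal, 3))
```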

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/13 - Edge detection
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
  • G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting

5.

PHOTOSENSOR PROCESSING FOR IMPROVED LINE SCANNER PERFORMANCE

      
Application Number 18388808
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-03-21
Owner FARO Technologies, Inc. (USA)
Inventor
  • Lankalapalli, Kishore
  • Shen, Michael
  • Atwell, Paul C.
  • Macfarlane, Keith G.
  • Barba, Jacint R.
  • Dhasmana, Nitesh

Abstract

A method includes providing a measuring device having a projector, a camera with a photosensitive array, and at least one processor, projecting with the projector a line of light onto an object, capturing with the camera an image of the projected line of light on the object within a window subregion of the photosensitive array, and calculating with the at least one processor three-dimensional (3D) coordinates of points on the object based at least in part on the projected line of light and on the captured image.
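A rough sketch of how such a windowed line measurement can be turned into 3D coordinates is given below: find the sub-pixel row of the projected line in each column of the window, then intersect each pixel ray with the projected light plane. The half-maximum thresholding, the pinhole camera model, and the light-plane parameters are illustrative assumptions, not details from the application.

```python
import numpy as np

def line_peaks(window, row_offset):
    """Sub-pixel row of the projected line in each column of the windowed image.
    `window` is the intensity sub-image read out from the photosensitive array;
    `row_offset` is the first sensor row of that window. Columns in which the
    line is not visible should be discarded in practice."""
    rows = np.arange(window.shape[0])[:, None] + row_offset
    w = window.astype(float)
    w = np.where(w > 0.5 * w.max(axis=0), w, 0.0)      # keep only the peak lobe
    return (rows * w).sum(axis=0) / np.maximum(w.sum(axis=0), 1e-9)

def triangulate(cols, peak_rows, fx, fy, cx, cy, plane_point, plane_normal):
    """Intersect each pixel ray with the projected light plane (camera frame)."""
    d = np.column_stack([(cols - cx) / fx, (peak_rows - cy) / fy,
                         np.ones_like(peak_rows, dtype=float)])
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    t = (plane_point @ plane_normal) / (d @ plane_normal)
    return d * t[:, None]          # (N, 3) coordinates of points on the object
```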

IPC Classes

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • G06T 7/00 - Image analysis
  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes

6.

COMPENSATION OF THREE-DIMENSIONAL MEASURING INSTRUMENT HAVING AN AUTOFOCUS CAMERA

      
Application Number 18369658
Status Pending
Filing Date 2023-09-18
First Publication Date 2024-03-14
Owner FARO Technologies, Inc. (USA)
Inventor
  • Ossig, Martin
  • Buback, Johannes

Abstract

A 3D measuring instrument and a method of its operation are provided that include a registration camera and an autofocus camera. The method includes capturing, with the instrument in a first pose, a first registration image of a first plurality of points with the registration camera and a first image with the autofocus camera. A plurality of three-dimensional (3D) coordinates of points are determined based on the first image. A second registration image of a second plurality of points is captured in a second pose, and a focal length of the autofocus camera is adjusted. A second surface image is captured with the autofocus camera having the adjusted focal length. A compensation parameter is determined based in part on the captured second surface image. The determined compensation parameter is stored.

IPC Classes

  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/571 - Depth or shape recovery from multiple images from focus
  • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
  • H04N 23/67 - Focus control based on electronic image sensor signals

7.

THREE-DIMENSIONAL MEASUREMENT DEVICE

      
Application Number 18389507
Status Pending
Filing Date 2023-11-14
First Publication Date 2024-03-14
Owner FARO Technologies, Inc. (USA)
Inventor
  • Döring, Daniel
  • Debitsch, Rasmus
  • Hillebrand, Gerrit
  • Ossig, Martin

Abstract

A method and system of correcting a point cloud is provided. The method includes selecting a region within the point cloud. At least two objects within the region are identified. The at least two objects are re-aligned. At least a portion of the point cloud is aligned based at least in part on the realignment of the at least two objects.

IPC Classes

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation

8.

GLOBAL OPTIMIZATION METHODS FOR MOBILE COORDINATE SCANNERS

      
Application Number 18356871
Status Pending
Filing Date 2023-07-21
First Publication Date 2024-02-29
Owner FARO Technologies, Inc. (USA)
Inventor
  • Frank, Aleksej
  • Waheed, Mufassar
  • Wolke, Matthias
  • Brenner, Mark

Abstract

A mobile three-dimensional (3D) measuring system includes a 3D measuring device configured to capture 3D data in a multi-level architecture, and an orientation sensor configured to estimate an altitude. One or more processing units coupled with the 3D measuring device and the orientation sensor perform a method that includes receiving a first portion of the 3D data captured by the 3D measuring device. The method further includes determining a level index based on the altitude. The level index indicates a level of the multi-level architecture at which the first portion is captured. The level index is associated with the first portion. Further, a map of the multi-level architecture is generated using the first portion, where the generating comprises registering the first portion with a second portion of the 3D data responsive to the level index of the first portion being equal to the level index of the second portion.
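The level-index gating described above can be sketched in a few lines; the base altitude and the constant floor height are illustrative parameters, not values from the application.

```python
def level_index(altitude_m, base_altitude_m=0.0, floor_height_m=3.0):
    """Map an estimated altitude to the index of a level of the building."""
    return round((altitude_m - base_altitude_m) / floor_height_m)

def should_register(portion_a, portion_b):
    """Only register two scan portions captured on the same level."""
    return portion_a["level"] == portion_b["level"]

first = {"level": level_index(3.1)}     # first floor
second = {"level": level_index(2.9)}    # also rounds to the first floor
third = {"level": level_index(0.4)}     # ground floor
print(should_register(first, second))   # True  -> the portions may be registered together
print(should_register(first, third))    # False -> different levels, skip registration
```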

IPC Classes

  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

9.

COORDINATE MEASUREMENT SYSTEM WITH AUXILIARY AXIS

      
Application Number 18386099
Status Pending
Filing Date 2023-11-01
First Publication Date 2024-02-22
Owner FARO Technologies, Inc. (USA)
Inventor
  • Lankalapalli, Kishore
  • Steffey, Kenneth
  • Macfarlane, Keith G.
  • Atwell, Paul C.
  • Stanescu, Dragos M.
  • Creachbaum, John Lucas
  • Bailey, Brent
  • Mogensen, Matthew

Abstract

According to some aspects of the invention, auxiliary axis measurement systems for determining three-dimensional coordinates of an object are provided as shown and described herein. According to some aspects of the invention, methods for operating auxiliary axis measurement systems for determining three-dimensional coordinates of an object are provided as shown and described herein.

IPC Classes

  • G05B 19/401 - Numerical control (NC), i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01B 21/04 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points

10.

SYSTEM AND METHOD OF SCANNING AN ENVIRONMENT AND GENERATING TWO DIMENSIONAL IMAGES OF THE ENVIRONMENT

      
Application Number 18385833
Status Pending
Filing Date 2023-10-31
First Publication Date 2024-02-22
Owner FARO Technologies, Inc. (USA)
Inventor
  • Frank, Aleksej
  • Wolke, Matthias
  • Zweigle, Oliver

Abstract

A system and method for scanning an environment and generating an annotated 2D map is provided. The method includes acquiring, via a 2D scanner, a plurality of 2D coordinates on object surfaces in the environment, the 2D scanner having a light source and an image sensor, the image sensor being arranged to receive light reflected from the object points. A first 360° image is acquired at a first position of the environment, via a 360° camera having a plurality of cameras and a controller, the controller being operable to merge the images acquired by the plurality of cameras to generate an image having a 360° view, the 360° camera being movable from the first to a second position. A 2D map is generated based at least in part on the plurality of two-dimensional coordinates of points. The first 360° image is integrated with the 2D map.

IPC Classes

  • G06T 17/05 - Geographic models
  • G06T 1/00 - General purpose image data processing
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
  • H04N 13/282 - Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems

11.

PHOTOGRAMMETRY SYSTEM FOR GENERATING STREET EDGES IN TWO-DIMENSIONAL MAPS

      
Application Number 18359603
Status Pending
Filing Date 2023-07-26
First Publication Date 2024-02-15
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wolke, Matthias
  • Sharma, Tharesh

Abstract

A computer-implemented method is provided that includes retrieving at least one selected image from a plurality of aerial images of an environment, the at least one selected image comprising surface regions that are concurrently in a three-dimensional (3D) point cloud of the environment. The method further includes detecting areas of the surface regions in the at least one selected image, such that coordinates of the areas of the surface regions are extracted from the at least one selected image. The method further includes comparing the at least one selected image to the 3D point cloud to align common locations in both the at least one selected image and the 3D point cloud. The method further includes displaying an integration of a drawing of the coordinates of the areas of the surface regions in a representation of the 3D point cloud.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 5/00 - Image enhancement or restoration
  • G06T 17/05 - Geographic models
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

12.

DRONE DATA COLLECTION OPTIMIZATION FOR EVIDENCE RECORDING

      
Application Number 18356850
Status Pending
Filing Date 2023-07-21
First Publication Date 2024-02-15
Owner FARO Technologies, Inc. (USA)
Inventor Sharma, Tharesh

Abstract

A computer-implemented method is provided that includes causing an aerial vehicle to scan an environment in a predesignated pattern, such that a first set of images are captured. The method further includes detecting an emergency scene in the first set of images of the environment. The method further includes determining locations at which the aerial vehicle is to capture a second set of images of the emergency scene in the environment. The method further includes causing the aerial vehicle to acquire the second set of images at the locations. The method further includes determining selected images of the second set of images focused on the emergency scene. The method further includes extracting the selected images from the second set of images, the selected images comprising a representation of the emergency scene.

IPC Classes

  • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
  • G06V 20/17 - Terrestrial scenes taken from planes or by drones
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

13.

REMOVING REFLECTION ARTIFACTS FROM POINT CLOUDS

      
Application Number 18356864
Status Pending
Filing Date 2023-07-21
First Publication Date 2024-02-15
Owner FARO Technologies, Inc. (USA)
Inventor
  • Azam, Raza Ul
  • Dube-Dallaire, Mathieu
  • Pompe, Daniel
  • Ostapchuk, Vitaliy
  • Kalburgi, Sagar

Abstract

A computer-implemented method is provided that includes detecting at least one reflective surface in at least one two-dimensional (2D) image of an environment. The method further includes generating bounding coordinates encompassing the at least one reflective surface in the 2D image. The method further includes projecting the bounding coordinates of the 2D image into a three-dimensional (3D) space of the environment. The method further includes identifying a reflection artifact encompassed by the bounding coordinates in the 3D space. The method further includes removing the reflection artifact identified in the bounding coordinates.
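One way the projected bounding coordinates could be used to cull reflection artifacts is sketched below: project each 3D point into the image with a pinhole model and drop points that fall inside the mirror's bounding box but lie farther away than the mirror surface. The pinhole model, the single `mirror_range` value, and the "farther than the mirror" test are assumptions for this sketch, not the method claimed.

```python
import numpy as np

def remove_reflection_artifacts(points, R, t, K, bbox, mirror_range):
    """Drop points that project inside the mirror's 2D bounding box but lie
    farther from the camera than the mirror surface itself.

    points: (N, 3) world coordinates; R, t: world-to-camera rotation/translation;
    K: 3x3 pinhole intrinsics; bbox: (umin, vmin, umax, vmax) in pixels."""
    pc = points @ R.T + t                       # camera-frame coordinates
    in_front = pc[:, 2] > 0
    uvw = pc @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]               # perspective division
    umin, vmin, umax, vmax = bbox
    in_box = ((uv[:, 0] >= umin) & (uv[:, 0] <= umax) &
              (uv[:, 1] >= vmin) & (uv[:, 1] <= vmax))
    behind_mirror = np.linalg.norm(pc, axis=1) > mirror_range
    artifact = in_front & in_box & behind_mirror
    return points[~artifact]
```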

IPC Classes

14.

MACHINE LEARNING-BASED CAMERA POSITIONING

      
Application Number 18337900
Status Pending
Filing Date 2023-06-20
First Publication Date 2024-01-25
Owner FARO Technologies, Inc. (USA)
Inventor
  • Sharma, Tharesh
  • Balatzis, Georgios

Abstract

Examples described herein provide a computer-implemented method that includes receiving a video stream from a camera. The method further includes detecting, within the video stream, an object of interest using a first trained machine learning model. The method further includes, responsive to determining that a confidence score associated with the object of interest fails to satisfy a threshold, determining, using a second trained machine learning model, a direction to move the camera to cause the confidence score to satisfy the threshold. The method further includes presenting an indication of the direction to move the camera to cause the confidence score to satisfy the threshold.

IPC Classes

  • H04N 23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
  • G06V 10/20 - Image preprocessing
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06T 7/70 - Determining position or orientation of objects or cameras

15.

SYSTEM AND METHOD OF IMPROVING LASER SCANNER UNAMBIGUITY

      
Application Number 17813631
Status Pending
Filing Date 2022-07-20
First Publication Date 2024-01-25
Owner FARO Technologies, Inc. (USA)
Inventor
  • Ossig, Martin
  • Horvath, Oswin

Abstract

A system and method for determining a distance is provided. The system includes a scanner that captures a scan-point by emitting a light having a base frequency and at least one measurement frequency and receiving a reflection of the light. Processors determine the distance to the scan-point by using a method that comprises: generating a signal in response to receiving the reflection of light; determining a first distance to the scan-point based on a phase-shift of the signal and the measurement frequency; determining a second distance and a third distance based on a phase-shift of the signal determined using a Fourier transform at the measurement frequency on a pair of adjacent half-cycles; determining a corrected second distance and a corrected third distance by compensating for an error in the second distance and third distance by performing the Fourier transform on the pair of adjacent half-cycles.
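The basic single-frequency phase-to-distance relation underlying such a measurement is d = c·Δφ / (4π·f). The sketch below estimates the phase with a single-bin discrete Fourier transform and converts it to a distance; it deliberately omits the half-cycle correction the abstract describes, and the sample rate and tone frequency are illustrative values only.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def phase_at(signal, sample_rate, freq):
    """Phase of `signal` at `freq`, via a single-bin discrete Fourier transform."""
    n = np.arange(len(signal))
    return np.angle(np.sum(signal * np.exp(-2j * np.pi * freq * n / sample_rate)))

def phase_to_distance(phase, freq):
    """Round-trip phase shift to one-way distance: d = c * phi / (4 * pi * f).
    Unambiguous only within half a modulation wavelength, c / (2 * f)."""
    return C * (phase % (2 * np.pi)) / (4 * np.pi * freq)

# Synthetic example (illustrative numbers): a 100 MHz tone reflected from a
# target 0.6 m away returns delayed by 2 * d / c seconds.
f_meas, fs, d_true = 100e6, 2e9, 0.6
t = np.arange(2000) / fs
tx = np.cos(2 * np.pi * f_meas * t)
rx = np.cos(2 * np.pi * f_meas * (t - 2 * d_true / C))
dphi = phase_at(tx, fs, f_meas) - phase_at(rx, fs, f_meas)
print(phase_to_distance(dphi, f_meas))   # ~0.6
```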

IPC Classes

  • G01S 7/4915 - Time delay measurement, e.g. operational details for pixel components; Phase measurement
  • G01S 17/36 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

16.

CORRECTION OF CURRENT SCAN DATA USING PRE-EXISTING DATA

      
Application Number 18336867
Status Pending
Filing Date 2023-06-16
First Publication Date 2024-01-18
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wohlfeld, Denis
  • Schmitz, Evelyn

Abstract

A system and method for measuring coordinate values of an environment is provided. The system includes a coordinate measurement scanner that includes a light source that steers a beam of light to illuminate object points in the environment, and an image sensor arranged to receive light reflected from the object points to determine coordinates of the object points in the environment. The system also includes one or more processors for performing a method that includes receiving a previously generated map of the environment and causing the scanner to measure a plurality of coordinate values as the scanner is moved through the environment, the coordinate values forming a point cloud. The plurality of coordinate values are registered with the previously generated map into a single frame of reference. A current map of the environment is generated based at least in part on the previously generated map and the point cloud.

IPC Classes

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/00 - Image analysis
  • G06F 30/13 - Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

17.

ARTIFACT FILTERING USING ARTIFICIAL INTELLIGENCE

      
Application Number 18339620
Status Pending
Filing Date 2023-06-22
First Publication Date 2024-01-04
Owner FARO Technologies, Inc. (USA)
Inventor
  • Bergmann, Louis
  • Demkiv, Vadim
  • Flohr, Daniel

Abstract

A system and a method for removing artifacts from 3D coordinate data are provided. The system includes one or more processors and a 3D measuring device. The one or more processors are operable to receive training data and train the 3D measuring device to identify artifacts by analyzing the training data. The one or more processors are further operable to identify artifacts in live data based on the training. The one or more processors are further operable to generate clear scan data by filtering the artifacts from the live data and to output the clear scan data.

IPC Classes

  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06V 20/64 - Three-dimensional objects
  • G06V 10/30 - Noise filtering
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting

18.

CAPTURING ENVIRONMENTAL SCANS USING LANDMARKS BASED ON SEMANTIC FEATURES

      
Application Number 18469258
Status Pending
Filing Date 2023-09-18
First Publication Date 2024-01-04
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brenner, Mark
  • Frank, Aleksej
  • Ramadneh, Ahmad
  • Zweigle, Oliver

Abstract

A method is provided that includes recording a landmark at a first scan position of a scanner, the landmark based at least in part on a semantic feature of scan data captured by the scanner. The semantic feature is identified using line-segments of the scan data. The method further includes capturing, by the scanner while moving through the environment, additional scan data at a second scan position. The method further includes, responsive to the scanner returning to the first scan position associated with the landmark, computing a measurement error. The method further includes correcting, using the measurement error, at least a portion of the scan data or the additional scan data.

IPC Classes

  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G01S 17/04 - Systems determining the presence of a target
  • G01S 7/497 - Means for monitoring or calibrating

19.

IMAGE LOCALIZATION USING A DIGITAL TWIN REPRESENTATION OF AN ENVIRONMENT

      
Application Number 18337878
Status Pending
Filing Date 2023-06-20
First Publication Date 2023-12-28
Owner FARO Technologies, Inc. (USA)
Inventor Wohlfeld, Denis

Abstract

Examples described herein provide a method that includes capturing, using a camera, a first image of an environment. The method further includes performing, by a processing system, a first positioning to establish a position of the first image in a layout of the environment. The method further includes detecting, by the processing system, a feature in the first image. The method further includes performing, by the processing system, a second positioning based at least in part on the feature to refine the position of the first image in the layout. The method further includes capturing, using the camera, a second image of the environment and automatically registering the second image to the layout. The method further includes generating a digital twin representation of the environment using the first image based at least in part on the refined position of the first image in the layout and using the second image.

IPC Classes

  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

20.

ON-SITE COMPENSATION OF MEASUREMENT DEVICES

      
Application Number 18330685
Status Pending
Filing Date 2023-06-07
First Publication Date 2023-12-14
Owner FARO Technologies, Inc. (USA)
Inventor
  • Parian, Jafar Amiri
  • Ossig, Martin
  • Kaabi, Hani
  • Buback, Johannes
  • Hargart, Fabian

Abstract

A system includes one or more processors that are configured to compensate a measurement tool by performing a method. The method includes capturing a first data using the measurement tool. The method further includes capturing a second data using the measurement tool. The method further includes detecting a first natural feature in the first data. The method further includes computing a difference in positions of the first natural feature in the first data and the second data respectively. The method further includes computing a compensation parameter to adjust the measurement tool based on the difference computed.

IPC Classes

  • G01C 25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
  • G01C 15/00 - Surveying instruments or accessories not provided for in groups
  • G01C 15/06 - Surveyors' staffs; Movable markers
  • G06F 18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns

21.

SEGMENTATION OF COMPUTED TOMOGRAPHY VOXEL DATA USING MACHINE LEARNING

      
Application Number 18326380
Status Pending
Filing Date 2023-05-31
First Publication Date 2023-12-07
Owner FARO Technologies, Inc. (USA)
Inventor
  • Stiebeiner, Ariane
  • Balatzis, Georgios
  • Vekariya, Vivek Vrujlal

Abstract

Examples described herein provide a method that includes creating two-dimensional (2D) slices from a plurality of computed tomography (CT) voxel data sets. The method further includes adding artificial noise to the 2D slices to generate artificially noisy 2D slices. The method further includes creating patches from the 2D slices and the artificially noisy 2D slices. The method further includes training an autoencoder using the patches.
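The data-preparation steps (slicing, noising, patching) can be sketched as follows; the axial slicing direction, the patch size, and the Gaussian noise model are assumptions, and the autoencoder training itself is omitted.

```python
import numpy as np

def make_training_patches(volume, patch=32, noise_sigma=0.05, rng=None):
    """Slice a CT voxel volume into 2D images, add Gaussian noise, and cut
    matching (noisy, clean) patch pairs for training a denoising autoencoder."""
    rng = np.random.default_rng(0) if rng is None else rng
    clean, noisy = [], []
    for z in range(volume.shape[0]):                    # axial 2D slices
        s = volume[z].astype(np.float32)
        n = s + rng.normal(0.0, noise_sigma, s.shape).astype(np.float32)
        for y in range(0, s.shape[0] - patch + 1, patch):
            for x in range(0, s.shape[1] - patch + 1, patch):
                clean.append(s[y:y + patch, x:x + patch])
                noisy.append(n[y:y + patch, x:x + patch])
    return np.stack(noisy), np.stack(clean)

noisy, clean = make_training_patches(np.random.rand(64, 128, 128))
print(noisy.shape, clean.shape)   # (1024, 32, 32) each
```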

IPC Classes

  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06T 5/00 - Image enhancement or restoration

22.

SYSTEM AND METHOD OF MEASURING GAPS

      
Application Number 18328864
Status Pending
Filing Date 2023-06-05
First Publication Date 2023-12-07
Owner FARO Technologies, Inc. (USA)
Inventor Edwards, Michelle

Abstract

A method for measuring gaps between material layers includes inserting a probe tip within a through-hole defined in a structural component. The probe tip is arranged at the end of a probe assembly attached to an articulated arm coordinate measuring machine (AACMM). The method further includes contacting the probe tip with a hole surface of the through-hole. The method further includes translating the probe tip along the hole surface in a direction parallel to an axis through the through-hole. The probe tip passes over a gap along the through-hole. The method further includes measuring a radial position of the probe tip during the translation along the hole surface and across the gap, including a deflection of the radial position of the probe tip as the probe tip crosses the gap. The method further includes calculating a gap size of the gap based on the deflection and a size of the probe tip.
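One plausible geometric relation between the measured deflection and the gap size, assuming a spherical probe tip that stays in contact with both gap edges while bridging the gap (an illustrative model, not necessarily the relation used in the application):

```python
import math

def gap_width(deflection, tip_radius):
    """Gap width from the radial drop of a spherical probe tip bridging the gap.

    Resting on the surface the tip centre sits tip_radius above it; bridging
    the gap on both edges it sits sqrt(r**2 - (w/2)**2) above, so
    d = r - sqrt(r**2 - (w/2)**2)  =>  w = 2 * sqrt(d * (2 * r - d))."""
    d, r = deflection, tip_radius
    if not 0.0 < d <= r:
        raise ValueError("deflection must be between 0 and the tip radius")
    return 2.0 * math.sqrt(d * (2.0 * r - d))

# A 3 mm diameter tip that drops 0.1 mm indicates a gap of about 1.08 mm.
print(round(gap_width(0.1, 1.5), 3))
```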

IPC Classes

  • G01B 5/012 - Contact-making feeler heads therefor

23.

CALIBRATING SYSTEM FOR COLORIZING POINT-CLOUDS

      
Application Number 18449934
Status Pending
Filing Date 2023-08-15
First Publication Date 2023-11-30
Owner FARO Technologies, Inc. (USA)
Inventor
  • Parian, Jafar Amiri
  • Ossig, Martin
  • Kaabi, Hani

Abstract

A system includes a three-dimensional (3D) scanner that captures a 3D point cloud corresponding to one or more objects in a surrounding environment. The system further includes a camera that captures a control image by capturing a plurality of images of the surrounding environment, and an auxiliary camera configured to capture an ultrawide-angle image of the surrounding environment. One or more processors of the system colorize the 3D point cloud using the ultrawide-angle image by mapping the ultrawide-angle image to the 3D point cloud. The system performs a limited system calibration before colorizing each 3D point cloud, and a periodic full system calibration before/after a plurality of 3D point clouds are colorized.
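The mapping step can be illustrated with a simple sketch that assumes the ultrawide-angle image has been converted to an equirectangular panorama already registered to the scanner frame (which is what the calibration establishes); that assumption, and the nearest-pixel sampling, are simplifications.

```python
import numpy as np

def colorize(points, panorama):
    """Assign each 3D point (scanner frame) the color of the panorama pixel it
    projects to. points: (N, 3); panorama: (H, W, 3) equirectangular image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.maximum(np.linalg.norm(points, axis=1), 1e-12)
    azimuth = np.arctan2(y, x)                            # [-pi, pi)
    elevation = np.arcsin(np.clip(z / r, -1.0, 1.0))      # [-pi/2, pi/2]
    h, w = panorama.shape[:2]
    u = ((azimuth + np.pi) / (2 * np.pi) * (w - 1)).round().astype(int)
    v = ((np.pi / 2 - elevation) / np.pi * (h - 1)).round().astype(int)
    return panorama[v, u]                                 # (N, 3) colors
```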

IPC Classes

  • G06T 7/90 - Determination of colour characteristics
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

24.

TRACKING DATA ACQUIRED BY COORDINATE MEASUREMENT DEVICES THROUGH A WORKFLOW

      
Application Number 18362477
Status Pending
Filing Date 2023-07-31
First Publication Date 2023-11-23
Owner FARO Technologies, Inc. (USA)
Inventor
  • Ossig, Martin
  • Horvath, Oswin
  • Flohr, Daniel

Abstract

A method is provided that includes providing a database for storing meta-data that describes steps in a workflow and an order of the steps in the workflow. The meta-data includes, for each of the steps: a reference to an input data file for the step; a description of a transaction performed at the step; and a reference to an output data file generated by the step based at least in part on applying the transaction to the input data file. Data that includes meta-data for a step in the workflow is received and the data is stored in the database. A trace of the workflow is generated based at least in part on contents of the database. The generating is performed in response to receiving a request from a requestor for the trace of the workflow. At least a subset of the trace is output to the requestor.
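A minimal sketch of such a meta-data store, using SQLite; the table layout, field names, and file names are illustrative assumptions rather than the schema described in the application.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE workflow_steps (
                  step_order  INTEGER,
                  input_file  TEXT,
                  step_action TEXT,
                  output_file TEXT)""")

def record_step(order, input_file, action, output_file):
    """Store the meta-data of one workflow step."""
    db.execute("INSERT INTO workflow_steps VALUES (?, ?, ?, ?)",
               (order, input_file, action, output_file))

def trace():
    """Return the workflow trace: the steps in order, with their data lineage."""
    return db.execute("SELECT step_order, input_file, step_action, output_file "
                      "FROM workflow_steps ORDER BY step_order").fetchall()

record_step(1, "scan_001_raw.xyz", "registration", "scan_001_registered.xyz")
record_step(2, "scan_001_registered.xyz", "colorization", "scan_001_color.xyz")
print(trace())
```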

IPC Classes

  • G06F 21/64 - Protecting data integrity, e.g. using checksums, certificates or signatures
  • G06Q 10/083 - Shipping
  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • G06F 21/60 - Protecting data
  • H04L 9/00 - Arrangements for secret or secure communications; Network security protocols

25.

LASER SCANNER FOR VERIFYING POSITIONING OF COMPONENTS OF ASSEMBLIES

      
Application Number 18129358
Status Pending
Filing Date 2023-03-31
First Publication Date 2023-10-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Chan, John
  • Haedicke, Udo
  • Dubé-Dallaire, Mathieu
  • Bédard, Renaud Gaboriault

Abstract

Examples described herein provide a method that includes receiving, from a camera, a first image captured at a first location of an environment. The method further includes receiving, by a three-dimensional (3D) coordinate measurement device, first 3D coordinate data captured at the first location of the environment. The method further includes receiving, from the camera, a second image captured at a second location of the environment. The method further includes detecting, by a processing system, first features of the first image and second features of the second image. The method further includes determining, by the processing system, whether a correspondence exists between the first image and the second image. The method further includes, responsive to determining that the correspondence exists between the first image and the second image, causing the 3D coordinate measurement device to capture, at the second location, second 3D coordinate data.

IPC Classes

  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00

26.

CAPTURING THREE-DIMENSIONAL REPRESENTATION OF SURROUNDINGS USING MOBILE DEVICE

      
Application Number 18129573
Status Pending
Filing Date 2023-03-31
First Publication Date 2023-10-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Waheed, Mufassar
  • Brenner, Mark
  • Wolke, Matthias
  • Frank, Aleksej

Abstract

A mobile three-dimensional (3D) measuring system includes a 3D measuring device comprising a first sensor and a second sensor. The 3D measuring system further includes a computing system coupled with the 3D measuring device. A computing device is coupled with the computing system. The 3D measuring device continuously transmits a first data from the first sensor, and a second data from the second sensor to the computing system as it is moved in an environment. The computing system generates a 3D point cloud representing the environment. The computing system generates a 2D projection corresponding to the 3D point cloud. The computing device displays the 2D projection as a live feedback of a movement of the 3D measuring device.
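The 2D projection used for live feedback could be as simple as a top-down occupancy count, sketched below; the cell size and the choice of a top-down view are assumptions for illustration.

```python
import numpy as np

def top_down_projection(points, cell=0.05):
    """Project a 3D point cloud onto a 2D occupancy grid (top-down view) that
    can be streamed to a display as live feedback. Returns the grid and the
    world coordinates of its origin; `cell` is the grid resolution in metres."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    ij = np.floor((xy - origin) / cell).astype(int)
    grid = np.zeros(ij.max(axis=0) + 1, dtype=np.uint32)
    np.add.at(grid, (ij[:, 0], ij[:, 1]), 1)   # count points per cell
    return grid, origin
```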

IPC Classes

  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

27.

SENSOR FIELD-OF-VIEW MANIPULATION

      
Application Number 18131330
Status Pending
Filing Date 2023-04-05
First Publication Date 2023-10-12
Owner FARO Technologies, Inc. (USA)
Inventor Waheed, Mufassar

Abstract

A mobile 3D measuring system includes a 3D measuring device comprising a sensor that emits a plurality of scan lines in a field of view of the sensor. The 3D system further includes a field of view manipulator coupled with the 3D measuring device, the field of view manipulator comprising a passive optic element that redirects a first scan line from the plurality of scan lines. The 3D system further includes a computing system coupled with the 3D measuring device. The 3D measuring device continuously transmits a captured data from the sensor to the computing system as the 3D measuring device is moved in an environment, the captured data is based on receiving reflections corresponding to the plurality of scan lines, including a reflection of the first scan line that is redirected. The computing system generates a 3D point cloud representing the environment based on the captured data.

IPC Classes

  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements

28.

SUPPORT SYSTEM FOR MOBILE COORDINATE SCANNER

      
Application Number 18131526
Status Pending
Filing Date 2023-04-06
First Publication Date 2023-10-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Waheed, Mufassar
  • Wolke, Matthias
  • Frank, Aleksej
  • Brenner, Mark

Abstract

A mobile three-dimensional (3D) measuring system includes a 3D measuring device and a support apparatus. The 3D measuring device is coupled to the support apparatus. The support apparatus includes a pole mount that includes a gimbal at the top of the pole mount, wherein the 3D measuring device is attached to the gimbal. The support apparatus further includes a counterweight at the bottom of the pole mount, the counterweight matching a weight of the 3D measuring device.

IPC Classes

  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

29.

GENERATING A DIGITAL TWIN REPRESENTATION OF AN ENVIRONMENT OR OBJECT

      
Application Number 18124318
Status Pending
Filing Date 2023-03-21
First Publication Date 2023-10-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Zweigle, Oliver
  • Frank, Aleksej
  • Böehret, Tobias
  • Wolke, Matthias

Abstract

Examples described herein provide a method that includes communicatively connecting a camera to a processing system. The processing system includes a light detecting and ranging (LIDAR) sensor. The method further includes capturing, by the processing system, three-dimensional (3D) coordinate data of an environment using the LIDAR sensor while the processing system moves through the environment. The method further includes capturing, by the camera, a panoramic image of the environment. The method further includes associating the panoramic image of the environment with the 3D coordinate data of the environment to generate a dataset for the environment. The method further includes generating a digital twin representation of the environment using the dataset for the environment.

IPC Classes

  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
  • G06T 7/60 - Analysis of geometric attributes
  • G06T 3/40 - Scaling of a whole image or part thereof

30.

LASER SCANNER FOR VERIFYING POSITIONING OF COMPONENTS OF ASSEMBLIES

      
Application Number 18131596
Status Pending
Filing Date 2023-04-06
First Publication Date 2023-10-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Chan, John
  • Haedicke, Udo
  • Dubé-Dallaire, Mathieu
  • Bédard, Renaud Gaboriault

Abstract

Examples described herein provide a method that includes receiving a model corresponding to an assembly. The method further includes defining an object of interest in the model. The method further includes receiving a point cloud generated based on data obtained by scanning the assembly using a laser scanner. The method further includes aligning the point cloud to the model. The method further includes determining whether a component corresponding to the object of interest is located correctly relative to the assembly based at least in part on the point cloud aligned to the model. The method further includes, responsive to determining that the component is not located correctly, taking a corrective action.

IPC Classes

  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

31.

METHOD OF REMOTELY CONTROLLING A LASER TRACKER USING A MOBILE COMPUTING DEVICE

      
Application Number 18332175
Status Pending
Filing Date 2023-06-09
First Publication Date 2023-10-05
Owner FARO Technologies, Inc. (USA)
Inventor
  • Nagalla, Kalyan
  • Zhang, Yicheng

Abstract

A laser tracker system and a method of operating the laser tracker system are provided. The method includes providing a mobile computing device coupled for communication to a computer network and identifying, with the mobile computing device, at least one laser tracker device on the computer network, the at least one laser tracker device including a first laser tracker device. The mobile computing device is connected to the first laser tracker device to transmit signals therebetween via the computer network in response to a first input from a user. One or more control functions are performed on the first laser tracker device in response to one or more second inputs from the user, wherein at least one of the one or more control functions includes selecting with the mobile computing device a retroreflective target and locking a light beam of the first laser tracker device on the retroreflective target.

IPC Classes

  • G01S 17/66 - Tracking systems using electromagnetic waves other than radio waves
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • G01C 15/00 - Surveying instruments or accessories not provided for in groups
  • G01S 7/00 - Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
  • G01T 7/00 - Details of radiation-measuring instruments
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

32.

SOFTWARE CAMERA VIEW LOCK ALLOWING EDITING OF DRAWING WITHOUT ANY SHIFT IN THE VIEW

      
Application Number 18122000
Status Pending
Filing Date 2023-03-15
First Publication Date 2023-09-21
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brown, Matthew T.
  • White, Derik J.

Abstract

A software camera lock is provided. A first image is displayed as a 3D image, wherein a semi-transparent second image overlays the first image. A software camera is inserted at a fixed location in the 3D image, wherein the software camera provides a field-of-view (FOV) displaying a portion of the 3D image, the FOV displaying a first reference, and the second image displaying a second reference that represents the first reference and comprises an object. The software camera is locked in the FOV using a lock-software-camera mode. A model is inserted in the first image to match a location of the object in the second image, wherein locking the software camera in the FOV causes the FOV of the first image to be maintained in place as the model is moved in the first image to match the location of the object in the second image.

IPC Classes

  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G06V 10/24 - Aligning, centring, orientation detection or correction of the image
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders

33.

AUGMENTED AND VIRTUAL REALITY

      
Application Number 18134670
Status Pending
Filing Date 2023-04-14
First Publication Date 2023-09-21
Owner FARO Technologies, Inc. (USA)
Inventor
  • Heinen, Simon
  • Tholen, Lars
  • Akbari-Hochberg, Mostafa
  • Abidin, Gloria

Abstract

A method for creating an augmented reality scene is provided. The method comprises, by a computing device with a processor and a memory: receiving first video image data and second video image data; calculating an error value for a current pose between the two images by comparing the pixel colors in the first video image data and the second video image data; warping pixel coordinates into the second video image data through the use of a map of depth hypotheses for each pixel; varying the pose between the first video image data and the second video image data to find a warp that corresponds to a minimum error value; and calculating, using the estimated poses, a new depth measurement for each pixel that is visible in both the first video image data and the second video image data.
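The core of such direct image alignment is evaluating a photometric error for a candidate pose: back-project every pixel of the first frame with its depth hypothesis, transform it by the pose, re-project it into the second frame, and compare colors. A rough NumPy sketch follows; the intrinsics matrix K and float-valued images are assumptions, and the pose search and depth update are left out.

```python
import numpy as np

def photometric_error(img1, depth1, img2, K, R, t):
    """Photometric error of a candidate pose (R, t) between two frames.
    img1, img2: float images; depth1: per-pixel depth hypotheses for img1."""
    h, w = img1.shape[:2]
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    v, u = np.mgrid[0:h, 0:w]
    # Back-project every pixel of frame 1 using its depth hypothesis.
    X = np.stack([(u - cx) / fx * depth1, (v - cy) / fy * depth1, depth1], axis=-1)
    # Transform into frame 2 and re-project (warp).
    Xc = X @ R.T + t
    u2 = fx * Xc[..., 0] / Xc[..., 2] + cx
    v2 = fy * Xc[..., 1] / Xc[..., 2] + cy
    valid = (Xc[..., 2] > 0) & (u2 >= 0) & (u2 < w) & (v2 >= 0) & (v2 < h)
    diff = img1[valid] - img2[v2[valid].astype(int), u2[valid].astype(int)]
    return np.mean(diff ** 2)   # minimised over candidate poses (R, t)
```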

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/04 - Texture mapping
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation

34.

WORKSTATION WITH DYNAMIC MACHINE VISION SENSING AND AUGMENTED REALITY

      
Application Number 18075560
Status Pending
Filing Date 2022-12-06
First Publication Date 2023-09-14
Owner FARO Technologies, Inc. (USA)
Inventor
  • Balatzis, Georgios
  • Müller, Michael

Abstract

A computer-implemented method includes identifying, by a controller, a part that is being transported to a workstation. The method further includes capturing a 3D scan of the part using a dynamic machine vision sensor. The method further includes validating the part by comparing the 3D scan of the part with a 3D model of the part. The method further includes, based on a determination that the part is valid, projecting a hologram that includes a sequence of assembly steps associated with the part. The method further includes, upon completion of the sequence of assembly steps, capturing a 3D scan of an item that is assembled using the part. The method further includes validating the item by comparing the 3D scan of the item with a 3D model of the item. The method further includes notifying a validity of the item.

IPC Classes

  • G05B 19/4099 - Surface or curve machining, making 3D objects, e.g. desktop manufacturing
  • B23P 19/04 - Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes for assembling or disassembling parts

35.

POINT CLOUD-DEFINED BOUNDARY

      
Application Number 18115512
Status Pending
Filing Date 2023-02-28
First Publication Date 2023-09-07
Owner FARO Technologies, Inc. (USA)
Inventor
  • Chan, John
  • Schmidt, Michael

Abstract

Examples described herein provide a method that includes receiving three-dimensional (3D) data of an object in an environment. The method further includes generating the point cloud-defined boundary around the object based at least in part on the 3D data.

IPC Classes

36.

ALIGNING SCANS OF AN ENVIRONMENT USING A REFERENCE OBJECT

      
Application Number 18109977
Status Pending
Filing Date 2023-02-15
First Publication Date 2023-08-24
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wohlfeld, Denis
  • Boehret, Tobias

Abstract

An example method includes receiving a first plurality of coordinate measurement points, captured from at least one aerial position, capturing a portion of an environment and a reference object within the environment, the first plurality of coordinate measurement points defining at least a portion of a first point cloud. The method further includes receiving a second plurality of coordinate measurement points from a position other than the at least one aerial position, the second plurality of coordinate measurement points capturing at least some of the portion of the environment and the reference object within the environment, the second plurality of coordinate measurement points defining at least a portion of a second point cloud. The method further includes aligning the first point cloud and the second point cloud based at least in part on the reference object captured in the first point cloud and the reference object captured in the second point cloud to generate a combined point cloud.
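Once corresponding points on the reference object have been identified in both clouds, the alignment itself can be computed with the standard Kabsch (SVD) rigid fit, sketched below; extracting those correspondences (`ref_a`, `ref_b`) is assumed to have been done already and is not part of this sketch.

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t with R @ src[i] + t ~= dst[i]
    (Kabsch / SVD method) from corresponding (N, 3) point sets."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

def combine(cloud_a, cloud_b, ref_a, ref_b):
    """Align cloud_b onto cloud_a using reference-object points seen in both."""
    R, t = rigid_align(ref_b, ref_a)
    return np.vstack([cloud_a, cloud_b @ R.T + t])
```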

IPC Classes

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G01C 15/00 - Surveying instruments or accessories not provided for in groups

37.

AUGMENTED REALITY ALIGNMENT AND VISUALIZATION OF A POINT CLOUD

      
Application Number 18108756
Status Pending
Filing Date 2023-02-13
First Publication Date 2023-08-17
Owner FARO Technologies, Inc. (USA)
Inventor
  • Chan, John
  • Korgel, Daniel
  • Wostal, Angelo
  • Müller, Michael
  • Haedicke, Udo

Abstract

An example method includes generating a graphical representation of a point cloud of an environment overlaid on a video stream of the environment. The method further includes receiving a first selection of a first point pair, the first point pair including a first virtual point of the point cloud and a first real point of the environment, the first real point corresponding to the first virtual point. The method further includes receiving a second selection of a second point pair, the second point pair including a second virtual point of the point cloud and a second real point of the environment, the second real point corresponding to the second virtual point. The method further includes aligning the point cloud to the environment based at least in part on the first point pair and the second point pair and updating the graphical representation based at least in part on the aligning.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

38.

SYSTEM AND METHOD OF COMBINING THREE DIMENSIONAL DATA

      
Application Number 18106842
Status Pending
Filing Date 2023-02-07
First Publication Date 2023-08-10
Owner FARO Technologies, Inc. (USA)
Inventor Bergqvist, Göran

Abstract

According to one aspect of the disclosure, a method for generating a three-dimensional model of an environment is provided. The method includes acquiring a first plurality of 3D coordinates of surfaces in the environment in a first coordinate frame of reference using a first measurement device, the first plurality of 3D coordinates including at least one subset of 3D coordinates of a target, the first measurement device optically measuring the plurality of 3D coordinates. A second plurality of 3D coordinates of the environment are acquired in a second frame of reference using a second measurement device, the second measurement device being operably disposed in a fixed relationship to the target. The second plurality of 3D coordinates is registered with the first plurality of 3D coordinates in the first coordinate frame of reference based at least in part on the at least one subset of 3D coordinates and the fixed relationship.

IPC Classes

  • G06F 30/13 - Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads

39.

SCAN COLOR RESTORATION

      
Application Number 18102864
Status Pending
Filing Date 2023-01-30
First Publication Date 2023-08-03
Owner FARO Technologies, Inc. (USA)
Inventor
  • Ul Azam, Raza
  • Pompe, Daniel
  • Grottel, Sebastian

Abstract

Techniques are described to generate a 3D scene by mapping a point cloud with a 2D image, and colorize portions of the 3D scene synthetically. An input is received to select, from the 3D scene, a portion to be colorized synthetically. The colorizing includes generating a reflectance image based on an intensity image of the point cloud. The colorizing further includes generating an occlusion mask that identifies the selected portion in the reflectance image. The colorizing further includes estimating, using a trained machine learning model, a color for each of the one or more points in the selected portion based on the reflectance image, the occlusion mask, and the 2D image. The 3D scene is updated by using the estimated colors from the trained machine learning model to colorize the selected portion.

IPC Classes

  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 7/90 - Determination of colour characteristics
  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

40.

MOBILE SYSTEM AND METHOD OF SCANNING AN ENVIRONMENT

      
Application Number 18123752
Status Pending
Filing Date 2023-03-20
First Publication Date 2023-07-20
Owner FARO Technologies, Inc. (USA)
Inventor
  • Buback, Johannes
  • Sapina, Igor
  • Becker, Julian
  • Ossig, Martin
  • Frank, Aleksej
  • Ramadneh, Ahmad
  • Zweigle, Oliver
  • Santos, João

Abstract

A system and method for measuring three-dimensional (3D) coordinate values of an environment is provided. The system includes a movable base unit, a first scanner, and a second scanner. One or more processors perform a method that includes causing the first scanner to determine a first plurality of coordinate values in a first frame of reference based at least in part on a measurement by at least one sensor. The second scanner determines a second plurality of 3D coordinate values in a second frame of reference as the base unit is moved from a first position to a second position. The determining of the first plurality of coordinate values and the second plurality of 3D coordinate values is performed simultaneously. The second plurality of 3D coordinate values is registered in a common frame of reference based on the first plurality of coordinate values.

IPC Classes  ?

  • G01C 3/02 - Measuring distances in line of sight; Optical rangefinders - Details
  • G01B 5/008 - Measuring arrangements characterised by the use of mechanical techniques for measuring coordinates of points using coordinate measuring machines
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01S 17/06 - Systems determining position data of a target
  • G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

41.

USER INTERFACE FOR THREE-DIMENSIONAL MEASUREMENT DEVICE

      
Application Number 18126644
Status Pending
Filing Date 2023-03-27
First Publication Date 2023-07-20
Owner FARO Technologies, Inc. (USA)
Inventor
  • Döring, Daniel
  • Debitsch, Rasmus
  • Pfeiffer, Rene
  • Ruhland, Axel

Abstract

A system and method for providing feedback on the quality of a 3D scan is provided. The system includes a coordinate scanner configured to optically measure and determine a plurality of three-dimensional coordinates of a plurality of locations on at least one surface in the environment, the coordinate scanner being configured to move through the environment while acquiring the plurality of three-dimensional coordinates. A display has a graphical user interface. One or more processors are provided that are configured to determine a quality attribute of the process of measuring the plurality of three-dimensional coordinates based at least in part on the movement of the coordinate scanner in the environment, and to display a graphical quality indicator on the graphical user interface based at least in part on the quality attribute, the quality indicator being a graphical element having at least one movable element.

IPC Classes  ?

  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

42.

ARTIFICIAL PANORAMA IMAGE PRODUCTION AND IN-PAINTING FOR OCCLUDED AREAS IN IMAGES

      
Application Number 17976041
Status Pending
Filing Date 2022-10-28
First Publication Date 2023-06-29
Owner FARO Technologies, Inc. (USA)
Inventor
  • Kaabi, Hani
  • Parian, Jafar Amiri

Abstract

A system includes a three-dimensional (3D) scanner, a camera with a viewpoint that is different from a viewpoint of the 3D scanner, and one or more processors coupled with the 3D scanner and the camera. The processors access a point cloud from the 3D scanner and one or more images from the camera, the point cloud comprises a plurality of 3D scan-points, a 3D scan-point represents a distance of a point in a surrounding environment from the 3D scanner, and an image comprises a plurality of pixels, a pixel represents a color of a point in the surrounding environment. The processors generate, using the point cloud and the one or more images, an artificial image that represents a portion of the surrounding environment viewed from an arbitrary position in an arbitrary direction, wherein generating the artificial image comprises colorizing each pixel in the artificial image.
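
A minimal sketch, assuming a standard pinhole camera model: colored scan points are projected into a virtual image at an arbitrary pose, keeping the nearest point per pixel. The in-painting of remaining gaps named in the title is not shown; K, R, t and the image size are illustrative inputs.

```python
# Hedged sketch (not the patented pipeline): render colored scan points into a
# virtual pinhole image at an arbitrary pose, keeping the nearest point per
# pixel with a simple z-buffer.
import numpy as np

def render_artificial_image(points, colors, K, R, t, width, height):
    """points: (N,3) world coords; colors: (N,3) RGB 0-255; K: 3x3 intrinsics."""
    cam = (R @ np.asarray(points, float).T).T + t        # world -> camera frame
    in_front = cam[:, 2] > 0
    cam, colors = cam[in_front], np.asarray(colors)[in_front]
    proj = (K @ cam.T).T
    u = np.round(proj[:, 0] / proj[:, 2]).astype(int)
    v = np.round(proj[:, 1] / proj[:, 2]).astype(int)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    image = np.zeros((height, width, 3), dtype=np.uint8)
    zbuf = np.full((height, width), np.inf)
    for ui, vi, zi, ci in zip(u[ok], v[ok], cam[ok, 2], colors[ok]):
        if zi < zbuf[vi, ui]:                             # keep the nearest point
            zbuf[vi, ui] = zi
            image[vi, ui] = ci
    return image
```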

IPC Classes  ?

  • H04N 13/257 - Colour aspects
  • H04N 13/271 - Image signal generators wherein the generated image signals comprise depth maps or disparity maps
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor

43.

OBJECT TRACKING

      
Application Number 18077791
Status Pending
Filing Date 2022-12-08
First Publication Date 2023-06-22
Owner FARO Technologies, Inc. (USA)
Inventor
  • Haedicke, Udo
  • Wohlfeld, Denis
  • Zweigle, Oliver

Abstract

Examples described herein provide a method that includes receiving point cloud data from a three-dimensional (3D) coordinate measurement device, the point cloud data corresponding at least in part to the object. The method further includes analyzing, by a processing system, the point cloud data by comparing a point of the point cloud data to a corresponding reference point from reference data to determine a distance between the point and the corresponding reference point, wherein the point and the corresponding reference point are associated with the object. The method further includes determining, by the processing system, whether a change to a location of the object occurred by comparing the distance to a distance threshold. The method further includes, responsive to determining that the change to the location of the object occurred, displaying a change indicium on a display of the processing system.
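
An illustrative sketch of the comparison step (not the patented implementation): each point associated with the object is compared against its nearest reference point, and the object is flagged as moved when the distances exceed a threshold. The threshold value and the use of a k-d tree are assumptions.

```python
# Illustrative sketch: flag an object as moved when scan points associated with
# it lie farther from their reference counterparts than a distance threshold.
import numpy as np
from scipy.spatial import cKDTree

def object_moved(object_points, reference_points, distance_threshold=0.01):
    """Return True if the median point-to-reference distance exceeds the threshold."""
    tree = cKDTree(np.asarray(reference_points, float))
    distances, _ = tree.query(np.asarray(object_points, float))
    return float(np.median(distances)) > distance_threshold
```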

IPC Classes  ?

  • G01S 17/66 - Tracking systems using electromagnetic waves other than radio waves
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 7/51 - Display arrangements
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00

44.

DENOISING POINT CLOUDS

      
Application Number 18078193
Status Pending
Filing Date 2022-12-09
First Publication Date 2023-06-15
Owner FARO Technologies, Inc. (USA)
Inventor
  • Balatzis, Georgios
  • Müller, Michael

Abstract

Examples described herein provide a method for denoising data. The method includes receiving an image pair, a disparity map associated with the image pair, and a scanned point cloud associated with the image pair. The method includes generating, using a machine learning model, a predicted point cloud based at least in part on the image pair and the disparity map. The method includes comparing the scanned point cloud to the predicted point cloud to identify noise in the scanned point cloud. The method includes generating a new point cloud without at least some of the noise based at least in part on comparing the scanned point cloud to the predicted point cloud.

IPC Classes  ?

  • G06T 5/00 - Image enhancement or restoration
  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images

45.

Laser scanner with target detection

      
Application Number 18153071
Grant Number 11947036
Status In Force
Filing Date 2023-01-11
First Publication Date 2023-06-01
Grant Date 2024-04-02
Owner FARO Technologies, Inc. (USA)
Inventor Pompe, Daniel

Abstract

A scanner that can detect types of targets in a scan area includes a processor, a housing, and a 3D scanner disposed within the housing. The processor is configured to identify locations of one or more checkerboard targets disposed in the scan area by: identifying transition locations where adjacent segments on a single scan line transition from a first color to a second color; recording these transition locations as first to second color transition locations; identifying and recording transition locations where adjacent segments on a single scan line transition from the second color to the first color as second to first color transition locations; forming a transition line through adjacent first to second color transition locations and adjacent second to first color transition locations; and identifying a location of a checkerboard target based on the transition line.
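
A simplified sketch of the scan-line step described above, assuming normalized intensity values stand in for the two colors: dark-to-bright and bright-to-dark transitions are collected separately so a transition line can later be fitted through them. The thresholds are illustrative.

```python
# Simplified sketch of the idea in the abstract: find where intensity along a
# single scan line flips between a dark and a bright segment, and record the
# two kinds of transition separately.
import numpy as np

def scanline_transitions(intensities, dark_max=0.3, bright_min=0.7):
    """Return (dark->bright indices, bright->dark indices) along one scan line."""
    x = np.asarray(intensities, float)
    is_dark = x < dark_max
    is_bright = x > bright_min
    dark_to_bright, bright_to_dark = [], []
    for i in range(len(x) - 1):
        if is_dark[i] and is_bright[i + 1]:
            dark_to_bright.append(i + 1)
        elif is_bright[i] and is_dark[i + 1]:
            bright_to_dark.append(i + 1)
    return dark_to_bright, bright_to_dark
```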

IPC Classes  ?

  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00 or G01S 17/00, of systems according to group G01S 17/00
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates

46.

MARKERLESS REGISTRATION OF IMAGE AND LASER SCAN DATA

      
Application Number 17884641
Status Pending
Filing Date 2022-08-10
First Publication Date 2023-05-18
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wolke, Matthias
  • Parian, Jafar Amiri

Abstract

A system includes a first type of measurement device that captures first 2D images, a second type of measurement device that captures 3D scans. A 3D scan includes a point cloud and a second 2D image. The system also includes processors that register the first 2D images. The method includes accessing the 3D scan that records at least a portion of the surrounding environment that is also captured by a first 2D image. Further, 2D features in the second 2D image are detected, and 3D coordinates from the point cloud are associated to the 2D features. 2D features are also detected in the first 2D image, and matching 2D features from the first 2D image and the second 2D image are identified. A position and orientation of the first 2D image is calculated in a coordinate system of the 3D scan using the matching 2D features.
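
A hedged sketch of this general 2D-to-3D registration flow, using OpenCV primitives as stand-ins (the record does not specify these): features are matched between the scanner's 2D image, for which per-pixel 3D coordinates are known, and the external photo, and the photo's pose is then solved with PnP. The array layout of scan_xyz and the intrinsics K are assumptions.

```python
# Hedged sketch: match features between the scanner's 2D image (with known 3D
# per pixel) and an external photo, then solve for the photo's pose via PnP.
import cv2
import numpy as np

def register_external_image(scan_image, scan_xyz, photo, K):
    """scan_xyz[v, u] gives the 3D point for pixel (u, v) of scan_image."""
    orb = cv2.ORB_create(4000)
    kp1, des1 = orb.detectAndCompute(scan_image, None)
    kp2, des2 = orb.detectAndCompute(photo, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    obj_pts, img_pts = [], []
    for m in matches:
        u, v = map(int, kp1[m.queryIdx].pt)
        xyz = scan_xyz[v, u]
        if np.all(np.isfinite(xyz)):          # skip pixels without valid 3D data
            obj_pts.append(xyz)
            img_pts.append(kp2[m.trainIdx].pt)
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        np.asarray(obj_pts, np.float32), np.asarray(img_pts, np.float32), K, None)
    return ok, rvec, tvec   # pose of the photo in the 3D scan's coordinate system
```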

IPC Classes  ?

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

47.

REMOVING REFLECTION FROM SCANNED DATA

      
Application Number 17903152
Status Pending
Filing Date 2022-09-06
First Publication Date 2023-05-18
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wohlfeld, Denis
  • Bhardwaj, Nithin
  • Krets, Ilia
  • Bauer, Heiko

Abstract

A system includes a three-dimensional (3D) scanner, a camera, and one or more processors coupled with the 3D scanner and the camera. The processors capture a frame that includes a 2D image and a point cloud comprising a plurality of 3D scan points. A 3D scan point represents a distance of a point in a surrounding environment from the 3D scanner. A pixel represents a color of a point in the surrounding environment. The processors identify, using a machine learning model, a subset of pixels that represents a reflective surface in the 2D image. Further, for each pixel in the subset of pixels, one or more corresponding 3D scan points are determined. An updated point cloud is created in the frame by removing the corresponding 3D scan points from the point cloud.

IPC Classes  ?

  • G06T 5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
  • G06T 7/10 - Segmentation; Edge detection
  • H04N 13/25 - Image signal generators using stereoscopic image cameras using image signals from one sensor to control the characteristics of another sensor
  • G06T 7/00 - Image analysis
  • H04N 5/232 - Devices for controlling television cameras, e.g. remote control

48.

FOUR-DIMENSIONAL DATA PLATFORM USING AUTOMATIC REGISTRATION FOR DIFFERENT DATA SOURCES

      
Application Number 17967221
Status Pending
Filing Date 2022-10-17
First Publication Date 2023-04-20
Owner FARO Technologies, Inc. (USA)
Inventor
  • Zweigle, Oliver
  • Brecht, Thorsten

Abstract

A method is provided that includes generating a four-dimensional (4D) model of an environment based on three-dimensional (3D) coordinates of the environment captured at a first point in time. The method further includes updating the 4D model based at least in part on an update to at least a subset of the 3D coordinates of the environment captured at a second point in time. The method further includes enriching the 4D model by adding supplemental information to the model.

IPC Classes  ?

  • G06T 17/00 - 3D modelling for computer graphics
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

49.

LASER PROJECTOR

      
Application Number 18077366
Status Pending
Filing Date 2022-12-08
First Publication Date 2023-04-06
Owner FARO Technologies, Inc. (USA)
Inventor
  • Savikovsky, Arkady
  • Stave, Joel H.
  • Zangrilli, Daniel
  • Veksland, Michael
  • Korrapati, Venkat Pranav

Abstract

A laser projector steers a pulsed laser beam to form a pattern of stationary dots on an object, the pulsed laser beam having a periodicity determined based at least in part on a maximum allowable spacing of the dots and on a maximum angular velocity at which the beam can be steered, wherein a pulse width of the laser beam and a pulse peak power of the laser beam are based at least in part on the determined periodicity and on laser safety requirements.

IPC Classes  ?

  • H01S 3/00 - Lasers, i.e. devices using stimulated emission of electromagnetic radiation in the infrared, visible or ultraviolet wave range
  • H04N 9/31 - Projection devices for colour picture display
  • H01S 3/11 - Mode locking; Q-switching; Other giant-pulse techniques, e.g. cavity dumping

50.

THREE-DIMENSIONAL MEASUREMENT DEVICE

      
Application Number 17859218
Status Pending
Filing Date 2022-07-07
First Publication Date 2023-04-06
Owner FARO Technologies, Inc. (USA)
Inventor Hillebrand, Gerrit

Abstract

A method includes capturing a frame including a 3D point cloud and a 2D image. A key point is detected in the 2D image, the key point is a candidate to be used as a feature. A 3D patch of a predetermined dimension is created that includes points surrounding a 3D position of the key point. The 3D position and the points of the 3D patch are determined from the 3D point cloud. Based on a determination that the points in the 3D patch are on a single plane based on the corresponding 3D coordinates, a descriptor for the 3D patch is computed. The frame is registered with a second frame by matching the descriptor for the 3D patch with a second descriptor associated with a second 3D patch from the second frame. The 3D point cloud is aligned with multiple 3D point clouds based on the registered frame.
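
A minimal sketch of the planarity test mentioned above: the patch is treated as lying on a single plane when its smallest principal direction carries a negligible share of the spread. The ratio threshold is illustrative.

```python
# Minimal sketch: a 3D patch is treated as planar when its smallest principal
# component is negligible relative to the largest one.
import numpy as np

def patch_is_planar(patch_points, max_ratio=0.01):
    """patch_points: (N,3) points around a key point; True if they lie on one plane."""
    pts = np.asarray(patch_points, float)
    centered = pts - pts.mean(axis=0)
    # Singular values are proportional to the spread along the principal axes.
    s = np.linalg.svd(centered, compute_uv=False)
    return s[2] / max(s[0], 1e-12) < max_ratio
```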

IPC Classes  ?

  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G06T 17/05 - Geographic models

51.

CONSTRUCTION SITE DIGITAL FIELD BOOK FOR THREE-DIMENSIONAL SCANNERS

      
Application Number 17813629
Status Pending
Filing Date 2022-07-20
First Publication Date 2023-02-16
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wohlfeld, Denis
  • Bauer, Heiko
  • Böhret, Tobias

Abstract

A method, system, and computer product that track scanning data acquired by a three-dimensional (3D) coordinate scanner is provided. The method includes storing a digital representation of an environment in memory of a mobile computing device. A first scan is performed with the 3D coordinate scanner in an area of the environment. A location of the first scan is determined on the digital representation. The first scan is registered with the digital representation. The location of the 3D coordinate scanner is indicated on the digital representation at the time of the first scan.

IPC Classes  ?

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • H04L 67/1095 - Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

52.

VIRTUAL REALITY SYSTEM FOR VIEWING POINT CLOUD VOLUMES WHILE MAINTAINING A HIGH POINT CLOUD GRAPHICAL RESOLUTION

      
Application Number 17967236
Status Pending
Filing Date 2022-10-17
First Publication Date 2023-02-09
Owner FARO Technologies, Inc. (USA)
Inventor
  • Caputo, Manuel
  • Bergmann, Louis

Abstract

A virtual reality (VR) system that includes a three-dimensional (3D) point cloud having a plurality of points, a VR viewer having a current position, a graphics processing unit (GPU), and a central processing unit (CPU). The CPU determines a field-of-view (FOV) based at least in part on the current position of the VR viewer, selects, using occlusion culling, a subset of the points based at least in part on the FOV, and provides them to the GPU. The GPU receives the subset of the plurality of points from the CPU and renders an image for display on the VR viewer based at least in part on the received subset of the plurality of points. The selecting a subset of the plurality of points is at a first frame per second (FPS) rate and the rendering is at a second FPS rate that is faster than the first FPS rate.
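
As a simplified stand-in for the CPU-side selection step (true occlusion culling also removes points hidden behind other geometry), the sketch below keeps only points inside a view cone around the viewer's direction; the GPU would then render this smaller subset at its own, faster frame rate. All names and the field-of-view value are assumptions.

```python
# Simplified field-of-view culling sketch: select the subset of points the GPU
# should render, based on the viewer's current position and view direction.
import numpy as np

def select_points_in_fov(points, viewer_pos, view_dir, fov_deg=110.0):
    """Return the points within a cone of half-angle fov/2 around view_dir."""
    pts = np.asarray(points, float)
    d = pts - np.asarray(viewer_pos, float)
    d /= np.linalg.norm(d, axis=1, keepdims=True) + 1e-12
    v = np.asarray(view_dir, float)
    v /= np.linalg.norm(v)
    cos_half = np.cos(np.radians(fov_deg) / 2.0)
    return pts[d @ v > cos_half]
```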

IPC Classes  ?

53.

DEFECT DETECTION IN A POINT CLOUD

      
Application Number 17874970
Status Pending
Filing Date 2022-07-27
First Publication Date 2023-02-09
Owner FARO Technologies, Inc. (USA)
Inventor
  • Balatzis, Georgios
  • Sharma, Tharesh

Abstract

Examples described herein provide a method that includes performing a first scan of an object to generate first scan data. The method further includes detecting a defect on a surface of the object by analyzing the first scan data to identify a region of interest containing the defect by comparing the first scan data to reference scan data. The method further includes performing a second scan of the region of interest containing the defect to generate second scan data, the second scan data being higher resolution scan data than the first scan data. The method further includes combining the first scan data and the second scan data to generate a point cloud of the object.

IPC Classes  ?

  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G01S 17/48 - Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves

54.

A SYSTEM AND METHOD OF GENERATING A FLOORPLAN

      
Application Number 17850084
Status Pending
Filing Date 2022-06-27
First Publication Date 2023-01-05
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brenner, Mark
  • Frank, Aleksej
  • Zweigle, Oliver

Abstract

A system and method of generating a two-dimensional (2D) image of an environment is provided. The system includes a scanner having a first light source, an image sensor, a second light source, and a controller, the second light source emitting a visible light, the controller determining a distance to points based on a beam of light emitted by the first light source and the receiving of the reflected beam of light from the points. Processors operably coupled to the scanner execute a method comprising: generating a map of the environment; emitting light from the second light source towards an edge defined by at least a pair of surfaces; detecting the edge based on emitting a second beam of light and receiving the reflected second beam of light; and defining a room on the map based on the detecting of the corner or the edge.

IPC Classes  ?

  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups

55.

CAPTURING ENVIRONMENTAL SCANS USING AUTOMATED TRANSPORTER ROBOT

      
Application Number 17702904
Status Pending
Filing Date 2022-03-24
First Publication Date 2022-12-29
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brenner, Mark
  • Frank, Aleksej
  • Ramadneh, Ahmad
  • Waheed, Mufassar
  • Zweigle, Oliver

Abstract

A system includes a transporter robot with a motion controller that changes the transporter robot's poses during transportation. A scanning device is fixed to the transporter robot. One or more processors are coupled to the transporter robot and the scanning device to generate a map of the surrounding environment. At a timepoint T1, when the transporter robot is stationary at a first location, a first pose of the transporter robot is captured. During transporting the scanning device, at a timepoint T2, the scanning device captures additional scan-data of a portion of the surrounding environment. In response, the motion controller provides a second pose of the transporter robot at T2. A compensation vector and a rotation for the scan-data are determined based on a difference between the first pose and the second pose. A revised scan-data is computed, and the revised scan-data is registered to generate the map.
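
A hedged sketch of the compensation step, assuming both poses are expressed as 4x4 homogeneous transforms in the map frame: the scan captured at the second pose is re-expressed relative to the reference pose before registration.

```python
# Hedged sketch: correct scan data for the motion between the reference pose T1
# (captured while stationary) and the pose T2 reported at capture time.
import numpy as np

def compensate_scan(scan_points, T1, T2):
    """Re-express points captured at pose T2 as if taken from reference pose T1."""
    delta = np.linalg.inv(T1) @ T2        # motion between the two poses
    pts = np.asarray(scan_points, float)
    return (delta[:3, :3] @ pts.T).T + delta[:3, 3]
```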

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

56.

TRACKING WITH REFERENCE TO A WORLD COORDINATE SYSTEM

      
Application Number 17845482
Status Pending
Filing Date 2022-06-21
First Publication Date 2022-12-29
Owner FARO Technologies, Inc. (USA)
Inventor
  • Parian, Jafar Amiri
  • Bridges, Robert E.

Abstract

Examples described herein provide a method that includes capturing data about an environment. The method further includes generating a database of two-dimensional (2D) features and associated three-dimensional (3D) coordinates based at least in part on the data about the environment. The method further includes determining a position (x, y, z) and an orientation (pitch, roll, yaw) of a device within the environment based at least in part on the database of 2D features and associated 3D coordinates. The method further includes causing the device to display, on a display of the device, an augmented reality element at a predetermined location based at least in part on the position and the orientation of the device.

IPC Classes  ?

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
  • G06V 10/22 - Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G01C 11/06 - Interpretation of pictures by comparison of two or more pictures of the same area
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

57.

TARGETLESS TRACKING OF MEASUREMENT DEVICE DURING CAPTURE OF SURROUNDING DATA

      
Application Number 17702982
Status Pending
Filing Date 2022-03-24
First Publication Date 2022-12-22
Owner FARO Technologies, Inc. (USA)
Inventor
  • Lombardi, Marco
  • Bonarrigo, Francesco
  • Riccardi, Andrea
  • Barone, Federico

Abstract

Technical solutions are described to track a handheld three-dimensional (3D) scanner in an environment using natural features in the environment. In one or more examples, the natural features are detected using machine learning. Features are filtered by performing a stereo matching between respective pairs of stereo images captured by the scanner. The features are further filtered using time matching between images captured by the scanner at different timepoints.

IPC Classes  ?

  • G06T 15/20 - Perspective computation
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06V 10/10 - Image acquisition

58.

AUTOMATED UPDATE OF GEOMETRICAL DIGITAL REPRESENTATION

      
Application Number 17750551
Status Pending
Filing Date 2022-05-23
First Publication Date 2022-12-08
Owner FARO Technologies, Inc. (USA)
Inventor
  • Bauer, Heiko
  • Boehret, Tobias
  • Wohlfeld, Denis

Abstract

A method for updating a digital representation of an environment includes capturing an image of a portion of the environment using a change-detection device. Further, a corresponding digital data is determined that represents the portion in the digital representation of the environment. A change in the portion is detected by comparing the image with the corresponding digital data. In response to the change being above a predetermined threshold, the method includes initiating a resource-intensive scan of the portion using a scanning device, and updating the digital representation of the environment by replacing the corresponding digital data representing the portion with the resource-intensive scan.
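
For illustration only: the change test can be as simple as comparing the quick capture against an image derived from the stored digital representation and triggering the resource-intensive scan when the mean difference exceeds a threshold. The threshold and image format are assumptions.

```python
# Illustrative sketch: decide whether a portion of the environment needs a full
# rescan by differencing a quick image against the stored representation.
import numpy as np

def needs_rescan(new_image, reference_image, change_threshold=0.1):
    """Images as float arrays in [0, 1]; True when the mean change is large."""
    diff = np.abs(np.asarray(new_image, float) - np.asarray(reference_image, float))
    return float(diff.mean()) > change_threshold
```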

IPC Classes  ?

  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

59.

Projector with three-dimensional measurement device

      
Application Number 17818812
Grant Number 11852777
Status In Force
Filing Date 2022-08-10
First Publication Date 2022-12-01
Grant Date 2023-12-26
Owner FARO Technologies, Inc. (USA)
Inventor
  • Trollmann, Jens
  • Mueller, Stefan

Abstract

A device and method for projecting a light pattern is provided. The device includes a processor system and a housing. The housing is rotatable about a first axis. A measurement device is operably coupled to the housing that measures a distance to a surface in an environment. A light projector is operably coupled to the housing, the light projector having a light source and a pair of movable mirrors, the light source positioned to emit light onto the pair of movable mirrors. The processor system is responsive to computer instructions for: determining 3D coordinates of points on the surface with the 3D measurement device; selecting a pattern; adjusting the pattern based at least in part on the 3D coordinates; and causing the light projector to emit a beam of light and moving the pair of mirrors to generate the adjusted pattern on the surface.

IPC Classes  ?

  • G01V 8/26 - Detecting, e.g. by using light barriers using multiple transmitters or receivers using mechanical scanning systems
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • G01B 11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width, or thickness
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • H04N 1/04 - Scanning arrangements
  • G01S 7/4865 - Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak

60.

GENERATING ENVIRONMENTAL MAP BY ALIGNING CAPTURED SCANS

      
Application Number 17737250
Status Pending
Filing Date 2022-05-05
First Publication Date 2022-11-17
Owner FARO Technologies, Inc. (USA)
Inventor
  • Zweigle, Oliver
  • Brenner, Mark
  • Frank, Aleksej
  • Ramadneh, Ahmad

Abstract

A method for performing a simultaneous location and mapping of a scanner device in a surrounding environment includes capturing scan-data of a portion of a map of the surrounding environment. The scan-data comprises a point cloud. Further, at runtime, a user-interface is used to make a selection of a feature from the scan-data and a selection of a submap that was previously captured. The submap includes the same feature. The method further includes determining a first scan position as a present position of the scanner device, and determining a second scan position as a position of the scanner device. The method further includes determining a displacement vector for the map based on the first and the second scan positions. Further, a revised first scan position is computed based on the second scan position and the displacement vector. The scan-data is registered using the revised first scan position.

IPC Classes  ?

  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 7/484 - Transmitters
  • G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation

61.

SURFACE DETERMINATION USING THREE-DIMENSIONAL VOXEL DATA

      
Application Number 17723676
Status Pending
Filing Date 2022-04-19
First Publication Date 2022-11-10
Owner FARO Technologies, Inc. (USA)
Inventor
  • Stiebeiner, Ariane
  • Balatzis, Georgios
  • Xhohaj, Festim
  • Klopp-Tosser, Antonin

Abstract

Examples described herein provide a method that includes obtaining, by a processing device, three-dimensional (3D) voxel data. The method further includes performing, by the processing device, gray value thresholding based at least in part on the 3D voxel data and assigning a classification value to at least one voxel of the 3D voxel data. The method further includes defining, by the processing device, segments based on the classification value. The method further includes filtering, by the processing device, the segments based on the classification value. The method further includes evaluating, by the processing device, the segments to identify a surface voxel per segment. The method further includes determining, by the processing device, a position of a surface point within the surface voxel.
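
A minimal sketch of the first two steps (thresholding and segment definition), assuming the voxel volume is a 3D numpy array of gray values; connected-component labeling stands in for the segment definition, and the threshold is illustrative.

```python
# Minimal sketch: threshold the gray values of a voxel volume into a
# material/air classification and split the material voxels into connected
# segments.
import numpy as np
from scipy import ndimage

def classify_and_segment(voxels, gray_threshold=0.5):
    """Return (classification mask, labeled segment array, number of segments)."""
    classification = np.asarray(voxels, float) >= gray_threshold  # True = material
    labels, num_segments = ndimage.label(classification)
    return classification, labels, num_segments
```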

IPC Classes  ?

  • G06T 15/08 - Volume rendering
  • G06T 15/50 - Lighting effects
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G01N 23/046 - Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00, by transmitting the radiation through the material and forming images of the material using tomography, e.g. computed tomography [CT]
  • G01N 23/083 - Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00, by transmitting the radiation through the material and measuring the absorption, the radiation being X-rays

62.

Compensation of three-dimensional measuring instrument having an autofocus camera

      
Application Number 17813630
Grant Number 11763491
Status In Force
Filing Date 2022-07-20
First Publication Date 2022-11-10
Grant Date 2023-09-19
Owner FARO Technologies, Inc. (USA)
Inventor
  • Ossig, Martin
  • Buback, Johannes

Abstract

A 3D measuring instrument includes a registration camera and a surface measuring system having a projector and autofocus camera. In a first pose, the registration camera captures a first registration image of first registration points. The autofocus camera captures a first surface image of first light projected onto the object by the projector and determines first 3D coordinates of points on the object. In a second pose, the registration camera captures a second registration image of second registration points. The autofocus camera adjusts the autofocus mechanism based at least in part on adjusting a focal length to reduce a difference between positions of the first and second registration points. A second surface image of second light is captured. A compensation parameter is determined based at least in part on the first registration image, the second registration image, the first 3D coordinates, the second surface image, and the projected second light.

IPC Classes  ?

  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/571 - Depth or shape recovery from multiple images from focus
  • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
  • H04N 23/67 - Focus control based on electronic image sensor signals

63.

OCCLUSION DETECTION FOR LASER SCAN-POINT COLORING

      
Application Number 17678121
Status Pending
Filing Date 2022-02-23
First Publication Date 2022-11-03
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor
  • Parian, Jafar Amiri
  • Kaabi, Hani
  • Buback, Johannes

Abstract

A system includes a three-dimensional (3D) scanner that captures a 3D point cloud with multiple scan-points corresponding to one or more objects scanned in a surrounding environment. The system further includes a camera that captures an image of the surrounding environment. The system further includes one or more processors that colorize the scan-points in the 3D point cloud using the image. Colorizing a scan-point includes determining, for the scan-point, a corresponding pixel in the image by back-projecting the scan-point to the camera. Colorizing the scan-point includes assigning, to the scan-point, a color-value based on the corresponding pixel. Colorizing the scan-point includes computing, for the scan-point, a distance of the scan-point from the camera. Colorizing the scan-point includes determining, based on the distance, that the scan-point is occluded from only one of the camera and the 3D scanner, and in response, updating the color-value assigned to the scan-point.

IPC Classes  ?

  • G06T 7/00 - Image analysis
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

64.

Calibrating system for colorizing point-clouds

      
Application Number 17678116
Grant Number 11790557
Status In Force
Filing Date 2022-02-23
First Publication Date 2022-11-03
Grant Date 2023-10-17
Owner FARO Technologies, Inc. (USA)
Inventor
  • Parian, Jafar Amiri
  • Ossig, Martin
  • Kaabi, Hani

Abstract

A system includes a three-dimensional (3D) scanner that captures a 3D point cloud corresponding to one or more objects in a surrounding environment. The system further includes a camera that captures a control image by capturing a plurality of images of the surrounding environment, and an auxiliary camera configured to capture an ultrawide-angle image of the surrounding environment. One or more processors of the system colorize the 3D point cloud using the ultrawide-angle image by mapping the ultrawide-angle image to the 3D point cloud. The system performs a limited system calibration before colorizing each 3D point cloud, and a periodic full system calibration before/after a plurality of 3D point clouds are colorized.

IPC Classes  ?

  • G06T 7/90 - Determination of colour characteristics
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

65.

HYBRID FEATURE MATCHING BETWEEN INTENSITY IMAGE AND COLOR IMAGE

      
Application Number 17678119
Status Pending
Filing Date 2022-02-23
First Publication Date 2022-11-03
Owner FARO Technologies, Inc. (USA)
Inventor Parian, Jafar Amiri

Abstract

A point cloud is colorized by mapping a color image using an intensity image. The mapping includes detecting multiple features from the intensity image using a feature-extraction algorithm. A feature is extracted that is not within a predetermined vicinity of an edge in the intensity image. A template is created by selecting a portion of a predetermined size from the intensity image with the feature at the center. A search window is created with the same size as the template by selecting a portion of a luminance image as a search space. The luminance image is obtained from the color image. A cost value is computed for each pixel of the search space by comparing image gradients of the template and the search window. A matching point is determined in the color image corresponding to the feature based on the cost value for each pixel of search space.
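
A hedged sketch of gradient-based matching in the spirit of the abstract: the intensity-image template is compared, via its image gradients, against windows of the luminance search space, and the lowest-cost position is taken as the match. Unlike the abstract, which fixes the search window to the template size, this sketch slides the window over a small search region; names are illustrative.

```python
# Hedged sketch: score candidate positions in the luminance search space by the
# difference of image gradients against the intensity-image template.
import numpy as np

def match_template_by_gradients(template, search_space):
    """Return (row, col) in search_space where the gradient cost is lowest."""
    tgy, tgx = np.gradient(np.asarray(template, float))
    s = np.asarray(search_space, float)
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(s.shape[0] - th + 1):
        for c in range(s.shape[1] - tw + 1):
            wgy, wgx = np.gradient(s[r:r + th, c:c + tw])
            cost = np.abs(wgy - tgy).sum() + np.abs(wgx - tgx).sum()
            if cost < best:
                best, best_pos = cost, (r, c)
    return best_pos
```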

IPC Classes  ?

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

66.

Automatic selection of a region in a three-dimensional (3D) point cloud

      
Application Number 17584613
Grant Number 11954798
Status In Force
Filing Date 2022-01-26
First Publication Date 2022-10-27
Grant Date 2024-04-09
Owner FARO Technologies, Inc. (USA)
Inventor Fournet, Romain

Abstract

Automatic selection of a region in a 3D point cloud is provided. Neighbor points are determined for a given seed point of a set of seed points. Responsive to a color difference of a given neighbor point from the given seed point being less than a neighbor color distance threshold, and responsive to an angle between a normal of the given neighbor point and a normal of the given seed point being less than a neighbor normal angle threshold, the given neighbor point is added to the region in the 3D point cloud. Responsive to a curvature at the given neighbor point being less than a curvature threshold, responsive to the color difference of the given neighbor point from the initial seed point being less than an initial seed color distance threshold, and responsive to an angle between the normal of the given neighbor point and a normal of the initial seed point being less than an initial seed normal angle, the given neighbor point is added to the seed points for processing.
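
An illustrative region-growing sketch (simplified relative to the claims, which also use curvature and separate seed/neighbor thresholds): a neighbor point joins the region when its color and normal are close enough to those of the current seed. The radius, thresholds, and k-d tree neighbor lookup are assumptions.

```python
# Simplified region-growing sketch: grow a region from a seed point using color
# and normal similarity thresholds.
import numpy as np
from scipy.spatial import cKDTree

def grow_region(points, colors, normals, seed_idx,
                radius=0.02, color_thresh=20.0, angle_thresh_deg=10.0):
    points = np.asarray(points, float)
    colors = np.asarray(colors, float)
    normals = np.asarray(normals, float)      # assumed unit-length normals
    tree = cKDTree(points)
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    region, seeds = {seed_idx}, [seed_idx]
    while seeds:
        s = seeds.pop()
        for n in tree.query_ball_point(points[s], radius):
            if n in region:
                continue
            color_ok = np.linalg.norm(colors[n] - colors[s]) < color_thresh
            normal_ok = abs(np.dot(normals[n], normals[s])) > cos_thresh
            if color_ok and normal_ok:
                region.add(n)
                seeds.append(n)               # grown point becomes a new seed
    return sorted(region)
```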

IPC Classes  ?

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 7/60 - Analysis of geometric attributes
  • G06T 7/90 - Determination of colour characteristics

67.

Laser projector system

      
Application Number 17358511
Grant Number 11921409
Status In Force
Filing Date 2021-06-25
First Publication Date 2022-10-13
Grant Date 2024-03-05
Owner FARO Technologies, Inc. (USA)
Inventor
  • Isabelle, Maxime
  • Armstrong, Matthew T.
  • Diangelus, Salvatore
  • Martinez, Leonardo
  • Stave, Joel H.

Abstract

A light projector and method of aligning the light projector is provided. A light projector steers an outgoing beam of light onto an object, passing light returned from the object through a focusing lens onto an optical detector. The light projector may generate a light pattern or template by rapidly moving the outgoing beam of light along a path on a surface. To place the light pattern/template in a desired location, the light projector may be aligned with an electronic model.

IPC Classes  ?

  • G03B 21/20 - Lamp housings
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object

68.

SPECKLE REDUCTION METHODS IN LINE SCANNERS

      
Application Number 17808736
Status Pending
Filing Date 2022-06-24
First Publication Date 2022-10-06
Owner FARO Technologies, Inc. (USA)
Inventor Dhasmana, Nitesh

Abstract

A system includes a first light source that emits a beam of light; an electrical modulator that imparts a time-varying modulation on the beam of light; a beam-shaping system that shapes the beam of light and projects the shaped beam of light onto an object; an image sensor that captures the beam of light reflected from the object; and processors that determine three-dimensional (3D) coordinates of points on the object.

IPC Classes  ?

  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • H04N 5/225 - Television cameras
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for

69.

HANDHELD SCANNER THAT TRANSFERS DATA AND POWER OVER ETHERNET

      
Application Number 17808734
Status Pending
Filing Date 2022-06-24
First Publication Date 2022-10-06
Owner FARO Technologies, Inc. (USA)
Inventor
  • Schoenfeldt, William E.
  • Kovalski, Fabiano

Abstract

A system includes a handheld unit having a light source, an image sensor, one or more first processors, an Ethernet cable, and a frame. The light source projects light onto an object, and the image sensor captures an image of light reflected from the object. The one or more first processors are directly coupled to the frame. An accessory device has one or more second processors that receive data extracted from the captured image over the Ethernet cable and, in response, determine three-dimensional (3D) coordinates of points on the object. The accessory device also sends electrical power over the Ethernet cable to the handheld unit.

IPC Classes  ?

  • H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • H04L 12/10 - Current supply arrangements

70.

LINE SCANNER HAVING INTEGRATED PROCESSING CAPABILITY

      
Application Number 17808735
Status Pending
Filing Date 2022-06-24
First Publication Date 2022-10-06
Owner FARO Technologies, Inc. (USA)
Inventor
  • Schoenfeldt, William E.
  • Kovalski, Fabiano
  • Barba, Jacint R.
  • Atwell, Paul C.
  • Bonarrigo, Francesco

Abstract

A system includes a first light source that projects lines of light onto an object, a second light source that illuminates markers on or near the object, one or more image sensors that receive first reflected light from the projected lines of light and second reflected light from the illuminated markers, one or more processors that determine the locations of the lines of light on the image sensors based on the first reflected light and that determine the locations of the markers on the image sensors based on the second reflected light, and a frame physically coupled to the first light source, the second light source, the one or more image sensors, and the one or more processors.

IPC Classes  ?

  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • G06T 7/70 - Determining position or orientation of objects or cameras

71.

THREE-DIMENSIONAL POINT CLOUD GENERATION USING MACHINE LEARNING

      
Application Number 17695352
Status Pending
Filing Date 2022-03-15
First Publication Date 2022-09-29
Owner FARO Technologies, Inc. (USA)
Inventor
  • Balatzis, Georgios
  • Bonarrigo, Francesco
  • Riccardi, Andrea

Abstract

An example method for training a machine learning model is provided. The method includes receiving training data collected by a three-dimensional (3D) imager, the training data comprising a plurality of training sets. The method further includes generating, using the training data, a machine learning model from which a disparity map can be inferred from a pair of images that capture a scene where a light pattern is projected onto an object.

IPC Classes  ?

  • H04N 13/271 - Image signal generators wherein the generated image signals comprise depth maps or disparity maps
  • H04N 13/254 - Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • H04N 5/247 - Arrangement of television cameras
  • H04N 13/246 - Calibration of cameras
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G03B 35/12 - Stereoscopic photography by simultaneous recording involving recording of different viewpoint images in different colours on a colour film

72.

UPSCALING TRIANGULATION SCANNER IMAGES TO REDUCE NOISE

      
Application Number 17583607
Status Pending
Filing Date 2022-01-25
First Publication Date 2022-08-11
Owner FARO Technologies, Inc. (USA)
Inventor
  • Müller, Michael
  • Balatzis, Georgios

Abstract

Examples described herein provide a method that includes performing, by a processing device, using a neural network, pattern recognition on an image to recognize a feature in the image. The method further includes performing, by the processing device, upscaling of the image to increase a resolution of the image while maintaining the feature to generate an upscaled image.

IPC Classes  ?

  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 10/75 - Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06N 3/08 - Learning methods

73.

Handheld three-dimensional coordinate measuring device operatively coupled to a mobile computing device

      
Application Number 17717683
Grant Number 11725928
Status In Force
Filing Date 2022-04-11
First Publication Date 2022-07-28
Grant Date 2023-08-15
Owner FARO Technologies, Inc. (USA)
Inventor
  • Döring, Daniel
  • Heidemann, Rolf
  • Ossig, Martin
  • Hillebrand, Gerrit

Abstract

A handheld device has a projector that projects a pattern of light onto an object, a first camera that captures the projected pattern of light in first images, a second camera that captures the projected pattern of light in second images, a registration camera that captures a succession of third images, one or more processors that determines three-dimensional (3D) coordinates of points on the object based at least in part on the projected pattern, the first images, and the second images, the one or more processors being further operable to register the determined 3D coordinates based at least in part on common features extracted from the succession of third images, and a mobile computing device operably connected to the handheld device and cooperating with the one or more processors, the mobile computing device operable to display the registered 3D coordinates of points.

IPC Classes  ?

  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01K 3/00 - Thermometers giving results other than momentary value of temperature
  • G01K 13/00 - Thermometers specially adapted for specific purposes
  • H10N 10/13 - Thermoelectric devices comprising a junction of dissimilar materials, i.e. devices exhibiting Seebeck or Peltier effects operating with only the Peltier or Seebeck effects characterised by the heat-exchanging means at the junction
  • H10N 10/80 - Constructional details

74.

AUTOMATIC REGISTRATION OF MULTIPLE MEASUREMENT DEVICES

      
Application Number 17511648
Status Pending
Filing Date 2021-10-27
First Publication Date 2022-06-30
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor Parian, Jafar Amiri

Abstract

A computer-implemented method is performed by one or more processors to automatically register a plurality of captured data obtained using a respective measurement device, each of the captured data is obtained separately. The computer-implemented method includes accessing a first captured data of a portion of an environment, and a first image corresponding to said portion of the environment captured from a known relative position and angle with respect to the first captured data. Further, from the plurality of captured data, a second captured data is identified that has at least a partial overlap with said portion, the second captured data is identified based on a corresponding second image. The second image is captured from a known relative position and angle with respect to the second captured data. The method further includes transforming the second captured data and/or the first captured data to a coordinate system.

IPC Classes  ?

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06K 9/32 - Aligning or centering of the image pick-up or image-field

75.

Handheld scanner for measuring three-dimensional coordinates

      
Application Number 17556120
Grant Number 11930155
Status In Force
Filing Date 2021-12-20
First Publication Date 2022-06-23
Grant Date 2024-03-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Atwell, Paul C.
  • Mogensen, Matthew
  • Dhasmana, Nitesh
  • Reihl, Christopher M

Abstract

A 3D measuring system includes a first projector that projects a first line onto an object at a first wavelength, a second projector that projects a second line onto the object at a second wavelength, a first illuminator that emits a third light onto some markers, a second illuminator that emits a fourth light onto some markers, a first camera having a first lens and a first image sensor, a second camera having a second lens and a second image sensor, the first lens operable to pass the first wavelength, block the second wavelength, and pass the third light to a first image sensor, the second lens operable to pass the second wavelength, block the first wavelength, and pass the fourth light. The system further includes one or more processors operable to determine 3D coordinates based on images captured by the first image sensor and the second image sensor.

IPC Classes  ?

  • H04N 13/254 - Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance

76.

THREE-DIMENSIONAL SCANNER WITH EVENT CAMERA

      
Application Number 17645375
Status Pending
Filing Date 2021-12-21
First Publication Date 2022-06-23
Owner FARO Technologies, Inc. (USA)
Inventor
  • Heidemann, Rolf
  • Wolke, Matthias

Abstract

According to one aspect of the disclosure, a three-dimensional coordinate scanner is provided. The scanner includes a projector configured to emit a pattern of light; a sensor arranged in a fixed predetermined relationship to the projector, the sensor having a photosensitive array comprised of a plurality of event-based pixels, each of the event-based pixels being configured to transmit a signal in response to a change in irradiance exceeding a threshold. One or more processors are electrically coupled to the projector and the sensor, the one or more processors being configured to modulate the pattern of light and determine a three-dimensional coordinate of a surface based at least in part on the pattern of light and the signal.

IPC Classes  ?

  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01B 11/245 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object

77.

Tracking data acquired by coordinate measurement devices through a workflow

      
Application Number 17687791
Grant Number 11755784
Status In Force
Filing Date 2022-03-07
First Publication Date 2022-06-23
Grant Date 2023-09-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Ossig, Martin
  • Horvath, Oswin
  • Flohr, Daniel

Abstract

A method that includes providing a database for storing meta-data that describes steps in a workflow and an order of the steps in the workflow. The meta-data includes, for each of the steps: a reference to an input data file for the step; a description of a transaction performed at the step; and a reference to an output data file generated by the step based at least in part on applying the transaction to the input data file. Data that includes meta-data for a step in the workflow is received and the data is stored in the database. A trace of the workflow is generated based at least in part on contents of the database. The generating is based on receiving a request from a requestor for the trace of the workflow. At least a subset of the trace is output to the requestor.
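
A minimal sketch, assuming a simple in-memory store rather than a real database: each step's meta-data records the input file, the transaction, and the output file, and the trace is the ordered list of those records.

```python
# Minimal sketch of the workflow meta-data store and trace generation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class StepMetadata:
    step: int            # position of the step in the workflow order
    input_file: str      # reference to the input data file
    transaction: str     # description of the transaction performed
    output_file: str     # reference to the generated output data file

@dataclass
class WorkflowDatabase:
    records: List[StepMetadata] = field(default_factory=list)

    def add_step(self, record: StepMetadata) -> None:
        self.records.append(record)

    def trace(self) -> List[StepMetadata]:
        """Return the workflow trace in step order."""
        return sorted(self.records, key=lambda r: r.step)
```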

IPC Classes  ?

  • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
  • G06F 21/64 - Protecting data integrity, e.g. using checksums, certificates or signatures
  • G06F 21/60 - Protecting data
  • G06Q 10/083 - Shipping
  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • H04L 9/00 - Arrangements for secret or secure communications; Network security protocols

78.

Line scanner having target-tracking and geometry-tracking modes

      
Application Number 17556083
Grant Number 11908162
Status In Force
Filing Date 2021-12-20
First Publication Date 2022-06-23
Grant Date 2024-02-20
Owner FARO Technologies, Inc. (USA)
Inventor
  • Bonarrigo, Francesco
  • Atwell, Paul C.
  • Creachbaum, John Lucas
  • Dhasmana, Nitesh
  • Kovalski, Fabiano
  • Riccardi, Andrea
  • Schoenfeldt, William E.
  • Torsello, Marco
  • Wilson, Christopher Michael

Abstract

A handheld three-dimensional (3D) measuring system operates in a target mode and a geometry mode. In the target mode, a target-mode projector projects a first line of light onto an object, and a first illuminator sends light to markers on or near the object. A first camera captures an image of the first line of light and the illuminated markers. In the geometry mode, a geometry-mode projector projects onto the object a first multiplicity of lines, which are captured by the first camera and a second camera. One or more processors determines 3D coordinates in the target mode and the geometry mode.

IPC Classes  ?

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G01C 11/02 - Picture-taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
  • H04N 23/55 - Optical parts specially adapted for electronic image sensors; Mounting thereof
  • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
  • H04N 23/80 - Camera processing pipelines; Components thereof
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance

79.

CLOUD-TO-CLOUD COMPARISON USING ARTIFICIAL INTELLIGENCE-BASED ANALYSIS

      
Application Number 17533753
Status Pending
Filing Date 2021-11-23
First Publication Date 2022-06-09
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wolke, Matthias
  • Balatzis, Georgios

Abstract

Examples described herein provide a method that includes aligning, by a processing device, a measurement point cloud for an object with reference data for the object. The method further includes comparing, by the processing device, the measurement point cloud to the reference data to determine a displacement value between each point in the measurement point cloud and a corresponding point in the reference data. The method further includes generating, by the processing device, a deviation histogram of the displacement values between each point in the measurement point cloud and the corresponding point in the reference data. The method further includes identifying, by the processing device, a region of interest of the deviation histogram. The method further includes determining, by the processing device, whether a deviation associated with the object exists based at least in part on the region of interest.
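
For illustration, the following Python sketch builds such a deviation histogram from nearest-neighbour displacements between a synthetic measurement cloud and reference data, then flags a tail region of interest. The data, bin count, and percentile threshold are invented; this is not FARO's implementation.

```python
# Illustrative sketch: nearest-neighbour displacements between a measurement
# cloud and reference data, binned into a deviation histogram whose tail is
# treated as the region of interest. Data and thresholds are synthetic.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
reference = rng.uniform(0, 1, size=(5000, 3))             # reference point cloud
measured = reference + rng.normal(0, 0.002, reference.shape)
measured[:50] += 0.05                                      # simulated local deviation

# Displacement of each measured point to its closest reference point.
distances, _ = cKDTree(reference).query(measured)

counts, edges = np.histogram(distances, bins=50)

# Region of interest: bins well beyond the bulk of the distribution.
threshold = np.percentile(distances, 99)
deviating = distances > threshold
print(f"ROI threshold: {threshold:.4f} m, points flagged: {deviating.sum()}")
```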

IPC Classes  ?

  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G06N 3/08 - Learning methods

80.

MULTI-BAND ATTRIBUTE BLENDING IN THREE-DIMENSIONAL SPACE

      
Application Number 17412793
Status Pending
Filing Date 2021-08-26
First Publication Date 2022-06-02
Owner FARO Technologies, Inc. (USA)
Inventor
  • Kaabi, Hani
  • Parian, Jafar Amiri

Abstract

A method includes mapping attribute information from a sensor with 3D coordinates from a 3D measurement device, wherein the mapping comprises blending the attribute information to avoid boundary transition effects. The blending includes representing the 3D coordinates that are captured using a plurality of voxel grids. The blending further includes converting the plurality of voxel grids to a corresponding plurality of multi-band pyramids, wherein each multi-band pyramid comprises a plurality of levels, each level storing attribute information for a different frequency band. The blending further includes computing a blended multi-band pyramid based on the plurality of voxel grids by combining corresponding levels from each of the multi-band pyramids. The blending further includes converting the blended multi-band pyramid into a blended voxel grid. The blending further includes outputting the blended voxel grid.
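
The pyramid-based blending step can be pictured on a single axis of a voxel grid. The Python sketch below decomposes two attribute signals into band-pass levels, blends each level with a progressively smoothed weight, and sums the result; the 1D simplification, sigma schedule, and level count are assumptions made only for clarity.

```python
# Simplified multi-band blend of two attribute signals (e.g. intensity from two
# scans) along one axis of a voxel grid. Real grids are 3D; this 1D numpy sketch
# only illustrates the decompose / blend-per-level / collapse idea.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def band_pyramid(signal, levels=4):
    """Decompose a signal into band-pass levels plus a final low-pass residual."""
    bands, current = [], signal.astype(float)
    for i in range(levels):
        low = gaussian_filter1d(current, sigma=2.0 ** (i + 1))
        bands.append(current - low)      # band-pass detail at this frequency
        current = low
    bands.append(current)                # low-pass residual
    return bands

def blend(signal_a, signal_b, weight, levels=4):
    """Blend the two signals per frequency band with a smoothed weight mask."""
    pa, pb = band_pyramid(signal_a, levels), band_pyramid(signal_b, levels)
    out = np.zeros_like(signal_a, dtype=float)
    for i, (a, b) in enumerate(zip(pa, pb)):
        w = gaussian_filter1d(weight.astype(float), sigma=2.0 ** (i + 1))
        out += w * a + (1.0 - w) * b
    return out

x = np.linspace(0, 1, 256)
scan_a = 0.8 + 0.02 * np.sin(40 * x)      # attribute from one scan
scan_b = 0.4 + 0.02 * np.cos(40 * x)      # attribute from another scan
mask = x < 0.5                            # left half from scan A, right from B
blended = blend(scan_a, scan_b, mask)
print(blended[:4], blended[-4:])          # smooth transition, no hard seam
```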

IPC Classes  ?

  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques

81.

ARTICULATED ARM COORDINATE MEASURING MACHINE

      
Application Number 17452587
Status Pending
Filing Date 2021-10-28
First Publication Date 2022-05-12
Owner FARO Technologies, Inc. (USA)
Inventor
  • Creachbaum, John Lucas
  • Bailey, Brent
  • Schoenfeldt, William E.
  • Kovalski, Fabiano
  • Laranjeira, Eduardo
  • Crisostomo, Chad
  • Bartel, Michael
  • Lankalapalli, Kishore
  • Riehl, Christopher M.
  • Mogensen, Matthew

Abstract

An articulated arm coordinate measuring machine includes an arm having multiple segments and an end assembly. The end assembly has multiple accessory interfaces that allow multiple accessories to be coupled to the end assembly. The accessory interfaces are configured to allow the accessories to be repeatably interchanged between the accessory interfaces.

IPC Classes  ?

  • B25J 9/04 - Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian co-ordinate type by rotating at least one arm, excluding the head movement itself, e.g. cylindrical co-ordinate type or polar co-ordinate type
  • B25J 9/16 - Programme controls
  • B25J 19/02 - Sensing devices
  • G01B 5/008 - Measuring arrangements characterised by the use of mechanical techniques for measuring coordinates of points using coordinate measuring machines
  • G01B 21/04 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points

82.

SIMULTANEOUS LOCALIZATION AND MAPPING ALGORITHMS USING THREE-DIMENSIONAL REGISTRATION

      
Application Number 17410502
Status Pending
Filing Date 2021-08-24
First Publication Date 2022-05-05
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brenner, Mark
  • Zweigle, Oliver
  • Buback, Johannes
  • Frank, Aleksej
  • Ramadneh, Ahmad

Abstract

An example method includes receiving, via a 3D scanner, a 3D scan of the environment. The 3D scan includes a global position and is partitioned into a plurality of 3D submaps. The method further includes receiving, via a two-dimensional (2D) scanner accessory, a plurality of 2D submaps of the environment. The method further includes receiving coordinates of the scan position in the plurality of 2D submaps in response to the 3D scanner initiating the acquisition of the 3D scan. The method further includes associating the coordinates of the scan position with the plurality of 2D submaps. The method further includes performing real-time positioning by linking the coordinates of the scan position with the plurality of 2D submaps using a SLAM algorithm. The method further includes performing, based at least in part on the real-time positioning, a registration technique on the plurality of 3D submaps to generate a global map.
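
As a rough illustration of how planar SLAM poses can seed the 3D registration, the sketch below lifts (x, y, yaw) poses to rigid transforms and places synthetic 3D submaps into a common frame before any fine registration. The poses, submaps, and two-stage workflow are invented for the example and are not the claimed method.

```python
# Sketch: using 2D SLAM poses (x, y, yaw) as coarse placements for 3D submaps
# before fine registration. Poses and submaps here are synthetic.
import numpy as np

def pose2d_to_matrix(x, y, yaw):
    """Lift a planar pose to a 4x4 rigid transform (z left unchanged)."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[:2, 3] = [x, y]
    return T

def place_submaps(submaps, poses2d):
    """Transform each 3D submap into a global frame using its 2D SLAM pose."""
    placed = []
    for points, (x, y, yaw) in zip(submaps, poses2d):
        T = pose2d_to_matrix(x, y, yaw)
        homog = np.c_[points, np.ones(len(points))]
        placed.append((homog @ T.T)[:, :3])
    return np.vstack(placed)   # coarse global map, ready for fine registration

submaps = [np.random.rand(100, 3), np.random.rand(100, 3)]
poses = [(0.0, 0.0, 0.0), (2.5, 0.0, np.pi / 2)]
global_map = place_submaps(submaps, poses)
print(global_map.shape)
```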

IPC Classes  ?

  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G06T 17/05 - Geographic models
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

83.

THREE DIMENSIONAL MEASUREMENT DEVICE HAVING A CAMERA WITH A FISHEYE LENS

      
Application Number 17451946
Status Pending
Filing Date 2021-10-22
First Publication Date 2022-05-05
Owner FARO Technologies, Inc. (USA)
Inventor
  • Parian, Jafar Amiri
  • Flohr, Daniel
  • Ossig, Martin
  • Woloschyn, Andreas
  • Tohme, Yazid

Abstract

A 3D measurement system, a laser scanner, and a measurement device are provided. The system includes a 3D measurement device and a 360 degree image acquisition system coupled in a fixed relationship to the 3D measurement device. The 360 degree image acquisition system includes a first photosensitive array operably coupled to a first lens, the first lens having a first optical axis in a first direction and being configured to provide a first field of view greater than 180 degrees. The image acquisition system further includes a second photosensitive array operably coupled to a second lens, the second lens having a second optical axis in a second direction opposite the first direction and being configured to provide a second field of view greater than 180 degrees. The first field of view at least partially overlaps with the second field of view.

IPC Classes  ?

  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  • G02B 13/06 - Panoramic objectives; So-called "sky lenses"
  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements

84.

Distributed measurement system for scanning projects

      
Application Number 17575738
Grant Number 11934355
Status In Force
Filing Date 2022-01-14
First Publication Date 2022-05-05
Grant Date 2024-03-19
Owner FARO Technologies, Inc. (USA)
Inventor
  • Zweigle, Oliver
  • Ramadneh, Ahmad
  • Frank, Aleksej
  • Santos, Joao

Abstract

A system and method for providing a distributed measurement system are provided. The system performs operations that include receiving, via a user interface of a user device, a request from a requestor to access a data file of a project. The project includes a plurality of data files including the data file, and at least one of the plurality of data files is generated based at least in part on measurement data output from a measurement device. Based on determining that the requestor has permission to access the data file, one or more editing options are provided for editing the data file. The one or more editing options vary based at least in part on one or both of a characteristic of the user device and a characteristic of the data file. The data file is edited in response to receiving an editing request.
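
A toy version of the permission and editing-option logic might look like the Python sketch below. The device categories, file kinds, and rules are invented purely for illustration and do not reflect the actual system.

```python
# Hedged sketch of permission-gated editing options that vary with the device
# and the file characteristics. All categories and rules are invented.
from dataclasses import dataclass

@dataclass
class DataFile:
    name: str
    kind: str          # e.g. "point_cloud", "floor_plan", "report"
    size_mb: float

@dataclass
class UserDevice:
    kind: str          # e.g. "workstation", "tablet", "phone"

def editing_options(user, device, f, permissions):
    """Return editing options for a file, or [] if the user lacks access."""
    if f.name not in permissions.get(user, set()):
        return []
    options = ["annotate", "rename"]
    if device.kind == "workstation":
        options += ["edit_geometry", "re-register"]
    if f.kind == "point_cloud" and f.size_mb < 500 and device.kind != "phone":
        options.append("crop")
    return options

perms = {"alice": {"hall_scan.fls"}}
print(editing_options("alice", UserDevice("tablet"),
                      DataFile("hall_scan.fls", "point_cloud", 120.0), perms))
```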

IPC Classes  ?

  • G06F 16/176 - Support for shared access to files; File sharing support
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 16/13 - File access structures, e.g. distributed indices
  • G06F 40/166 - Editing, e.g. inserting or deleting

85.

DYNAMIC SELF-CALIBRATING OF AUXILIARY CAMERA OF LASER SCANNER

      
Application Number 17492801
Status Pending
Filing Date 2021-10-04
First Publication Date 2022-04-28
Owner FARO Technologies, Inc. (USA)
Inventor
  • Parian, Jafar Amiri
  • Kaabi, Hani

Abstract

A method includes capturing, by a three-dimensional (3D) scanner, a 3D point cloud, and capturing, by a camera, a control image by capturing and stitching multiple images of the surrounding environment. The method further includes capturing, by an auxiliary camera, an ultrawide-angle calibration image. The method further includes dynamically calibrating the auxiliary camera using the 3D point cloud, the control image, and the calibration image. The calibrating includes extracting a first plurality of features from the control image and extracting a second plurality of features from the calibration image. Further, a set of matching features is determined from the first and second pluralities of features. A set of control points is generated using the set of matching features by determining points in the 3D point cloud that correspond to the set of matching features. Further, a self-calibration of the auxiliary camera is performed using the set of control points.

IPC Classes  ?

  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G06T 7/521 - Depth or shape recovery from the projection of structured light

86.

Hybrid photogrammetry

      
Application Number 17449127
Grant Number 11727635
Status In Force
Filing Date 2021-09-28
First Publication Date 2022-04-28
Grant Date 2023-08-15
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor
  • Ossig, Martin
  • Buback, Johannes

Abstract

A method for determining three-dimensional (3D) coordinates of an object surface with a 3D measuring device includes forming, from the determined 3D coordinates, a mesh having a first face and constructing a voxel array aligned to the first face. A plurality of images is obtained from a first camera at a corresponding plurality of poses, and for each voxel in the voxel array a plurality of voxel values is obtained from the corresponding plurality of images. For each voxel row, a quality value is determined based at least in part on an average value of a first quantity and a dispersion of the first quantity, the first quantity being based at least in part on first voxel values determined as a function of pose. A distance from a point on the first face to the object surface is then determined based at least in part on the determined quality values for the voxel rows.
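
The quality-value step can be pictured with a small numerical example. The sketch below computes, per voxel row, a score from the mean and dispersion of voxel values observed across poses; the specific combination used here (mean divided by dispersion) is an assumption for illustration, not the patented formula.

```python
# Illustrative per-row quality value from the mean and dispersion of voxel
# values observed across camera poses. The weighting is an assumption.
import numpy as np

rng = np.random.default_rng(1)
n_rows, n_poses = 32, 12
# voxel_values[r, p]: value observed in voxel row r from pose p
voxel_values = rng.normal(loc=0.5, scale=0.05, size=(n_rows, n_poses))
voxel_values[20] += rng.normal(0, 0.4, n_poses)   # a noisy, unreliable row

mean = voxel_values.mean(axis=1)
dispersion = voxel_values.std(axis=1)

# Higher agreement across poses (low spread) -> higher quality.
quality = mean / (dispersion + 1e-6)

best_row = int(np.argmax(quality))
print(f"most consistent voxel row: {best_row}, quality={quality[best_row]:.1f}")
```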

IPC Classes  ?

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 7/586 - Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/90 - Determination of colour characteristics
  • G06T 7/529 - Depth or shape recovery from texture

87.

THREE-DIMENSIONAL SCANNING AND IMAGE RECONSTRUCTION THEREOF

      
Application Number 17498944
Status Pending
Filing Date 2021-10-12
First Publication Date 2022-04-21
Owner FARO Technologies, Inc. (USA)
Inventor Woloschyn, Andreas

Abstract

Three-dimensional coordinate scanners and methods of scanning environments are described. The scanners include a housing having a top, a bottom, a first side, a second side, a first end face, and a second end face. A 3D point cloud system including a rotating mirror is arranged within the housing and configured to acquire 3D point cloud data of a scanned environment. A first color camera is arranged within the housing on the first side and a second color camera is arranged within the housing on the second side, each configured to capture respective color data of the scanned environment.

IPC Classes  ?

  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

88.

Compensation of three-dimensional measuring instrument having an autofocus camera

      
Application Number 17076070
Grant Number 11481917
Status In Force
Filing Date 2020-10-21
First Publication Date 2022-04-21
Grant Date 2022-10-25
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor
  • Ossig, Martin
  • Buback, Johannes

Abstract

A three-dimensional (3D) measuring instrument includes a registration camera and a surface measuring system having a projector and an autofocus camera. For the instrument in a first pose, the registration camera captures a first registration image of first registration points. The autofocus camera captures a first surface image of first light projected onto the object by the projector and determines first 3D coordinates of points on the object. For the instrument in a second pose, the registration camera captures a second registration image of second registration points. The autofocus camera adjusts the autofocus mechanism and captures a second surface image of second light projected by the projector. A compensation parameter is determined based at least in part on the first registration image, the second registration image, the first 3D coordinates, the second surface image, and the projected second light.

IPC Classes  ?

  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 7/521 - Depth or shape recovery from the projection of structured light
  • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
  • G06T 7/571 - Depth or shape recovery from multiple images from focus
  • H04N 5/232 - Devices for controlling television cameras, e.g. remote control
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • H04N 5/247 - Arrangement of television cameras

89.

System and method of scanning an environment and generating two dimensional images of the environment

      
Application Number 17496317
Grant Number 11847741
Status In Force
Filing Date 2021-10-07
First Publication Date 2022-03-31
Grant Date 2023-12-19
Owner FARO Technologies, Inc. (USA)
Inventor
  • Frank, Aleksej
  • Wolke, Matthias
  • Zweigle, Oliver

Abstract

A system and method for scanning an environment and generating an annotated 2D map are provided. The system includes a 2D scanner having a light source, an image sensor, and a first controller. The first controller determines a distance value to at least one of the object points. The system further includes a 360° camera having a movable platform and a second controller that merges the images acquired by the cameras to generate an image having a 360° view in a horizontal plane. The system also includes processors coupled to the 2D scanner and the 360° camera. The processors are responsive to generate a 2D map of the environment based at least in part on a signal from an operator and the distance value, and are further responsive to acquire a 360° image and integrate it at a location on the 2D map.

IPC Classes  ?

  • G06T 1/00 - General purpose image data processing
  • G06T 17/05 - Geographic models
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
  • H04N 13/282 - Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems

90.

DETECTING DISPLACEMENTS AND/OR DEFECTS IN A POINT CLOUD USING CLUSTER-BASED CLOUD-TO-CLOUD COMPARISON

      
Application Number 17338890
Status Pending
Filing Date 2021-06-04
First Publication Date 2022-03-24
Owner FARO Technologies, Inc. (USA)
Inventor
  • Wolke, Matthias
  • Patlolla, Prashanth Reddy

Abstract

Examples described herein provide a method that includes performing cluster matching at one or more cluster sizes (radii) for each of a plurality of points of a measurement point cloud. The method further includes determining, based on results of the multi-radius cluster matching, whether an object is displaced or whether the object includes a defect.
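
A simplified reading of the cluster-matching idea is sketched below: for each measured point, the neighbourhood population at several radii is compared between the measurement and reference clouds, and large mismatches are flagged. The radii, mismatch metric, and threshold are invented for the example.

```python
# Sketch of multi-radius cluster matching: compare neighbourhood populations in
# the measurement cloud against the reference cloud and flag large mismatches.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
xy = rng.uniform(0, 1, size=(4000, 2))
reference = np.c_[xy, np.zeros(len(xy))]          # reference surface at z = 0
measured = reference.copy()
patch = (measured[:, 0] < 0.15) & (measured[:, 1] < 0.15)
measured[patch, 2] += 0.08                        # locally displaced patch

ref_tree, meas_tree = cKDTree(reference), cKDTree(measured)
radii = (0.02, 0.05, 0.10)                        # multi-radius cluster sizes

flags = np.zeros(len(measured), dtype=bool)
for r in radii:
    ref_counts = np.asarray(ref_tree.query_ball_point(measured, r, return_length=True))
    meas_counts = np.asarray(meas_tree.query_ball_point(measured, r, return_length=True))
    # Large relative mismatch of cluster populations hints at displacement/defect.
    mismatch = np.abs(meas_counts - ref_counts) / np.maximum(ref_counts, 1)
    flags |= mismatch > 0.5

print(f"points flagged as displaced or defective: {flags.sum()}")
```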

IPC Classes  ?

  • G06K 9/62 - Methods or arrangements for recognition using electronic means
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 30/23 - Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]

91.

System and method for measuring three-dimensional coordinates

      
Application Number 17454726
Grant Number 11692812
Status In Force
Filing Date 2021-11-12
First Publication Date 2022-03-10
Grant Date 2023-07-04
Owner FARO Technologies, Inc. (USA)
Inventor
  • Döring, Daniel
  • Hillebrand, Gerrit
  • Debitsch, Rasmus
  • Pfeiffer, Rene
  • Ossig, Martin
  • Kramer, Alexander

Abstract

A three-dimensional (3D) measurement system, a method of measuring 3D coordinates, and a method of generating dense 3D data are provided. The method of measuring 3D coordinates uses a first 3D measurement device and a second 3D measurement device in a cooperative manner. The method includes acquiring a first set of 3D coordinates with the first 3D measurement device. The first set of 3D coordinates is transferred to the second 3D measurement device. A second set of 3D coordinates is acquired with the second 3D measurement device. The second set of 3D coordinates is registered to the first set of 3D coordinates in real time while the second 3D measurement device is acquiring the second set of 3D coordinates.

IPC Classes  ?

  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques

92.

CAPTURING ENVIRONMENTAL SCANS USING SENSOR FUSION

      
Application Number 17325940
Status Pending
Filing Date 2021-05-20
First Publication Date 2022-02-24
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brenner, Mark
  • Frank, Aleksej
  • Zweigle, Oliver
  • Ramadneh, Ahmad

Abstract

Techniques are described to determine a constraint for performing simultaneous localization and mapping (SLAM). A method includes detecting a first set of planes in first scan data of an environment, and detecting a second set of planes in second scan data. Further, a plane that is in both the first set of planes and the second set of planes is identified. Further, a first set of measurements of a landmark on the plane is determined from the first scan data, and a second set of measurements of the landmark is determined from the second scan data. The constraint is determined by computing a relationship between the first set of measurements and the second set of measurements.
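
To make the constraint idea concrete, the sketch below fits a plane in two synthetic scans, checks that it is the same plane, and derives a relative-motion constraint from a landmark on that plane. The plane-fitting method, same-plane test, and constraint choice are illustrative assumptions.

```python
# Sketch: derive a SLAM constraint from a plane visible in two scans by fitting
# the plane in each scan and comparing a landmark measurement on it.
import numpy as np

def fit_plane(points):
    """Least-squares plane fit; returns (unit normal n, offset d) with n.x = d."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, normal @ centroid

rng = np.random.default_rng(3)
base = np.c_[rng.uniform(0, 4, (500, 2)), np.zeros(500)]        # floor plane z = 0
scan1 = base + rng.normal(0, 0.002, base.shape)
scan2 = base + np.array([0.10, 0.0, 0.0]) + rng.normal(0, 0.002, base.shape)

n1, d1 = fit_plane(scan1)
n2, d2 = fit_plane(scan2)
same_plane = abs(abs(n1 @ n2) - 1.0) < 1e-3 and abs(abs(d1) - abs(d2)) < 0.01

landmark1 = scan1.mean(axis=0)            # stand-in for a landmark on the plane
landmark2 = scan2.mean(axis=0)
if same_plane:
    constraint = landmark2 - landmark1    # relative-motion constraint for SLAM
    print("constraint (dx, dy, dz):", np.round(constraint, 3))
```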

IPC Classes  ?

  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates
  • G01S 7/48 - Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00, of systems according to group G01S 17/00
  • G01S 7/497 - Means for monitoring or calibrating

93.

System and method of automatic room segmentation for two-dimensional laser floorplans

      
Application Number 17325947
Grant Number 11501478
Status In Force
Filing Date 2021-05-20
First Publication Date 2022-02-17
Grant Date 2022-11-15
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor
  • Brenner, Mark
  • Frank, Aleksej
  • Zweigle, Oliver
  • Ramadneh, Ahmad
  • Waheed, Mufassar

Abstract

A system for generating an automatically segmented and annotated two-dimensional (2D) map of an environment includes processors coupled to a scanner to convert a 2D map from the scanner into a 2D image. Further, a mapping system categorizes a first set of pixels from the image into one of room-inside, room-outside, and noise by applying a trained neural network to the image. The mapping system further categorizes a first subset of pixels from the first set of pixels based on a room type if the first subset of pixels is categorized as room-inside. The mapping system also determines the room type of a second subset of pixels from the first set of pixels based on the first subset of pixels by using a flooding algorithm. The mapping system further annotates a portion of the 2D map to identify the room type based on the pixels corresponding to the portion.
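
The flooding step can be illustrated with a plain flood fill over the pixel categories, as in the sketch below; the grid values, room-type label, and 4-connectivity are invented stand-ins for the actual mapping system.

```python
# Minimal flood-fill sketch: propagate a room-type label from a seed across
# connected "room-inside" pixels. Grid and labels are invented.
from collections import deque

# 0 = room-outside / wall, 1 = room-inside (as produced by the classification step)
grid = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],
]

def flood_room_type(grid, seed, room_type, labels):
    """Assign room_type to every inside pixel 4-connected to the seed."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if not (0 <= r < rows and 0 <= c < cols):
            continue
        if grid[r][c] != 1 or (r, c) in labels:
            continue
        labels[(r, c)] = room_type
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return labels

labels = flood_room_type(grid, seed=(1, 1), room_type="office", labels={})
print(labels)   # only the left room's pixels receive the "office" label
```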

IPC Classes  ?

  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/11 - Region-based segmentation
  • G06V 30/422 - Technical drawings; Geographical maps
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
  • G01S 17/42 - Simultaneous measurement of distance and other coordinates

94.

ENVIRONMENTAL SCANNING AND IMAGE RECONSTRUCTION THEREOF

      
Application Number 17395748
Status Pending
Filing Date 2021-08-06
First Publication Date 2022-02-17
Owner FARO Technologies, Inc. (USA)
Inventor
  • Schmitz, Evelyn
  • Bauer, Heiko
  • Kappes, Steffen
  • Wohlfeld, Denis

Abstract

Scanning systems and methods for measuring shafts are described. The scanning systems include a support structure, a scanner mounted to the support structure, at least one fixed guide arranged such that the support structure is configured to move along the at least one fixed guide, at least one positional guide connected to the support structure to guide movement of the scanner along the at least one fixed guide, and an encoder operably coupled to the at least one positional guide and configured to measure, at least, a distance from the encoder to the support structure.

IPC Classes  ?

  • G01B 11/04 - Measuring arrangements characterised by the use of optical techniques for measuring length, width, or thickness specially adapted for measuring length or width of objects while moving
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object
  • G02B 26/10 - Scanning systems

95.

LASER SCANNER WITH ULTRAWIDE-ANGLE LENS CAMERA FOR REGISTRATION

      
Application Number 17379268
Status Pending
Filing Date 2021-07-19
First Publication Date 2022-02-17
Owner FARO Technologies, Inc. (USA)
Inventor Parian, Jafar

Abstract

According to one or more embodiments, a method includes capturing a first three-dimensional (3D) point cloud and a second 3D point cloud. Each of the 3D point clouds includes a plurality of 3D coordinates corresponding to one or more objects scanned in a surrounding environment, and the first 3D point cloud and the second 3D point cloud capture at least one overlapping portion. Further, the method includes capturing a first ultrawide-angle image and a second ultrawide-angle image of the surrounding environment; the first ultrawide-angle image captures color information of the first 3D point cloud, and the second ultrawide-angle image captures color information of the second 3D point cloud. The method further includes registering the first 3D point cloud and the second 3D point cloud by mapping one or more features from the first ultrawide-angle image and the second ultrawide-angle image.
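
Once image features have been matched and lifted to 3D, the registration reduces to estimating a rigid transform from matched point pairs. The sketch below applies the standard Kabsch/SVD solution to synthetic matches; the feature matching itself is omitted and the data is invented, so this illustrates the registration step rather than the claimed pipeline.

```python
# Rigid registration from matched 3D feature points (Kabsch/SVD), as a sketch
# of the registration step once image features have been matched and lifted.
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t with dst ≈ src @ R.T + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - src_c @ R.T

rng = np.random.default_rng(4)
matched_scan1 = rng.uniform(0, 5, size=(30, 3))          # features from scan 1
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
matched_scan2 = matched_scan1 @ true_R.T + np.array([1.0, 2.0, 0.0])

R, t = rigid_transform(matched_scan1, matched_scan2)
print(np.round(R, 3), np.round(t, 3))
```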

IPC Classes  ?

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 3/00 - Geometric image transformation in the plane of the image
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

96.

Generating textured three-dimensional meshes using two-dimensional scanner and panoramic camera

      
Application Number 17325956
Grant Number 11936843
Status In Force
Filing Date 2021-05-20
First Publication Date 2022-02-10
Grant Date 2024-03-19
Owner FARO Technologies, Inc. (USA)
Inventor
  • Brenner, Mark
  • Frank, Aleksej
  • Ramadneh, Ahmad
  • Waheed, Mufassar
  • Zweigle, Oliver

Abstract

Techniques are described for converting a 2D map into a 3D mesh. The 2D map of the environment is generated using data captured by a 2D scanner. Further, a set of features is identified from a subset of panoramic images of the environment that are captured by a camera. Further, the panoramic images from the subset are aligned with the 2D map using the features that are extracted. Further, 3D coordinates of the features are determined using 2D coordinates from the 2D map and a third coordinate based on a pose of the camera. The 3D mesh is generated using the 3D coordinates of the features.

IPC Classes  ?

  • H04N 13/261 - Image signal generators with monoscopic-to-stereoscopic image conversion
  • G01P 15/097 - Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration by making use of inertia forces with conversion into electric or magnetic values by vibratory elements
  • G06T 15/04 - Texture mapping
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

97.

Part inspection method using computed tomography

      
Application Number 17340902
Grant Number 11719651
Status In Force
Filing Date 2021-06-07
First Publication Date 2022-02-10
Grant Date 2023-08-08
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor
  • Stiebeiner, Ariane
  • Balatzis, Georgio
  • Raab, Simon
  • Wagner, Stefan

Abstract

A system and method of inspecting a plurality of objects using a computed tomography (CT) system is provided. The method includes acquiring an image of a fixture used for holding the plurality of objects with the CT system. A first electronic model of the fixture is generated. The objects are placed in the fixture. An image of the fixture and the objects is acquired with the CT system. A second electronic model of the fixture and the objects is generated. A third electronic model of the objects is defined based at least in part on subtracting the first electronic model from the second electronic model. Dimensions of the objects from the third electronic model are compared with a computer aided design (CAD) model. A report is output based at least in part on the comparison of the objects from the third electronic model with the CAD model.
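
A toy voxel analogue of the subtraction workflow is sketched below: a fixture-only volume is removed from a fixture-plus-objects volume, and the remainder is compared against a nominal model. The boolean-voxel representation, dimensions, and simulated defect are assumptions made only to illustrate the idea.

```python
# Toy voxel version of the fixture-subtraction idea and the comparison against
# a nominal (CAD-like) model. All volumes here are synthetic booleans.
import numpy as np

shape = (40, 40, 40)
fixture = np.zeros(shape, dtype=bool)
fixture[:, :, :5] = True                       # a base plate holding the parts

objects_nominal = np.zeros(shape, dtype=bool)  # nominal model of the parts
objects_nominal[10:20, 10:20, 5:25] = True

scan_fixture_only = fixture
scan_fixture_and_objects = fixture | objects_nominal
scan_fixture_and_objects[15:20, 15:20, 20:25] = False   # simulated missing material

# Third model: subtract the fixture-only scan from the combined scan.
objects_measured = scan_fixture_and_objects & ~scan_fixture_only

# Compare against the nominal model and report deviating voxel counts.
missing = objects_nominal & ~objects_measured
extra = objects_measured & ~objects_nominal
print(f"missing voxels: {missing.sum()}, extra voxels: {extra.sum()}")
```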

IPC Classes  ?

  • G01N 23/046 - Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups , or by transmitting the radiation through the material and forming images of the material using tomography, e.g. computed tomography [CT]
  • G01N 23/20025 - Sample holders or supports therefor
  • G06F 30/20 - Design optimisation, verification or simulation

98.

INDOOR DEVICE LOCALIZATION

      
Application Number 17354691
Status Pending
Filing Date 2021-06-22
First Publication Date 2022-01-20
Owner FARO Technologies, Inc. (USA)
Inventor
  • Schmitz, Evelyn
  • Wohlfeld, Denis

Abstract

An example system for measuring three-dimensional (3D) coordinate values of an environment is provided. The system includes a mobile scanning platform configured to measure coordinates in the environment. The mobile scanning platform has one or more radio antennas. The system further includes one or more processors operably coupled to the mobile scanning platform, the one or more processors being responsive to nontransitory executable instructions for performing a method. The method includes registering the measured coordinates to generate a point cloud. Registering includes triangulating a position of the mobile scanning platform based at least in part on data received from the one or more radio antennas. Registering further includes adjusting an orientation or position of one or more of the measured coordinates to align with a layout of the environment.
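
The triangulation step can be illustrated with standard least-squares trilateration from range measurements to anchors at known positions, as in the sketch below; the anchor layout, noise level, and linearisation against a reference anchor are invented for the example.

```python
# Least-squares trilateration sketch: estimate a platform position from noisy
# ranges to radio anchors at known positions. Layout and noise are invented.
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])
true_pos = np.array([3.5, 2.0])
rng = np.random.default_rng(5)
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, 0.05, 4)

# Linearise against the last anchor: standard linear least-squares trilateration.
ref = anchors[-1]
A = 2 * (anchors[:-1] - ref)
b = (ranges[-1] ** 2 - ranges[:-1] ** 2
     + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(ref ** 2))
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", np.round(estimate, 2))
```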

IPC Classes  ?

  • G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
  • G01S 13/46 - Indirect determination of position data
  • G01S 17/08 - Systems determining position data of a target for measuring distance only
  • G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging

99.

User interface for three-dimensional measurement device

      
Application Number 17340917
Grant Number 11614319
Status In Force
Filing Date 2021-06-07
First Publication Date 2021-12-30
Grant Date 2023-03-28
Owner FARO Technologies, Inc. (USA)
Inventor
  • Döring, Daniel
  • Debitsch, Rasmus
  • Pfeiffer, Rene
  • Ruhland, Axel

Abstract

A system and method for providing feedback on the quality of a 3D scan are provided. The system includes a coordinate scanner configured to optically measure and determine a plurality of three-dimensional coordinates of a plurality of locations on at least one surface in the environment, the coordinate scanner being configured to move through the environment while acquiring the plurality of three-dimensional coordinates. The system also includes a display having a graphical user interface. One or more processors are configured to determine a quality attribute of the process of measuring the plurality of three-dimensional coordinates based at least in part on the movement of the coordinate scanner in the environment, and to display a graphical quality indicator on the graphical user interface based at least in part on the quality attribute, the quality indicator being a graphical element having at least one movable element.

IPC Classes  ?

  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
  • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, on the object

100.

Measurement device

      
Application Number 29698144
Grant Number D0939367
Status In Force
Filing Date 2019-07-15
First Publication Date 2021-12-28
Grant Date 2021-12-28
Owner FARO TECHNOLOGIES, INC. (USA)
Inventor
  • Ruhland, Axel
  • Bader, Jonas
  • Müller, Benjamin
  • Gramenz, Matthias