A system and method for generating a focused three-dimensional (3D) point cloud is disclosed. A respective 3D point cloud is generated based on returns of a respective sequence of energy pulses that is emitted towards one or more regions-of-interest (ROIs) within a field-of-view (FOV) during a respective scan of the FOV, the returns including one or more secondary returns from one or more points within the FOV. During an additional scan of the FOV, subsequent to the respective scan, an additional sequence of energy pulses is emitted to generate a focused 3D point cloud that includes additional information regarding one or more selected points of the points associated with the secondary returns relative to the respective 3D point cloud.
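The selection step described above can be sketched in a few lines. This is an illustrative sketch, not the patented implementation: after a first scan, points that produced secondary (multi-echo) returns are selected, and the follow-up scan allocates extra pulses to those points to build the focused point cloud. The names `Return` and `plan_focused_scan` are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Return:
    point_id: int
    echoes: int  # number of returns detected for the emitted pulse

def plan_focused_scan(returns, extra_pulses_per_point=4):
    """Select points with secondary returns and assign extra pulses to them."""
    selected = [r.point_id for r in returns if r.echoes > 1]
    # One nominal pulse everywhere, extra pulses only at selected points.
    return {pid: 1 + extra_pulses_per_point for pid in selected}

first_scan = [Return(0, 1), Return(1, 3), Return(2, 2), Return(3, 1)]
plan = plan_focused_scan(first_scan)
# Points 1 and 2 had secondary returns, so each gets 5 pulses in the next scan.
```

The pulse budget per selected point is an assumed parameter; a real system would derive it from the scanner's timing constraints.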
G01S 13/10 - Systems for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 13/32 - Systems for measuring distance only using transmission of continuous waves, whether amplitude-, frequency- or phase-modulated, or unmodulated
G06T 19/00 - Manipulating 3D models or images for computer graphics
Aspects of embodiments pertain to a method for providing scene related information from a scene to a remote station. The method may comprise receiving, at the remote station, a data object in relation to at least one identified attribute of one or more physical objects located in an ROI of the scene acquired by at least one sensor. A priority level value (PLV) is associated with the data object. The method may further include generating, at the remote station, using local station data, a low-latency virtual representation of the scene for displaying, at the remote station, a scene representation comprising the low-latency virtual representation and a visualization of the received data object. In addition, real-world scene data descriptive of real-world ROI/Target information may be received. A user may designate an ROI/Target of the data object visualization for displaying real-world ROI/Target information relating to the designated ROI/Target.
A system for coloring a monochromatic image of a scene, the system comprising a processing circuitry configured to: capture the monochromatic image of the scene utilizing a non-visible spectrum sensor, capable of capturing monochromatic images from a viewpoint having a given position and a given orientation; obtain one or more previously captured color images, covering at least part of the scene, wherein the previously captured color images have been captured in a time prior to the capturing of the monochromatic image; determine a registration between the monochromatic image and the previously captured color images, wherein the registration is determined by projecting the previously captured color images on a plane that is conformal with the viewpoint; render one or more rendered images, being the previously captured color images adjusted to the viewpoint by utilizing the registration; and generate a colored image of the scene by changing, for at least one given pixel of the monochromatic image, values of one or more chroma components of the given pixel in accordance with values of one or more chroma components of a corresponding pixel of the rendered images.
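The final chroma-transfer step can be illustrated with a minimal sketch, assuming the rendered color image is already registered pixel-for-pixel with the monochromatic image. Each pixel is an (Y, Cb, Cr) tuple; the monochrome pixel keeps its own luma (Y) and takes the chroma components (Cb, Cr) of the corresponding rendered pixel. The function name `colorize` is illustrative.

```python
def colorize(mono_luma, rendered_ycbcr):
    """mono_luma: list of Y values; rendered_ycbcr: list of (Y, Cb, Cr) tuples.

    Keep each monochrome pixel's luma and adopt the registered rendered
    pixel's chroma, as the abstract describes.
    """
    return [(y, cb, cr) for y, (_, cb, cr) in zip(mono_luma, rendered_ycbcr)]

mono = [10, 200, 128]
rendered = [(12, 90, 140), (190, 110, 120), (130, 128, 128)]
colored = colorize(mono, rendered)
# Each output pixel pairs the monochrome luma with the rendered chroma.
```

A real pipeline would operate on 2D arrays and handle pixels with no registered counterpart; this only shows the per-pixel rule.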
H04N 23/11 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
A mixed reality system, comprising: a sensor configured to acquire readings of real-world data, and display, on an output device, a real-world visualization of the real-world data based on the readings to a user, wherein the sensor has one or more parameters affecting the real-world visualization; and a processing circuitry configured to: obtain (a) information of one or more virtual entities located within an area from which the readings are acquired, the information defining, for each of the virtual entities, one or more simulated physical properties, and (b) values of one or more situational parameters indicative of a state of the sensor during acquisition of the readings, wherein the values of the one or more situational parameters are readings of one or more situational sensors, sensing the state of the sensor and its surroundings during acquisition of the readings; determine, for at least one given virtual entity of the virtual entities, a virtual entity visualization of the given virtual entity, the virtual entity visualization determined by manipulating a simulated reading of the simulated physical properties based on (a) the parameters affecting the real-world visualization, and (b) the values of the situational parameters; and display the virtual entity visualizations in combination with the real-world visualization, thereby enabling a user viewing the output device to view the virtual entity visualizations and the real-world visualization.
A method and a system for displaying a scene to a user wearing a head mounted display (HMD) while removing obstructions in a field of view (FOV) of the user are provided herein. The method may include: capturing, by a first sensor, a vehicle image of the scene, wherein the first sensor is mounted on the vehicle; tracking a position and orientation of the HMD in a specified coordinate system, to yield a line of sight (LOS) of the user wearing the HMD; obtaining a database containing obstacles data indicating at least one physical object located within the vehicle and affecting the FOV of the user; calculating an obstructed portion in the FOV of the user, based on the LOS and the database; generating, from the vehicle image, an un-obstructed view which includes a portion of the scene overlapping the obstructed portion; and displaying in the HMD the un-obstructed view.
An electronic assembly with a heat spreading coating has a PCB carrying electric conducting traces and heat producing electronic components. An electrically isolating polymeric coating is applied over the electric conducting traces and the heat producing electronic components. The electrically isolating polymeric coating conforms with the irregular structure of the PCB. A heat spreading layer is applied over the polymeric coating. The heat spreading layer comprises: at least one heat spreading zone selected from the group consisting of: a plurality of graphene nano-platelets, a plurality of graphene particles, a plurality of boron-nitride particles, a plurality of graphene flakes, a plurality of boron-nitride flakes, at least one graphene sheet, and combinations thereof; and a binder, wherein the electrically isolating polymeric coating adheres to the PCB and covers the electric conducting traces and the heat producing electronic components, and the heat spreading layer conforms to the irregular structure of the polymeric coating.
H05K 7/20 - Modifications to facilitate cooling, ventilating, or heating
H01L 23/373 - Cooling facilitated by the use of particular materials for the device
H01L 33/64 - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS - Details characterised by the package elements of the semiconductor bodies; Heat extraction or cooling elements
F28F 21/02 - Constructions of heat-exchange apparatus characterised by the selection of particular materials of carbon, e.g. graphite
F28F 21/04 - Constructions of heat-exchange apparatus characterised by the selection of particular materials of natural stone
A method and system for providing depth perception to a two-dimensional (2D) representation of a given three-dimensional (3D) object within a 2D non-visible spectrum image of a scene is provided. The method comprises: capturing the 2D non-visible spectrum image at a capture time, by at least one non-visible spectrum sensor; obtaining 3D data regarding the given 3D object independently of the 2D non-visible spectrum image; generating one or more depth cues based on the 3D data; applying the depth cues on the 2D representation to generate a depth perception image that provides the depth perception to the 2D representation; and displaying the depth perception image.
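One simple depth cue of the kind the method could generate is sketched below, assuming per-pixel depth is available from the independently obtained 3D data: an aerial-perspective-style attenuation that dims more distant pixels of the 2D representation. The function `apply_depth_cue` and its linear attenuation rule are assumptions for illustration, not the claimed method.

```python
def apply_depth_cue(intensities, depths, max_depth):
    """Scale each pixel intensity by (1 - depth/max_depth), clamped at 0.

    More distant pixels come out dimmer, giving the 2D representation a
    rudimentary sense of depth.
    """
    out = []
    for i, d in zip(intensities, depths):
        factor = max(0.0, 1.0 - d / max_depth)
        out.append(round(i * factor, 3))
    return out

dimmed = apply_depth_cue([100, 80], [0.0, 50.0], max_depth=100.0)
# The nearer pixel keeps its intensity; the farther one is halved.
```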
G06T 7/586 - Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
An eyepiece suitable for a night vision device, having an optical combiner for injecting a synthetic image onto the scene and having a single optical correction mechanism, is provided herein. The eyepiece may include an observer-side lens; an objective-side lens; a diopter adjustment knob configured to set a distance between the observer-side lens and the objective-side lens; and an optical combiner located between the observer-side lens and the objective-side lens, wherein the optical combiner reflects towards the observer-side lens the synthetic image transmitted from outside the eyepiece, and transfers towards the observer-side lens a scene image coming from an objective lens of the night vision device and passing through the objective-side lens, and wherein the diopter adjustment knob serves as a single setting mechanism which simultaneously sets a diopter of the observer and a focal depth of the display source image.
G02B 7/06 - Focusing of binoculars
H04N 5/77 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
G06F 3/14 - Digital output to display device
G02B 23/12 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical sighting or aiming devices with means for image conversion or intensification
A profile comparator for comparing between a human operator and a clone including a storage device, a simulation processor and a parameter comparator, the storage device including a recording of at least one parameter during an activity session of a platform, the platform including at least one control system, the parameter being at least one of a parameter of the platform and an action of an operator of the platform during the activity session, and a predetermined profile, the simulation processor configured to generate a virtual clone of the platform according to at least one of the recorded parameter, the simulation processor being further configured to manage the virtual clone according to the predetermined profile, the parameter comparator configured to compare at least one comparison parameter relative to the comparison parameter of the virtual clone and configured to determine at least one deviation wherein the comparison parameter deviates from the comparison parameter of the virtual clone.
G09B 9/24 - Simulators for teaching or training for teaching control of vehicles or other craft for teaching piloting of aircraft, e.g. blind-flying trainers, including display or recording of the simulated flight path
G09B 9/08 - Simulators for teaching or training for teaching control of vehicles or other craft for teaching piloting of aircraft, e.g. blind-flying trainers
G09B 9/06 - Simulators for teaching or training for teaching control of vehicles or other craft for teaching control of ships, vessels or other waterborne vehicles
09 - Scientific and electrical apparatus and instruments
Goods and services
Head-mounted transparent binocular video displays containing an integrated head tracker for sensing motion, for use in providing a peripheral, high resolution and real time field-of-view, fire control and target detection capabilities, and navigational and operational information
A touch screen (TS) comprises a display; a frame with edges positioned opposite each other around the display; at least a first sensor array and at least a second sensor array, wherein each of the sensor arrays has a plurality of light transmitters and a plurality of light sensors, and wherein the at least first sensor array and the at least second sensor array are disposed on the first edge and the second edge of the frame, respectively, wherein the transmitters of the first sensor array are facing the light sensors on the second sensor array positioned on two opposing edges of the frame; and at least one physical obstacle located on the third edge or the fourth edge, for reducing stray light scattered or reflected by the third edge or the fourth edge and arriving at the light sensors.
Aspects of embodiments pertain to a sensing system configured to receive scene electromagnetic (EM) radiation comprising a first wavelength (WL1) range and a second wavelength (WL2) range. The sensing system comprises at least one spectral filter configured to filter the received scene EM radiation to obtain EM radiation in the WL1 and WL2 ranges; and a self-adaptive electromagnetic (EM) energy attenuating structure. The self-adaptive EM energy attenuating structure may comprise material that includes nanosized particles which are configured such that high intensity EM radiation at the WL1 range incident onto a portion of the self-adaptive EM energy attenuating structure causes interband excitation of one or more electron-hole pairs, thereby enabling intraband transition in the portion of the self-adaptive EM energy attenuating structure by EM radiation in the WL2 range.
H04N 23/52 - Cameras or camera modules comprising electronic image sensors; Control thereof - Structural details; Elements optimising image sensor operation, e.g. for protection against electromagnetic interference [EMI] or for temperature control by heat transfer or cooling elements
H04N 23/11 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
G01J 3/02 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours - Details
G01J 3/10 - Arrangements of light sources specially adapted for spectrometry or colorimetry
B82Y 15/00 - Nanotechnology for interacting, sensing or actuating, e.g. quantum dots as markers in protein assays or molecular motors
B82Y 20/00 - Nano-optics, e.g. quantum optics or photonic crystals
Embodiments pertain to systems and methods for providing information related to a scene occurring in a region of interest (ROI), using a scene data collector, configured to receive scene source data, from one or more data sources, using at least one sensor, identify one or more physical objects located in the ROI, based on the received scene source data, determine one or more attributes of the identified physical objects, generate a data object, for at least one of the identified one or more physical objects, based on one or more attributes thereof, and transmit all data objects generated to at least one remote station (RS), located remotely from the ROI. Each RS may be configured to receive the transmitted one or more data objects, generate virtual scene data, based on the received one or more data objects; and display the generated virtual scene data.
A method and a system for landing and terrain flight assistance are provided herein. The method may include the following steps: obtaining, by at least one imaging sensor disposed on an aerial platform, at least two images of at least a portion of a specified region of a terrain; determining, based on the at least two images, a 3D model of at least a portion of the specified region; receiving a predetermined model of at least a portion of the specified region; determining a real-world geographic location of the aerial platform based on the 3D model and the predetermined model; and determining flight instructions based on the 3D model and the determined geographic location of the aerial platform.
G08G 5/02 - Automatic landing aids, i.e. systems in which flight data of arriving aircraft are processed to furnish landing data
09 - Scientific and electrical apparatus and instruments
Goods and services
Head-mounted display for use in helicopters containing integrated hybrid trackers, sensors, cameras, imaging apparatus, and associated computer software and hardware, for use in providing crew members with navigational and operational information, ultra-wide binocular fields-of-view, high resolution two- and three-dimensional colored symbology, and enhanced video capabilities.
09 - Scientific and electrical apparatus and instruments
10 - Medical apparatus and instruments
Goods and services
Downloadable and recorded computer software for performing 2D or 3D imaging and virtual imaging of body cavities, organs and tissues and for reception, encoding, processing, storage, transmission, reproduction and decoding of such images Medical and surgical equipment, instruments and devices, namely, cameras, electronic visors and video monitors and parts therefor used for 2D or 3D imaging and virtual imaging of body cavities, organs and tissues; medical imaging apparatus in the nature of visors and video monitors, namely, head-mounted display, fixed displays, see through and non see through displays for displaying medical images and other layers of information for a user; medical and surgical equipment, instruments and devices, namely, computer hardware, and peripherals for performing 2D or 3D imaging and virtual imaging of body cavities, organs and tissues and for reception, encoding, processing, storage, transmission, reproduction and decoding of such images; medical equipment, instruments and devices, namely, an imaging camera, an imaging camera used with an image processor, and an image processor for medical use
09 - Scientific and electrical apparatus and instruments
Goods and services
Head-mounted display for use in helicopters containing integrated hybrid trackers, sensors, cameras, imaging apparatus, and associated recorded computer software and computer hardware, for use in providing crew members with navigational and operational information, ultra-wide binocular fields-of-view, high resolution two- and three-dimensional colored symbology, and enhanced video capabilities
09 - Scientific and electrical apparatus and instruments
Goods and services
Command, control, and mission management system, comprising computer software, computer hardware, a secured cloud-based battle space network, a platform-agnostic mission management unit, and apparatus for receiving and transmitting data and communications, for providing aircraft crew operators a real-time view of the aircraft's operational surroundings, combat picture, ultimate situational awareness, and battle management operational tools, and for integrating data, threats, warnings, borders, obstacles, entities, navigational aids, route calculations, digital maps and weapons into a single-screen operational data display.
A transfer-alignment system for a Head-Mounted Display (HMD), and a display coupled to the HMD, wherein the display is adapted to display images rendered by a display-processor, and wherein the HMD is monitored by a tracking system configured to provide information indicating position and/or orientation of the HMD with respect to a Frame of Reference (FoR), the system comprising: at least one first inertial sensor attached to the HMD and configured to acquire HMD's Inertial Readings Information (IRI); at least one second inertial sensor attached to the display and configured to acquire display's IRI; and a processor configured to: obtain the HMD's IRI, the display's IRI and the information indicating the HMD's position and/or orientation with respect to the FoR; continuously analyze movement information of the HMD and the display to determine relative orientation therebetween; and cause the display-processor to adjust the images to conform with respect to the FoR.
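A single-axis sketch of the transfer-alignment idea follows: integrate the angular-rate readings from the two inertial sensors and take the difference as the relative orientation used to adjust the rendered images. Real transfer alignment works in 3D with filtering and bias estimation; this 1D version, with hypothetical names, only illustrates the data flow between the two IRI streams.

```python
def integrate_yaw(rates_dps, dt):
    """Integrate angular-rate samples (deg/s) into a heading angle (deg)."""
    angle = 0.0
    for r in rates_dps:
        angle += r * dt
    return angle

def relative_orientation(hmd_rates, display_rates, dt):
    """Relative orientation between display and HMD from their rate streams."""
    return integrate_yaw(display_rates, dt) - integrate_yaw(hmd_rates, dt)

rel = relative_orientation([10.0, 10.0], [10.0, 30.0], dt=0.1)
# HMD turned 2.0 deg, display turned 4.0 deg, so relative yaw is 2.0 deg.
```

In the claimed system this relative orientation would feed the display-processor so the images stay conformal with the frame of reference.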
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
G01C 21/16 - Navigation; Navigational instruments not provided for in groups, using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
21. SYSTEM AND METHOD FOR DETERMINING A CONFIGURATION OF A MEASUREMENT VOLUME
A method of determining a configuration of a measurement volume, the method may include: generating, by at least one transmitter, a transmitted magnetic field within the measurement volume; measuring, by at least one receiver, a total magnetic field in the measurement volume at at least one receiver position and generating at least one receiver output signal; generating, by a processing unit, a measured dataset; comparing, by the processing unit, the measured dataset with at least one of at least two reference configuration datasets, each determined for one of at least two different configurations of the measurement volume; and identifying, by the processing unit, a reference configuration dataset of the at least two reference configuration datasets that corresponds to the measured dataset.
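The identification step can be sketched as a nearest-match search: the measured dataset is compared against each reference configuration dataset and the closest one (here, by sum of squared differences) is reported. The distance metric and the configuration names are assumptions; the abstract does not specify how correspondence is determined.

```python
def identify_configuration(measured, references):
    """Return the name of the reference dataset closest to the measurement.

    references: dict mapping configuration name -> reference dataset
    (a sequence of field measurements).
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(references, key=lambda name: sq_dist(measured, references[name]))

refs = {"door_open": [1.0, 2.0, 3.0], "door_closed": [1.0, 2.0, 9.0]}
best = identify_configuration([1.1, 2.0, 3.2], refs)
# The measurement is far closer to the "door_open" reference dataset.
```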
G01D 5/14 - Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the value of a current or voltage
22. SYSTEM AND METHOD FOR DETERMINING A RELATIVE MOTION BETWEEN TWO OR MORE OBJECTS
Systems and methods for determining a relative motion between two or more objects are disclosed. The system may include an excitation unit adapted to be disposed on a first object and configured to excite and to induce at least one change in at least a portion of a second object. The system may include a sensing unit adapted to be disposed on the first object, the sensing unit may include at least one sensor configured to detect the at least one change in the second object at two or more different time instances and to generate corresponding two or more sensor output datasets. The system may include a processing unit configured to determine a relative motion between the first object and the second object based on the two or more sensor output datasets.
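A toy version of the processing step: the excitation unit leaves a detectable change ("mark") on the second object, the sensor locates it at two time instances, and the change in location over the elapsed time gives the relative velocity. All names here are illustrative, and the real system would work from full sensor output datasets rather than a single scalar position.

```python
def relative_velocity(pos_t0, pos_t1, t0, t1):
    """Estimate relative velocity from the induced mark's position at two
    time instances (positions in arbitrary units, times in seconds)."""
    return (pos_t1 - pos_t0) / (t1 - t0)

v = relative_velocity(pos_t0=5.0, pos_t1=9.0, t0=0.0, t1=2.0)
# 4.0 units of displacement over 2.0 s gives 2.0 units/s relative motion.
```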
G01P 3/68 - Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible or ultraviolet light
G01S 11/12 - Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
23. Head mounted display symbology concepts and implementations, associated with a reference vector
Head mounted displays (HMD) and corresponding display methods are provided, which obtain, repeatedly, from a monitoring system of a vehicle, a reference vector relating to the vehicle; display on the HMD a reference symbol that indicates the reference vector; and determine movements of HMD symbology according to a spatial relation between a received user's line of sight (LOS) and the reference vector. For example, the vehicle may be an aircraft and the reference vector a flight path vector (FPV) received from the aircraft's avionics. The proposed HMD enhances the displayed information content while avoiding excessive movements of the symbology. The HMD's functional parameters may be pre-set or adapted according to user preference and flight stage characteristics. The reference symbol anchors most of the symbology, while minimal critical information may be moved along with the user's LOS, providing a clearer and more stable view through the HMD.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by display of individual graphic patterns using a bit-mapped memory with means for controlling the display position
Aspects of embodiments pertain to a method for providing scene related information from a scene to a remote station. The method may comprise receiving, at the remote station, a data object in relation to at least one identified attribute of one or more physical objects located in an ROI of the scene acquired by at least one sensor. A priority level value (PLV) is associated with the data object. The method may further include generating, at the remote station, using local station data, a low-latency virtual representation of the scene for displaying, at the remote station, a scene representation comprising the low-latency virtual representation and a visualization of the received data object. In addition, real-world scene data descriptive of real-world ROI/Target information may be received. A user may designate an ROI/Target of the data object visualization for displaying real-world ROI/Target information relating to the designated ROI/Target.
G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or on a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour
G06T 19/00 - Manipulating 3D models or images for computer graphics
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. automatic pilot
A system and method for generating a focused three-dimensional (3D) point cloud is disclosed. A respective 3D point cloud is generated based on returns of a respective sequence of energy pulses that is emitted towards one or more regions-of-interest (ROIs) within a field-of-view (FOV) during a respective scan of the FOV, the returns including one or more secondary returns from one or more points within the FOV. During an additional scan of the FOV, subsequent to the respective scan, an additional sequence of energy pulses is emitted to generate a focused 3D point cloud that includes additional information regarding one or more selected points of the points associated with the secondary returns relative to the respective 3D point cloud.
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G01C 11/02 - Picture-taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. for controlling the overlapping of pictures
G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
G01S 17/04 - Systems determining the presence of a target
G01S 17/42 - Simultaneous measurement of distance and other coordinates
A system for generating a three-dimensional (3D) map of part of a field-of-view (FOV) of at least one detector of an active 3D scanner, comprising: the active 3D scanner, comprising: a mechanism configured to scan the FOV; at least one energy emitting source configured to emit energy pulses; and the at least one detector; and processing circuitry configured to: obtain designation information, wherein at least some of the designation information is tracker-based designation information that is designated by a user of the system via a tracker that tracks a line-of-sight between the user and the FOV; selectively activate the energy emitting source to emit a subset of the energy pulses, in accordance with the designation information, including the tracker-based designation information, and in synchronization with the mechanism, to cover the part of the FOV; obtain current readings, from the detector, based on reflections of the subset of the energy pulses; and generate the 3D map based on the current readings.
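The selective-activation step can be sketched as a gating rule: as the scan mechanism sweeps the FOV, the emitter fires only at angles that fall inside the designated part of the FOV (for example, a tracker-derived gaze region). The function `fire_schedule` and the 1D angular model are illustrative assumptions.

```python
def fire_schedule(scan_angles, designated):
    """Return the subset of scan angles (deg) at which a pulse is emitted.

    designated: (lo, hi) angular bounds of the designated part of the FOV,
    e.g. derived from a line-of-sight tracker.
    """
    lo, hi = designated
    return [a for a in scan_angles if lo <= a <= hi]

angles = [0, 10, 20, 30, 40, 50]   # synchronized with the scan mechanism
pulses = fire_schedule(angles, designated=(15, 35))
# Only the 20-deg and 30-deg scan positions receive pulses.
```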
G01C 3/08 - Use of electric radiation detectors
G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
G01S 13/32 - Systems for measuring distance only using transmission of continuous waves, whether amplitude-, frequency- or phase-modulated, or unmodulated
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
27. OPTICAL SEE THROUGH (OST) HEAD MOUNTED DISPLAY (HMD) SYSTEM AND METHOD FOR PRECISE ALIGNMENT OF VIRTUAL OBJECTS WITH OUTWARDLY VIEWED OBJECTS
A method for irradiating an image in an optical see-through (OST) head mounted display (HMD) for viewing, through the OST HMD by a user's eye, an object having at least one of known orientation and position and orientation (O/P&O), associated with a first reference frame, the method comprising: generating and irradiating said image for appearing to said user superimposed in an aligned manner to said object, according to predetermined information, eyeball feature position data, and said O/P&O; said predetermined information relates correction data with a plurality of different respective position data values of at least one eyeball feature position of said eye; said predetermined information further includes display corrections of said electro-optical display module with respect to said position data values of said at least one eyeball feature position, with respect to a second reference frame; and said O/P&O is between said second reference frame and said first reference frame.
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
09 - Scientific and electrical apparatus and instruments
Goods and services
Software-defined airborne radar system for detection and mapping of terrain and obstacles in degraded visual environment and adverse visibility conditions.
A mixed reality system, comprising: a data acquisition device configured to acquire real-world data; an output device for providing the real-world data to a user; and a processing circuitry configured to: obtain (a) data acquired by the data acquisition device, and (b) information of one or more virtual entities having properties enabling determination of simulated effects of the virtual entities on the data; determine the simulated effects of the virtual entities on the data utilizing the properties; and provide the user with output on the output device being a manipulation of the data reflecting the simulated effects.
A method and a system for displaying a scene to a user wearing a head mounted display (HMD) while removing obstructions in a field of view (FOV) of the user are provided herein. The method may include: capturing, by a sensor mounted on the vehicle, a vehicle image of the scene; tracking a position and orientation of the HMD in a specified coordinate system, to yield a line of sight (LOS) of the user wearing the HMD; obtaining a database containing obstacles data indicating at least one physical object located within the vehicle and affecting the FOV of the user; calculating an obstructed portion in the FOV of the user, based on the LOS and the database; generating from the vehicle image an unobstructed view which includes a portion of the scene overlapping the obstructed portion; and displaying in the HMD the unobstructed view.
A system and a method of monitoring physical properties of a physical medium over time are provided herein. The method may include the following steps: embedding a plurality of acoustic sensors into a physical medium before curing thereof; transmitting an acoustic wave by at least one transmitter coupled to or embedded within said physical medium; repeatedly calculating, over different points of time, a travel time of said acoustic wave between the at least one transmitter and the plurality of acoustic sensors; and analyzing said travel times, to detect a change over time in physical properties of said physical medium associated with said travel time.
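The repeated travel-time calculation above can be sketched as follows. The cross-correlation estimator, the sampling rate, and the 1% change threshold are illustrative assumptions, not the patented implementation.

```python
# Minimal sketch: estimate acoustic travel time between a transmitter and an
# embedded sensor, then flag a change in the medium when the travel time
# drifts from its baseline. Signal layout and threshold are assumptions.

def travel_time(tx_signal, rx_signal, sample_rate_hz):
    """Estimate travel time as the lag that maximizes cross-correlation."""
    n = len(rx_signal)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n - len(tx_signal) + 1):
        score = sum(t * r for t, r in zip(tx_signal, rx_signal[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag / sample_rate_hz

def medium_changed(times_s, rel_threshold=0.01):
    """Flag a change when travel time drifts >1% from the first measurement."""
    baseline = times_s[0]
    return any(abs(t - baseline) / baseline > rel_threshold for t in times_s[1:])
```

In practice `travel_time` would be evaluated once per transmitter-sensor pair at each measurement epoch, and `medium_changed` applied to the resulting time series.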
A method and system for providing depth perception to a two-dimensional (2D) representation of a given three-dimensional (3D) object within a 2D non-visible spectrum image of a scene is provided. The method comprises: capturing the 2D non-visible spectrum image at a capture time, by at least one non-visible spectrum sensor; obtaining 3D data regarding the given 3D object independently of the 2D non-visible spectrum image; generating one or more depth cues based on the 3D data; applying the depth cues on the 2D representation to generate a depth perception image that provides the depth perception to the 2D representation; and displaying the depth perception image.
H04N 13/122 - Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or by adding monoscopic depth cues
G06T 7/55 - Depth or shape recovery from multiple images
G06T 7/50 - Depth or shape recovery
G02B 30/40 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic effects, giving the observer of a single two-dimensional [2D] image a perceived impression of depth
G06T 17/00 - Three-dimensional [3D] modelling for computer graphics
G01B 11/22 - Measuring arrangements characterised by the use of optical techniques for measuring depth
G06T 7/70 - Determining position or orientation of objects or cameras
G06T 5/40 - Image enhancement or restoration using histogram techniques
H04N 13/268 - Image signal generators with monoscopic-to-stereoscopic image conversion using depth image-based rendering
H04N 13/271 - Image signal generators wherein the generated image signals comprise depth maps or disparity maps
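One possible form of the depth-cue step described in the abstract above is sketched below. The brightness-attenuation cue and its linear model are assumptions for illustration; the method covers depth cues generally.

```python
# Minimal sketch of one depth cue: attenuate pixel intensity with distance
# taken from independently obtained 3D data, so nearer parts of the object
# appear brighter in the 2D non-visible-spectrum image. The linear attenuation
# model is an assumption, not the patented cue.

def apply_depth_cue(intensities, depths_m, max_depth_m):
    """Scale each pixel intensity by (1 - depth/max_depth), clamped at zero."""
    out = []
    for i, d in zip(intensities, depths_m):
        factor = max(0.0, 1.0 - d / max_depth_m)
        out.append(i * factor)
    return out
```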
33.
AN EYEPIECE FOR NIGHT VISION DEVICES CAPABLE OF INJECTING A SYNTHETIC IMAGE WITH A SINGLE OPTICAL CORRECTION MECHANISM
An eyepiece suitable for a night vision device, having an optical combiner for injecting a synthetic image onto the scene and having a single optical correction mechanism, is provided herein. The eyepiece may include an observer-side lens; an objective-side lens; a diopter adjustment knob configured to set a distance between the observer-side lens and the objective-side lens; and an optical combiner located between the observer-side lens and the objective-side lens, wherein the optical combiner reflects towards the observer-side lens the synthetic image transmitted from outside the eyepiece and transfers towards the observer-side lens a scene image coming from an objective lens of the night vision device and passing through the objective-side lens, and wherein the diopter adjustment knob serves as a single setting mechanism which simultaneously sets a diopter of the observer and a focal depth of the display source image.
G02B 15/14 - Optical objectives with means for varying the magnification by axial movement of one or more lenses or groups of lenses relative to the image plane for continuously varying the equivalent focal length of the objective
G02B 27/14 - Beam splitting or combining systems operating by reflection only
G02B 23/12 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification
09 - Scientific and electrical apparatus and instruments
Goods and services
Airborne radar apparatus featuring embedded software for detection and mapping of terrain and obstacles in degraded visual environments and adverse visibility conditions
09 - Scientific and electrical apparatus and instruments
Goods and services
Computer hardware and software for planning, commanding and
controlling air force operations and missions including the
integration of multiple sensor and data sources of radars,
aircraft, intelligence and anti-aircraft risks and
batteries, for creating an online, comprehensive and accurate
aerial picture, aerial arena and situational awareness;
computer hardware and software for air force training and
simulation; flight simulators containing computer-operated
hardware units and associated software.
37.
SYSTEMS AND METHODS FOR REDUCING IMAGE ARTEFACTS IN BINOCULAR DISPLAYS
Aspects of embodiments pertain to a method for reducing a subjective visual artefact when displaying binocular overlapping images to a user of a binocular display system, the method comprising: generating, by an image display unit comprising a plurality of pixel elements, right-eye and left-eye source images; projecting the right and left-eye source images via corresponding right and left viewing optics to the user such that the user perceives partially overlapping left and right-hand observation images; and reducing a perception of a subjective visual artefact in the perceived right and/or left-hand observation images by modifying one or more pixel and/or image parameter values relating to the left and/or right-hand source images.
09 - Scientific and electrical apparatus and instruments
Goods and services
Head-mounted displays for use primarily in aircraft, namely, head-mounted transparent electronic displays featuring a binocular display with an integrated head tracker for sensing motion, for use in providing crew members with navigational and operational information
A system for displaying videos, comprising a processing resource configured to: provide a data repository comprising a plurality of previously captured video segments (PCVSs) captured during previous operations of corresponding platforms, each being associated with metadata indicative of a Line-of-Sight (LoS) of a sensor, carried by the corresponding platform of the platforms used to capture the corresponding PCVS, with respect to a fixed coordinate system established in space, during capturing the corresponding PCVS; obtain an indication of a Region-of-Interest (RoI); identify one or more of the PCVSs that include at least part of the RoI, utilizing the LoSs associated with the PCVSs, giving rise to RoI matching PCVSs; and display at least part of at least one of the RoI matching PCVSs, being displayed RoI matching PCVSs, on a display of an operating platform to an operator of the operating platform during a current operation of the operating platform.
G01C 23/00 - Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
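The RoI-matching step in the abstract above can be sketched as an angular test: a previously captured segment "matches" when the angle between its recorded sensor Line-of-Sight and the direction from the sensor to the RoI is under a tolerance. The data layout, the fixed sensor position per segment, and the 10° tolerance are assumptions for illustration.

```python
# Minimal sketch: select previously captured video segments whose recorded
# Line-of-Sight points close enough to a Region-of-Interest. Tuple layouts
# and the angular tolerance are assumptions.
import math

def matches_roi(sensor_pos, los_dir, roi_pos, max_angle_deg=10.0):
    """True when the LoS direction is within max_angle_deg of the RoI bearing."""
    to_roi = [r - s for r, s in zip(roi_pos, sensor_pos)]
    dot = sum(a * b for a, b in zip(los_dir, to_roi))
    norm = math.hypot(*los_dir) * math.hypot(*to_roi)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

def roi_matching_segments(segments, roi_pos):
    """segments: list of (segment_id, sensor_pos, los_dir) tuples."""
    return [sid for sid, pos, los in segments if matches_roi(pos, los, roi_pos)]
```

A fuller implementation would also intersect the sensor footprint with the RoI extent rather than testing only the boresight angle.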
Aspects of embodiments pertain to a display illumination optics for illuminating an image display device of an image generation apparatus, the display illumination optics comprising: a source illumination distributor that includes an illumination waveguide having a front surface and a back surface opposite the front surface and configured to internally direct light along a main direction, wherein the illumination waveguide is configured to distribute the luminance of input illumination light along the main direction of the display illumination optics to obtain, along the main direction, output illumination light of desired luminance.
09 - Scientific and electrical apparatus and instruments
Goods and services
Large area displays such as helmet mounted displays, head-up
displays, panoramic head-down displays and built-in aircraft
displays for presenting flight and mission data.
A touch screen (TS) is provided that comprises a display; a frame with edges positioned opposite each other around the display; at least a first sensor array and at least a second sensor array, wherein each of the sensor arrays has a plurality of light transmitters and a plurality of light sensors, and wherein the at least first sensor array and the at least second sensor array are disposed on the first edge and the second edge of the frame, respectively, wherein the transmitters of the first sensor array face the light sensors of the second sensor array positioned on two opposing edges of the frame; and at least one physical obstacle located on the third edge or the fourth edge, for reducing stray light scattered or reflected by the third edge or the fourth edge and arriving at the light sensors.
A method and a system for landing and terrain flight assistance are provided herein. The method may include the following steps: obtaining, by at least one imaging sensor disposed on an aerial platform, at least two images of at least a portion of a specified region of a terrain; determining, based on the at least two images, a 3D model of at least a portion of the specified region; receiving a predetermined model of at least a portion of the specified region; determining a real-world geographic location of the aerial platform based on the 3D model and the predetermined model; and determining flight instructions based on the 3D model and the determined geographic location of the aerial platform.
G08G 5/02 - Automatic landing aids, i.e. systems in which flight data of incoming aircraft are processed to provide landing data
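The geolocation step above, determining a real-world position from the sensed 3D model and a predetermined model, can be sketched as a template match: the sensed terrain patch is slid over a reference elevation grid and the offset with the smallest squared elevation error is taken as the position fix. The grid representation and the error metric are assumptions for illustration.

```python
# Minimal sketch: locate a sensed elevation patch inside a reference elevation
# model by exhaustive search for the minimum squared-error offset. Grid layout
# and metric are assumptions; real systems use robust, scale-aware matching.

def locate_patch(reference, patch):
    """Return the (row, col) offset of patch in reference minimizing error."""
    best, best_err = None, float("inf")
    rows, cols = len(reference), len(reference[0])
    p_rows, p_cols = len(patch), len(patch[0])
    for r in range(rows - p_rows + 1):
        for c in range(cols - p_cols + 1):
            err = sum(
                (reference[r + i][c + j] - patch[i][j]) ** 2
                for i in range(p_rows) for j in range(p_cols)
            )
            if err < best_err:
                best, best_err = (r, c), err
    return best
```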
Embodiments pertain to systems and methods for providing information related to a scene occurring in a region of interest (ROI), using a scene data collector configured to: receive scene source data from one or more data sources using at least one sensor; identify one or more physical objects located in the ROI based on the received scene source data; determine one or more attributes of the identified physical objects; generate a data object for at least one of the identified physical objects based on one or more attributes thereof; and transmit all generated data objects to at least one remote station (RS) located remotely from the ROI. Each RS may be configured to receive the transmitted one or more data objects, generate virtual scene data based on the received one or more data objects, and display the generated virtual scene data.
A head wearable device (HWD) suitable to be worn by a user, the HWD may include: a head tracker configured to track a line of sight (LOS) of the user; a near eye display (NED) comprising: a plurality of transistors groups forming a pixel array of said display, a plurality of backlight units, forming a backlight surface of said display; a backlight control module configured to dim the backlight units that spatially overlap one or more of the transistor groups whenever the data at said transistor groups is being refreshed and further configured to change at least one of: a frequency and a location of the dimmed backlight units; and a computer processor coupled to the tracker and the NED and configured to instruct the backlight control module to change at least one of: the frequency and the location of the dimmed backlight units, based on the user LOS.
G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals
47.
DIRECT VIEW DISPLAY WITH TRANSPARENT VARIABLE OPTICAL POWER ELEMENTS
A direct view display system (DVDS) and a method to operate it are provided herein. The DVDS may include: a variable optical power element (VOPE); a transparent active image source located with the VOPE on a common optical axis going from an outside scene to an eye position of a viewer; and a time division multiplexer (TDM) configured to control the VOPE and the transparent active image source, wherein the TDM is configured in a certain time period to cause the transparent active image source to be in a transparent state and the VOPE to exhibit no optical power, and wherein the TDM is configured in another time period to cause said transparent active image source to exhibit an image and said VOPE to apply non-zero optical power, for projecting the image onto the eye position at a desirable distance therefrom.
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
G02B 30/34 - Stereoscopes providing a stereoscopic pair of separated images corresponding to parallelly displaced views of the same object, e.g. 3D slide viewers
48.
SYSTEM AND METHOD FOR PROVIDING INCREASED SENSOR FIELD OF VIEW
A system and method for displaying sensor data on a display are provided herein. The system may include: a tracker arrangement to track a line of sight (LOS) of a user; a sensor configured to be directed based on the LOS and to capture data of a scene relative to said LOS, to yield LOS captured data; and a display configured to: receive the LOS captured data and display the LOS captured data relative to the LOS, wherein the display field of view (FOV) is wider than the sensor FOV, and wherein the display is configured to display a mosaic of a plurality of the LOS captured data, wherein at least one of the LOS captured data appearing in the mosaic is real-time LOS captured data, and wherein at least one of the LOS captured data appearing in the mosaic is previous LOS captured data.
09 - Scientific and electrical apparatus and instruments
Goods and services
Large area displays, namely, helmet mounted displays, head-up displays, panoramic head-down displays, and built-in aircraft displays, all for use in military and commercial aircraft for presenting flight and mission data
50.
In-flight training simulation displaying a virtual environment
Method and system for displaying virtual environment during in-flight simulation. A simulation environment is selected for a training simulation of an airborne platform operating in flight within a real environment. The position and orientation of a display viewable by an operator of the airborne platform is determined with respect to the selected simulation environment. The display displays at least one simulation image comprising a view from a virtual altitude of simulation environmental terrain in the selected simulation environment, while the airborne platform is in flight at a real altitude above the real environmental terrain in the real environment, the virtual altitude above the simulation environmental terrain being a lower altitude than the real altitude above the real environmental terrain. The simulation image is displayed in accordance with the determined position and orientation of the display, such that the simulation environment is adaptive to operator manipulations of the airborne platform.
G09B 9/24 - Simulators for teaching or training purposes for teaching the control of vehicles or other craft for teaching piloting of aircraft, e.g. blind-flying trainers, including display or recording of the simulated flight path
G09B 9/30 - Simulation of view from aircraft
G09B 9/44 - Simulators for teaching or training purposes for teaching the control of vehicles or other craft for teaching piloting of aircraft, e.g. blind-flying trainers, providing simulation in a real aircraft flying through the atmosphere without restriction of its path
G09B 9/46 - Simulators for teaching or training purposes for teaching the control of vehicles or other craft for teaching piloting of aircraft, e.g. blind-flying trainers, the aircraft being a helicopter
51.
SYSTEM AND METHOD FOR DETERMINING A RELATIVE MOTION BETWEEN TWO OR MORE OBJECTS
Systems and methods for determining a relative motion between two or more objects are disclosed. The system may include an excitation unit adapted to be disposed on a first object and configured to excite and to induce at least one change in at least a portion of a second object. The system may include a sensing unit adapted to be disposed on the first object, the sensing unit may include at least one sensor configured to detect the at least one change in the second object at two or more different time instances and to generate corresponding two or more sensor output datasets. The system may include a processing unit configured to determine a relative motion between the first object and the second object based on the two or more sensor output datasets.
G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G01P 3/68 - Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible or ultraviolet light
G06T 7/254 - Analysis of motion involving subtraction of images
52.
SYSTEM AND METHOD FOR DETERMINING A CONFIGURATION OF A MEASUREMENT VOLUME
A method of determining a configuration of a measurement volume, the method may include: generating, by at least one transmitter, a transmitted magnetic field within the measurement volume; measuring, by at least one receiver, a total magnetic field in the measurement volume at at least one receiver position and generating at least one receiver output signal; generating, by a processing unit, a measured dataset; comparing, by the processing unit, the measured dataset with at least one of at least two reference configuration datasets, each determined for one of at least two different configurations of the measurement volume; and identifying, by the processing unit, a reference configuration dataset of the at least two reference configuration datasets that corresponds to the measured dataset.
G01S 11/06 - Systems for determining distance or velocity not using reflection or reradiation using radio waves using intensity measurements
G01S 1/08 - Systems for determining direction or position line
A61B 5/06 - Devices, other than using radiation, for detecting or locating foreign bodies
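The comparison and identification steps in the method above can be sketched as a nearest-reference lookup: the measured field dataset is compared with per-configuration reference datasets, and the closest one (here by sum of squared differences) names the configuration. The dataset layout and the distance metric are assumptions for illustration.

```python
# Minimal sketch: identify which stored reference configuration dataset best
# matches a measured magnetic-field dataset. Flat numeric vectors and the
# squared-error metric are assumptions.

def identify_configuration(measured, references):
    """references: dict mapping configuration name -> reference dataset."""
    def dist(ref):
        return sum((m - r) ** 2 for m, r in zip(measured, ref))
    return min(references, key=lambda name: dist(references[name]))
```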
Aspects of embodiments pertain to a sensing system configured to receive scene electromagnetic (EM) radiation comprising a first wavelength (WL1) range and a second wavelength (WL2) range. The sensing system comprises at least one spectral filter configured to filter the received scene EM radiation to obtain EM radiation in the WL1 and WL2 ranges; and a self-adaptive electromagnetic (EM) energy attenuating structure. The self-adaptive EM energy attenuating structure may comprise material that includes nanosized particles which are configured such that high-intensity EM radiation in the WL1 range incident onto a portion of the self-adaptive EM energy attenuating structure causes interband excitation of one or more electron-hole pairs, thereby enabling intraband transition in the portion of the self-adaptive EM energy attenuating structure by EM radiation in the WL2 range.
H01L 31/0232 - Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation, and specially adapted either as converters of the energy of such radiation; Processes or apparatus specially adapted for the manufacture or treatment of these devices or of parts thereof; Details thereof; Optical elements or arrangements associated with the device
H01L 31/0352 - Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation, and specially adapted either as converters of the energy of such radiation; Processes or apparatus specially adapted for the manufacture or treatment of these devices or of parts thereof; Details thereof characterised by their semiconductor bodies characterised by their shape or by the shapes, relative sizes or disposition of the semiconductor regions
54.
SYSTEM AND METHOD FOR GENERATING A THREE-DIMENSIONAL (3D) MAP BASED ON MAPPING DESIGNATION INFORMATION
A system for generating a three-dimensional (3D) map of part of a field-of-view (FOV) of at least one detector of an active 3D scanner, the system comprising: the active 3D scanner, comprising: a scanning mechanism configured to scan the FOV; at least one energy emitting source configured to emit energy pulses, in synchronization with the scanning mechanism, to cover the FOV; and the at least one detector; and processing circuitry configured to: obtain mapping designation information independent of past readings obtained by the at least one detector, if any; selectively activate the energy emitting source to emit only a subset of the energy pulses, in accordance with the mapping designation information, to cover the part of the FOV; obtain current readings, from the at least one detector, based on reflections of the subset of the energy pulses; and generate the 3D map based on the current readings.
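The selective-activation step above, emitting only the subset of pulses that covers the designated part of the FOV, can be sketched as a filter over the scan order. Expressing the mapping designation information as azimuth/elevation bounds and pulse directions as angle tuples are assumptions for illustration.

```python
# Minimal sketch: given the scan order of pulse directions covering the FOV
# and mapping designation information expressed as angular bounds, select the
# indices of the pulses to emit. The designation format is an assumption.

def pulses_to_emit(scan_directions, designation):
    """scan_directions: list of (azimuth_deg, elevation_deg) in scan order.
    designation: (az_min, az_max, el_min, el_max) bounds of the part to map."""
    az_min, az_max, el_min, el_max = designation
    return [
        idx for idx, (az, el) in enumerate(scan_directions)
        if az_min <= az <= az_max and el_min <= el <= el_max
    ]
```

The emitter would then be triggered, in synchronization with the scanning mechanism, only at the returned scan positions, and the 3D map built from the corresponding reflections.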
09 - Scientific and electrical apparatus and instruments
Goods and services
Infrared search and track systems usable as wide field of
view surveillance units, for searching, detecting, tracking,
classifying and prioritizing potential targets, and
providing support with operational capabilities.
56.
System and method for providing synthetic information on a see-through device
A method and a system for displaying conformal synthetic data on a scene over a head-mounted see-through display (HMSTD) having a line of sight (LOS) are provided herein. The system may include: a tracker configured to track the LOS of the HMSTD; a display controller configured to display on the HMSTD a first display area comprising a synthetic image data conformal to a scene viewed from the HMSTD; wherein said display controller is configured to receive a desired point being a point within the scene which intersects the LOS of the HMSTD and to display a second display area on said HMSTD, wherein the second display area is positioned relative to said desired point, wherein said synthetic image data is displayed over the HMSTD at a reduced intensity on an overlap area between the first and the second areas.
A system for registering a coordinate system associated with a model of an object with a reference-coordinate-system, the object includes at least one marker, the system includes a portable-unit, a tracking-system and a processor. The portable unit includes a display and an optical-detection-assembly for acquiring at least one representation of the marker. The tracking-system tracks the position-and-orientation of the portable-unit in the reference-coordinate-system. The processor is configured to determine position-related-information respective of the marker in the reference-coordinate-system, to register the model with the reference-coordinate-system at least based on the position-related-information respective of the marker, and on a location of the marker in a coordinate system associated with the model, and to display registration-related-information on the display, at least one of the registration-related-information and the display location of the registration-related-information is related to the position-and-orientation of the portable-unit in the reference-coordinate-system.
A61B 5/055 - Detecting, measuring or recording for diagnostic purposes by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving nuclear [NMR] or electron [EMR] magnetic resonance, e.g. magnetic resonance imaging
A61B 34/10 - Computer-assisted planning, simulation or modelling of surgical operations
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
58.
Method and system for a dynamic collision awareness envelope for a vehicle
A system determines a dynamic collision awareness envelope for a vehicle. The system includes at least one vehicle motion sensor, an operator Line-Of-Sight detector and a processor. The vehicle motion sensor periodically provides measurements relating to the motion of the vehicle in a reference coordinate system. The operator Line-Of-Sight detector periodically provides information relating to the direction of the Line-Of-Sight of an operator of the vehicle, in a vehicle coordinate system. The processor is coupled with the at least one vehicle motion sensor, and with the operator Line-Of-Sight detector. The processor determines an operator vector from the direction of the Line-Of-Sight of the operator. The processor further determines an operational vector at least from the motion of the vehicle. The processor periodically determines a collision awareness envelope respective of each of the operational vectors, from the operator vector and the respective operational vector.
G01S 13/933 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
G01S 17/933 - Lidar systems, specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
G01S 19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Navigation Satellite System] or GALILEO
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
A mixed reality system, comprising: a data acquisition device configured to acquire real-world data; an output device for providing the real-world data to a user; and a processing circuitry configured to: obtain (a) data acquired by the data acquisition device, and (b) information of one or more virtual entities having properties enabling determination of simulated effects of the virtual entities on the data; determine the simulated effects of the virtual entities on the data utilizing the properties; and provide the user with output on the output device being a manipulation of the data reflecting the simulated effects.
A system for displaying videos, comprising a processing resource configured to: provide a data repository comprising a plurality of previously captured video segments (PCVSs) captured during previous operations of corresponding platforms, each being associated with metadata indicative of a Line-of-Sight (LoS) of a sensor, carried by the corresponding platform of the platforms used to capture the corresponding PCVS, with respect to a fixed coordinate system established in space, during capturing the corresponding PCVS; obtain an indication of a Region-of-Interest (RoI); identify one or more of the PCVSs that include at least part of the RoI, utilizing the LoSs associated with the PCVSs, giving rise to RoI matching PCVSs; and display at least part of at least one of the RoI matching PCVSs, being displayed RoI matching PCVSs, on a display of an operating platform to an operator of the operating platform during a current operation of the operating platform.
G01C 23/00 - Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
A system comprising a processing resource configured to: obtain a first indication of a confirmation, by an operator of a platform, of an alignment of one symbol of a first plurality of symbols with a second symbol as viewed in a head-mounted display of a head-mounting worn by the operator, the first plurality of symbols being projected by at least one optical apparatus disposed on the platform to a plurality of different viewing angles, the second symbol being projected onto the head-mounted display by a projection unit of the head-mounted display; determine which given symbol of the first plurality of symbols was aligned with the second symbol, the given symbol having first orientation data, the second symbol having second orientation data; and perform an alignment of the head-mounting and the head-mounted display at least based on the first orientation data and the second orientation data.
09 - Scientific and electrical apparatus and instruments
Goods and services
Infrared search and track systems usable as wide field of view surveillance units in the nature of infrared detection apparatus for searching, detecting, tracking, classifying and prioritizing potential targets, and providing support with operational capabilities
64.
MICROSURGERY SYSTEM FOR DISPLAYING IN REAL-TIME MAGNIFIED DIGITAL IMAGE SEQUENCES OF AN OPERATED AREA
A microsurgery system comprising a robotic arm, configured for movement; a head mounted display (HMD) configured to display to a user in real-time image sequences of an operated area; at least one camera coupled to said robotic arm, said at least one camera configured to acquire operated-area image sequences of said operated area, said at least one camera being suspended above said operated area and being mechanically and optically disconnected from said HMD, said robotic arm enables said at least one camera to capture said operated-area image sequences from a range of perspectives; a processing device configured to be coupled with said HMD and said at least one camera, said processing device configured to transmit said image sequences of said operated-area to said HMD; and a tracker configured to track at least one of a head of said user, a hand of said user, and a tool held by said user; wherein said robotic arm is enabled to be guided according to movements of tracked at least one of said head, said hand, and said tool.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
A61B 90/30 - Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
A method for automatically determining a potential transmission region, comprising: acquiring multiple binary indications for a direct line of sight between a transmitter and an airborne vehicle respective of multiple geo-positions of the airborne vehicle; acquiring each of the multiple geo-positions respective of each of the multiple binary indications for the direct line of sight; for each one of the acquired geo-positions, determining a layer of access respective of topographical data for the geographical region and the acquired geo-position, as a subset of the geographical region that includes at least one potential point defining a potential line of sight between the transmitter and the airborne vehicle, thereby determining multiple layers of access; determining an intersection of the multiple layers of access; and determining, from the intersection, the potential transmission region respective of the transmitter and the geo-positions of the airborne vehicle.
G01S 7/02 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
G01S 7/52 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
G01S 5/20 - Position of source determined by a plurality of spaced direction-finders
G01S 5/16 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
G01S 7/48 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group
G01S 5/14 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves determining absolute distances from a plurality of spaced points of known location
G01S 5/00 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations
G01S 7/00 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , ,
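The layers-of-access intersection described in the method above can be sketched compactly: each confirmed line-of-sight observation yields the subset of terrain cells from which the transmitter could have had line of sight to the vehicle at that geo-position, and intersecting all such subsets narrows down the potential transmission region. The grid model and the line-of-sight predicate below are stand-in assumptions, not the claimed computation.

```python
# Hedged sketch: intersect per-geo-position "layers of access" to obtain the
# potential transmission region. The has_los predicate stands in for a real
# topographical line-of-sight test.
def layer_of_access(grid_cells, geo_position, has_los):
    """Cells of the region affording line of sight for this geo-position."""
    return {cell for cell in grid_cells if has_los(cell, geo_position)}

def potential_transmission_region(grid_cells, geo_positions, has_los):
    """Intersection of all layers of access over the acquired geo-positions."""
    region = set(grid_cells)
    for geo in geo_positions:
        region &= layer_of_access(grid_cells, geo, has_los)
    return region

# Toy terrain: a cell 'sees' a geo-position if their x-coordinates differ by < 2.
cells = [(x, y) for x in range(4) for y in range(2)]
positions = [(1, 0), (2, 0)]
los = lambda cell, pos: abs(cell[0] - pos[0]) < 2
region = potential_transmission_region(cells, positions, los)
```

Each additional binary indication shrinks (or leaves unchanged) the candidate region, which is the point of the intersection step.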
66.
Gradual transitioning between two-dimensional and three-dimensional augmented reality images
System and method for enhancing situational awareness. A moveable see-through display viewable by a user displays an augmented reality 2D image of an external scene based on received 2D image data, in accordance with updated position and orientation of display. The see-through display further displays an augmented reality 3D image of the external scene based on received 3D image data, the 3D image overlaid conformally onto view of external scene, in accordance with updated position and orientation of display. The see-through display further selectively displays: a gradual transition of the 2D image into the 3D image, or a gradual transition of the 3D image into the 2D image. At least one image feature may gradually appear or gradually disappear during the gradual transition. The 2D or 3D image may include a region of interest based on updated position and orientation of display or selected by user.
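The gradual transition described above can be illustrated with a minimal cross-fade sketch, assuming (which the abstract does not state) that the 2D and 3D layers are each rendered to a per-pixel intensity buffer: a transition parameter t in [0, 1] fades one layer into the other, and a per-feature threshold lets individual image features gradually appear or disappear mid-transition. All names are hypothetical.

```python
# Hedged sketch of a gradual 2D <-> 3D transition via per-pixel cross-fade.
def blend_pixel(p2d, p3d, t):
    """Cross-fade a pixel from the 2D layer (t=0) to the 3D layer (t=1)."""
    return (1.0 - t) * p2d + t * p3d

def feature_visible(feature_threshold, t):
    """An image feature fades in once the transition passes its threshold."""
    return t >= feature_threshold

frame_2d = [0.8, 0.2, 0.5]
frame_3d = [0.1, 0.9, 0.5]
halfway = [blend_pixel(a, b, 0.5) for a, b in zip(frame_2d, frame_3d)]
```

Running t from 0 to 1 gives the 2D-to-3D transition; running it back gives the reverse.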
A system for displaying, on a see-through display located within a moving platform, a frame, while at least partially correcting a rolling display effect.
G02B 23/12 - Telescopes, e.g. binocular; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical sighting or aiming devices - with means for image conversion or intensification
System and method for determining position of head-mounted image sensor using celestial navigation. A head-mounted image sensor, worn by an operator, captures at least one image of an external scene comprising a celestial view, pursuant to natural head movements of operator. A processor receives orientation of head-mounted image sensor in reference coordinate system, extracts parameters of celestial bodies using stored celestial data, captured image, and received orientation, and determines position of head-mounted image sensor based on extracted parameters, such as based on difference between relative angle and expected relative angle of single celestial body or constellation of celestial bodies in captured image. The image sensor may be situated in a mobile platform. A default geolocation device, such as an IMU subject to angular drifts, may provide a default geolocation estimate of mobile platform, and may be monitored, updated or calibrated using determined position of head-mounted image sensor.
G01C 21/02 - Navigation; Navigational instruments not provided for in groups , by astronomical means
G01C 21/16 - Navigation; Navigational instruments not provided for in groups , by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
A method and a system are provided herein for calculating whether or not a specific aerial vehicle at a specified point of time can maneuver over a given location in the terrain while complying with terrain clearance requirements. The system may include a computer memory configured to store a 3D model representing at least a portion of a terrain located in a vicinity of an aerial vehicle; a computer processor configured to map said portion of the terrain into at least two types: a first type indicative of a potential of the aerial vehicle to maneuver over a respective terrain while complying with terrain clearance, and a second type indicative of a non-potential of said aerial vehicle to maneuver over a respective terrain, wherein the mapping is carried out based on said parameters, the 3D model and given predefined performance of the aerial vehicle.
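The two-type terrain mapping described above can be sketched under deliberately crude assumptions (not the patented performance model): a terrain cell is classified as maneuverable when the aircraft's predefined climb performance lets it clear the cell's elevation plus the required clearance within the available distance. All names and the linear climb model are illustrative.

```python
# Hedged sketch of mapping terrain cells into 'maneuverable' /
# 'non-maneuverable' types from a 3D model plus predefined performance.
def classify_cell(cell_elevation_m, distance_m, altitude_m,
                  climb_gradient, clearance_m):
    """Return the type of one terrain cell under a linear climb model."""
    reachable_altitude = altitude_m + climb_gradient * distance_m
    if reachable_altitude >= cell_elevation_m + clearance_m:
        return "maneuverable"
    return "non-maneuverable"

def map_terrain(cells, altitude_m, climb_gradient, clearance_m):
    """Map (elevation, distance) cells to their maneuverability type."""
    return {
        (elev, dist): classify_cell(elev, dist, altitude_m,
                                    climb_gradient, clearance_m)
        for elev, dist in cells
    }

result = map_terrain([(500.0, 1000.0), (900.0, 1000.0)],
                     altitude_m=600.0, climb_gradient=0.15, clearance_m=100.0)
```

The resulting two-type map is the kind of output the computer processor above would produce for display or planning.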
A system for displaying videos, comprising a processing resource configured to: provide a data repository comprising a plurality of previously captured video segments (PCVSs) captured during previous operations of corresponding platforms, each being associated with metadata indicative of a Line-of-Sight (LoS) of a sensor, carried by the corresponding platform of the platforms used to capture the corresponding PCVS, with respect to a fixed coordinate system established in space, during capturing the corresponding PCVS; obtain an indication of a Region-of-Interest (RoI); identify one or more of the PCVSs that include at least part of the RoI, utilizing the LoSs associated with the PCVSs, giving rise to RoI matching PCVSs; and display at least part of at least one of the RoI matching PCVSs, being displayed RoI matching PCVSs, on a display of an operating platform to an operator of the operating platform during a current operation of the operating platform.
B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
G06K 9/46 - Extraction of features or characteristics of the image
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
71.
System and method for providing synthetic information on a see-through device
A system for displaying combined image data from several sources on a head-mounted see-through display (HMSTD) defining a line of sight (LOS) is provided herein. The system includes: a tracker configured to track the LOS of the HMSTD; a display controller configured to display a first display area including a synthetic image data conformal to a scene viewed via the HMSTD; a trigger mechanism configured to select a desired point within the scene which intersects the LOS of the HMSTD at a time of a triggering event, wherein the display controller is configured to receive the desired point and to display a second display area on the HMSTD, wherein the second display area is positioned relative to the desired point, and wherein said display controller is further configured to modify the synthetic image data in a portion of the first display area covered by the second display area.
A method for initiating a gesture-based mutual interaction scheme between a first and second mobile device, comprising: associating a gesture-based mutual interaction scheme between the first and second mobile device that associates a position scheme with a respective action, where the position scheme relates to any of: an absolute or relative orientation, and an absolute or relative trajectory; acquiring a first position property of the first mobile device and a second position property of the second mobile device; determining that each of the first and second position properties comply with the position scheme; triggering an execution of an action on the second mobile device, where the action is associated with the position scheme that the first position property complies with; where the second mobile device conditions the execution of the action triggered by the first mobile device on the compliance of the second position property with the position scheme.
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
G06F 1/16 - ELECTRIC DIGITAL DATA PROCESSING - Details not covered by groups and - Constructional details or arrangements
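The mutual compliance condition in the method above lends itself to a small sketch: an action associated with a position scheme runs on the second device only if both devices' position properties comply with that scheme. The scheme representation (an orientation window) and all names are illustrative assumptions.

```python
# Hedged sketch of the gesture-based mutual interaction scheme: both devices
# must satisfy the same position scheme before the associated action fires.
def complies(orientation_deg, scheme):
    """Check one device's position property against the scheme's window."""
    low, high = scheme["orientation_range_deg"]
    return low <= orientation_deg <= high

def trigger_action(scheme, first_orientation, second_orientation):
    """Return the scheme's action only if BOTH devices comply, else None."""
    if complies(first_orientation, scheme) and complies(second_orientation, scheme):
        return scheme["action"]
    return None

scheme = {"orientation_range_deg": (80, 100), "action": "pair-devices"}
result = trigger_action(scheme, first_orientation=92, second_orientation=85)
```

The conditional return mirrors how the second device conditions execution on its own compliance, not just on the first device's trigger.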
Optical systems and methods are provided, which combine a see-through view of the real world and display source images using a conical optical combiner cut to have flat surfaces normal to the viewer's line of sight. The conical shape minimizes interference with the view of the real world, as the edges of the optical combiner are tangent to the viewer's field of view and the inner part of the optical combiner is semitransparent. Additionally, the optical system comprises a beam splitter and one or more shutters for attenuating or blocking the see-through path, and may employ a polarizing element to improve the contrast between the scene observation and the projected display, thus enabling selective viewing of either. The system may also be configured to enable diopter adjustment and virtual display distance adjustment.
G02F 1/00 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
09 - Scientific and electrical apparatus and instruments
Goods and Services
(1) Computer hardware and software based command and control systems for planning and managing military operations, consisting of communication and ground positioning systems, officer's operation terminals, operator's sensor terminals, mobile and stationary commander's terminals, mobile and stationary patrol terminals, parts and fittings for the afore-defined systems
75.
Thermal management of printed circuit board components
A first thermal management approach involves an air flow through cooling mechanism with multiple airflow channels for dissipating heat generated in a PCA. The air flow direction through at least one of the channels is different from the air flow direction through at least another of the channels. Alternatively or additionally, the airflow inlet of at least one channel is off-axis with respect to the airflow outlet. A second thermal management approach involves the fabrication of a PCB with enhanced durability by mitigating via cracking or PTH fatigue. At least one PCB layer is composed of a base material formed from a 3D woven fiberglass fabric, and conductive material deposited onto the base material surface. A conductive PTH extends through the base material of multiple PCB layers, where the CTE of the base material along the z-axis direction substantially matches the CTE of the conductive material along the x-axis direction.
Imaging systems and methods are provided, which implement wide field imaging of a region for medical procedures and provide tracking of tools and tissues in the whole region while providing digitally magnified images of a portion of the captured region. Optical tracking may be implemented by stereoscopic imaging, and various elements may be optically tracked such as various markers and fiducials, as well as certain shapes and objects which are optically identifiable by image processing.
A system and method for displaying sensor data on a display are provided herein. The system may include: a tracker arrangement to track the line of sight (LOS) of a user; a sensor configured to be directed based on the LOS and to capture data of a scene relative to said LOS, to yield LOS captured data; and a display configured to receive the LOS captured data and display it relative to the LOS, wherein the display field of view (FOV) is wider than the sensor FOV, and wherein the display is configured to display a mosaic of a plurality of the LOS captured data, wherein at least one of the LOS captured data items appearing in the mosaic is real-time LOS captured data, and wherein at least one of the LOS captured data items appearing in the mosaic is previous LOS captured data.
H04N 5/232 - Devices for controlling television cameras, e.g. remote control
G06T 7/70 - Determining position or orientation of objects or cameras
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
G06F 16/487 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or manually generated metadata, using geographical or spatial information, e.g. location
LCD resizing methods. In first method, display is situated in vacuum chamber under vacuum conditions, and temperature is decreased below freezing temperature of display image-generating medium. A groove is formed along selected groove dimensions, extending through one plate and terminating within other plate. Adhesive is applied into formed groove, and display temperature is increased above adhesive liquidation temperature. Vacuum pressure is increased to at least atmospheric pressure, adhesive is polymerized to form seal, and plates are severed. In second method, grooves are formed in each plate along respective groove dimensions. A compressive force is applied against intended remaining plate sections to prevent air penetrating between plates and seepage of image-generating medium. Display plates are severed along respective groove dimensions, forming an exposed inner surface at plate edge. Adhesive is applied at exposed surface, compressive force is reduced allowing adhesive to permeate between plates, and adhesive is polymerized to form seal.
G02F 1/1339 - Constructional arrangements - Spacers; Sealing of cells
B32B 17/06 - Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the only or main constituent of a layer, next to another layer of a specific substance
79.
SYSTEMS AND METHODS FOR REDUCING IMAGE ARTEFACTS IN BINOCULAR DISPLAYS
Aspects of embodiments pertain to a method for reducing a subjective visual artefact when displaying binocular overlapping images to a user of a binocular display system, the method comprising: generating, by an image display unit comprising a plurality of pixel elements, right- and left-eye source images; projecting the right- and left-eye source images via corresponding left and right viewing optics to the user such that the user perceives partially overlapping left and right-hand observation images; and reducing a perception of a subjective visual artefact in the perceived right and/or left-hand observation images by modifying one or more pixel and/or image parameter values relating to the left and/or right-hand source images.
A direct view display system (DVDS) and a method to operate it are provided herein. The DVDS includes a variable optical power element (VOPE); a transparent active image source located with the VOPE on a common optical axis going from an outside scene to an eye position of a viewer; and a time division multiplexer (TDM) configured to control the VOPE and the transparent active image source, wherein the TDM is configured in a certain time period to cause the transparent active image source to be in a transparent state and the VOPE to exhibit no optical power, and wherein the TDM is configured in another time period to cause said transparent active image source to exhibit an image and said VOPE to apply non-zero optical power, for projecting the image onto the eye position at a desirable distance therefrom.
A system comprising a processing resource configured to: obtain a first indication of a confirmation, by an operator of a platform, of an alignment of one symbol of a first plurality of symbols with a second symbol as viewed in a head-mounted display of a head-mounting worn by the operator, the first plurality of symbols being projected by at least one optical apparatus disposed on the platform to a plurality of different viewing angles, the second symbol being projected onto the head-mounted display by a projection unit of the head-mounted display; determine which given symbol of the first plurality of symbols was aligned with the second, the given symbol having first orientation data, the second symbol having second orientation data; and perform an alignment of the head-mounting and the head-mounted display at least based on the first orientation data and the second orientation data.
Aspects of embodiments pertain to a display illumination optics for illuminating an image display device of an image generation apparatus, the display illumination optics comprising: a source illumination distributor that includes an illumination waveguide having a front surface and a back surface opposite the front surface and configured to internally direct light along a main direction, wherein the illumination waveguide is configured to distribute the luminance of input illumination light along the main direction of the display illumination optics to obtain, along the main direction, output illumination light of desired luminance.
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics - for the control of the intensity, phase, polarisation or colour
09 - Scientific and electrical apparatus and instruments
Goods and Services
Computer hardware and software based command and control systems for planning and managing military operations, consisting of communication and ground positioning systems, officer's operation terminals, operator's sensor terminals, mobile and stationary commander's terminals, mobile and stationary patrol terminals; parts and fittings for the afore-defined systems.
84.
Augmented reality display reflective of visibility affecting features in real-world environment
Method and system for displaying augmented reality reflective of environmental features affecting visibility. Characteristics of a virtual object to be displayed on view of scene are determined. Environmental features affecting visibility along a line-of-sight from scene origin to virtual object are detected. When detected feature is at least one non-obstructing feature, its effect on visibility is determined, and virtual object is displayed superimposed onto view of scene such that appearance of virtual object is consistent with determined effect on visibility. When detected feature includes an amorphous obstructing feature, its range and contour are determined, and obstructed portions of virtual object are determined based on difference between range of virtual object and range of amorphous obstructing feature, and virtual object is displayed superimposed onto view of scene such that determined obstructed portions of virtual object appear obstructed in displayed view.
G02B 23/12 - Telescopes, e.g. binocular; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical sighting or aiming devices - with means for image conversion or intensification
G09G 5/37 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory - Details of the operation on graphic patterns
G06T 11/60 - Editing figures and text; Combining figures or text
G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position
B64D 43/00 - Arrangements or adaptations of instruments
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G01C 23/00 - Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selection or transformation of a displayed object, image or text element, setting a parameter value or selecting a range of values
B60K 35/00 - Arrangement or adaptations of instruments
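The two visibility cases in the abstract above can be sketched under simplifying assumptions: a non-obstructing feature (e.g. haze) attenuates the virtual object's opacity as a function of range, while an amorphous obstructing feature (e.g. a cloud bank) hides any virtual-object point that lies beyond the feature's range along the line of sight. The Beer-Lambert-style attenuation model and all names are illustrative, not the patented method.

```python
# Hedged sketch of visibility-aware AR rendering decisions.
import math

def attenuated_opacity(base_opacity, range_m, extinction_per_m):
    """Exponential attenuation for a non-obstructing feature such as haze."""
    return base_opacity * math.exp(-extinction_per_m * range_m)

def obstructed(point_range_m, feature_range_m, point_in_feature_contour):
    """A virtual-object point is hidden if the obstructing feature is nearer
    along the line of sight and the point falls within the feature contour."""
    return point_in_feature_contour and feature_range_m < point_range_m

# A virtual point at 1200 m behind a cloud bank at 800 m is hidden.
visible = not obstructed(1200.0, 800.0, point_in_feature_contour=True)
```

Rendering would then dim the object per `attenuated_opacity` in the first case and mask the obstructed portions in the second.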
Method for verifying registration of a model of an internal-body-part with the internal-body-part in a reference coordinate system. The internal-body-part is at least partially unseen directly by a user. The method includes the procedures of continuously determining a position and orientation of a head-mounted-display in the reference coordinate system, determining a display location of at least one virtual marker, and displaying the virtual marker according to the display location. The display location of the virtual marker is determined according to the expected position of a respective at least one reference point relative to the head-mounted-display. The reference point is directly visible to the user. The relative position between the reference point and the internal-body-part is substantially constant. The position of the reference point in the reference coordinate system is predetermined. When the model and the internal-body-part are effectively registered, the reference point and corresponding virtual marker appear visually in alignment.
G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
G06T 11/60 - Editing figures and text; Combining figures or text
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
09 - Scientific and electrical apparatus and instruments
Goods and Services
Computer hardware and preinstalled software based command and control systems for planning and managing military operations, consisting of communication and ground positioning systems, officer's operation computer terminals, operator's sensor computer terminals, mobile and stationary commander's computer terminals, mobile and stationary patrol computer terminals; parts and fittings for the afore-defined systems
Head-mounted see-through displays and methods are provided which improve helicopter displays and provide more relevant information to the pilot while reducing the information load on the display. Displays comprise a peripheral pilot-scene pitch indication region, indicating scene orientation information according to an orientation of the pilot's head with respect to the scene, a propulsion state indication region, indicating rotor and engines states with respect to their nominal operation states by presenting only deviations of the rotor and engines states from their nominal operation states, a speed indication region, indicating, in association with displayed air speed, a calculated ground speed or a calculated ground speed component in a direction of propagation and a helicopter pitch indication region displaying a pitch ladder which is re-arranged to space close pitch lines and crowd remote pitch lines.
G01C 23/00 - Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
Method and system for displaying virtual environment during in-flight simulation. A simulation environment is selected for a training simulation of an airborne platform operating in flight within a real environment. The position and orientation of a display viewable by an operator of the airborne platform is determined with respect to the selected simulation environment. The display displays at least one simulation image comprising a view from a virtual altitude of simulation environmental terrain in the selected simulation environment, while the airborne platform is in flight at a real altitude above the real environmental terrain in the real environment, the virtual altitude above the simulation environmental terrain being a lower altitude than the real altitude above the real environmental terrain. The simulation image is displayed in accordance with the determined position and orientation of the display, such that the simulation environment is adaptive to operator manipulations of the airborne platform.
G09B 9/30 - Simulation of view from aircraft
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G09B 9/24 - Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. blind-flying training rigs, including display or recording of simulated flight path
91.
METHOD AND SYSTEM FOR A DYNAMIC COLLISION AWARENESS ENVELOPE FOR A VEHICLE
A system for determining a dynamic collision awareness envelope for a vehicle. The system includes at least one vehicle motion sensor, an operator Line-Of-Sight detector and a processor. The vehicle motion sensor periodically provides measurements relating to the motion of the vehicle in a reference coordinate system. The operator Line-Of-Sight detector periodically provides information relating to the direction of the Line-Of-Sight of an operator of the vehicle, in a vehicle coordinate system. The processor is coupled with the at least one vehicle motion sensor, and with the operator Line-Of-Sight detector. The processor determines an operator vector from the direction of the Line-Of-Sight of the operator. The processor further determines an operational vector at least from the motion of the vehicle. The processor periodically determines a collision awareness envelope respective of each of the operational vectors, from the operator vector and the respective operational vector.
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
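One plausible reading of the envelope computation above, offered as an assumed model rather than the patented one: the operator vector comes from the tracked Line-of-Sight, the operational vector from the vehicle's motion, and the awareness envelope grows with the angle between them, on the reasoning that an unattended direction of travel deserves a larger warning envelope. The linear gain and all names are illustrative.

```python
# Hedged sketch: widen a collision awareness envelope as the mismatch angle
# between operator Line-of-Sight and vehicle motion grows.
import math

def angle_between(u, v):
    """Angle in radians between two 2D vectors (clamped for safety)."""
    dot = u[0] * v[0] + u[1] * v[1]
    nu = math.hypot(*u)
    nv = math.hypot(*v)
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def envelope_radius(base_radius_m, operator_los, motion_vector, gain_m=10.0):
    """Envelope radius grows linearly with the attention/motion mismatch."""
    mismatch = angle_between(operator_los, motion_vector)  # 0 .. pi radians
    return base_radius_m + gain_m * mismatch

r_aligned = envelope_radius(5.0, (1.0, 0.0), (1.0, 0.0))    # looking ahead
r_opposed = envelope_radius(5.0, (1.0, 0.0), (-1.0, 0.0))   # looking away
```

The periodic re-evaluation in the claim corresponds to recomputing `envelope_radius` each time new sensor and LoS measurements arrive.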
A system for management of a plurality of projects developed by a plurality of teams, each project of the projects spanning a plurality of consecutive time-periods and having a plurality of milestones defined by a client of the respective project, each milestone defining corresponding stages of the respective project, and each milestone having a milestone due-date.
G06Q 10/06 - Resources, task, human-resource or project management; Enterprise or organisation planning; Enterprise or organisation modelling
Methods and systems are provided for using a user's line of sight (LOS) and/or gaze direction to enhance displayed data and control various displays and instruments. The user's LOS is tracked, corresponding element(s) in a scene at which the user gazes are identified, e.g., in a database, and respective displayed data are enhanced or otherwise manipulated with respect to the identified element(s). The database may be multilayered and may comprise data layers which are conformal to a scene representation. Users may also select, using their LOS, among multiple layers of information and among multiple displayed parts to enhance or attenuate respective layers or parts. Designated elements may be real-world elements, displayed elements or instruments in the operational surroundings of the user. Identification of elements at which the LOSs of multiple users are aimed may be used for the exchange of information among the users.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or transforming a displayed object, image or text element, setting a parameter value or selecting a range of values
G06F 16/29 - Geographical information databases
G06T 19/00 - Manipulating 3D models or images for computer graphics
H04W 4/029 - Location-based management or tracking services
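The LOS-driven selection described in the abstract above amounts to a hit test (which element does the gaze land on?) followed by a layer manipulation. A minimal sketch under simplifying assumptions: the `Element` record, its 2D display-plane bounding box, and the toggle rule in `enhance_layer` are illustrative, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    layer: str               # data layer this element belongs to
    bbox: tuple              # (x0, y0, x1, y1) in display coordinates

def element_at_gaze(elements, gaze_xy):
    """Return the first element whose bounding box contains the gaze point."""
    x, y = gaze_xy
    for e in elements:
        x0, y0, x1, y1 = e.bbox
        if x0 <= x <= x1 and y0 <= y <= y1:
            return e
    return None

def enhance_layer(layer_states, element):
    """Toggle enhancement of the data layer the gazed-at element belongs to."""
    if element is not None:
        layer_states[element.layer] = not layer_states.get(element.layer, False)
    return layer_states
```

A real system would cast the 3D LOS ray against a conformal scene database rather than test 2D boxes, but the select-then-enhance flow is the same.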
95.
System and method for determining the position and orientation of a tool tip relative to eye tissue of interest
System for determining a position of a tool-point-of-interest of a tool, relative to an eye-tissue-of-interest, which includes an imager, a tool-tracker and a processor. The imager acquires an image of a tissue-reference-marker. The tool-tracker determines information relating to the P&O of the tool in a reference-coordinate-system. The imager and the tool-tracker are in registration with the reference-coordinate-system. The processor determines the position of the tissue-reference-marker in the reference-coordinate-system, according to the acquired image of the tissue-reference-marker. The processor determines the P&O of the eye-tissue-of-interest in the reference-coordinate-system according to at least the position of the tissue-reference-marker and a relative position between the tissue-reference-marker and the eye-tissue-of-interest. The relative position is pre-determined from a stored-3D-model. The processor determines the position of the tool-point-of-interest in the reference-coordinate-system from the P&O of the tool in the reference-coordinate-system. The processor also determines a relative position between the tool-point-of-interest and the eye-tissue-of-interest.
A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
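The geometric core of the abstract above is expressing the tool tip and the tissue in one reference frame and subtracting. A minimal sketch, assuming the tracked pose is given as a rotation matrix plus translation and the tip offset is fixed in the tool's own frame; the function names are hypothetical.

```python
def mat_vec(R, v):
    """Apply a 3x3 rotation matrix (tuple of rows) to a 3-vector."""
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

def tool_tip_in_reference(R_tool, t_tool, tip_offset):
    """Position of the tool-point-of-interest in the reference frame.

    (R_tool, t_tool) is the tracked P&O of the tool body; tip_offset is
    the fixed position of the tip in the tool's own coordinate system.
    p_ref = R_tool * tip_offset + t_tool
    """
    rotated = mat_vec(R_tool, tip_offset)
    return tuple(r + t for r, t in zip(rotated, t_tool))

def tip_to_tissue(tip_ref, tissue_ref):
    """Relative position vector from the eye-tissue-of-interest to the tip,
    both already expressed in the reference coordinate system."""
    return tuple(a - b for a, b in zip(tip_ref, tissue_ref))
```

The tissue position on the right-hand side would itself come from the tracked tissue-reference-marker plus the marker-to-tissue offset taken from the stored 3D model, by the same rigid-transform composition.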
09 - Scientific and electric apparatus and instruments
Goods and services
Recorded computer software for developing and designing helmets; recorded computer software for developing and designing decorations and labels for helmets
97.
Microsurgery system for displaying in real time magnified digital image sequences of an operated area
A system captures and displays video of surgeries. The system may include at least one digital image sensor optically coupled to one or more lenses and configured to capture a video sequence of a scene in a surgery; at least one interface configured to receive at least one region of interest (ROI) of the captured video sequence; an electronic display, selected so that at least one of the digital image sensors has a pixel resolution which is substantially greater than the pixel resolution of the electronic display; and a computer processor configured to: receive the at least one captured video sequence and the at least one received ROI and display over the at least one electronic display a portion of the captured video sequence based on the at least one received ROI.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 90/30 - Devices for illuminating a surgical field, the devices being interrelated with other surgical devices or with a surgical procedure
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
A61B 5/11 - Measuring movement of the whole body or parts thereof, e.g. head or hand tremor or mobility of a limb
A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
A61B 90/20 - Surgical microscopes characterised by non-optical aspects
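Because the sensor resolution in the abstract above substantially exceeds the display resolution, showing a magnified ROI reduces to cropping the full-resolution frame rather than upscaling. A minimal sketch with frames as plain row lists; the helper names and the 1:1-fit check are illustrative assumptions.

```python
def crop_roi(frame, roi):
    """Extract an ROI (x0, y0, x1, y1) from a frame stored as rows of pixels.

    Since the sensor is much denser than the display, a 1:1 crop of the
    ROI already fills the screen at native detail; the crop is plain
    row/column slicing with no resampling.
    """
    x0, y0, x1, y1 = roi
    return [row[x0:x1] for row in frame[y0:y1]]

def fits_display(roi, display_w, display_h):
    """True if the ROI can be shown pixel-for-pixel on the display."""
    x0, y0, x1, y1 = roi
    return (x1 - x0) <= display_w and (y1 - y0) <= display_h
```

A production pipeline would operate on GPU textures and add scaling for ROIs larger than the display, but the resolution-headroom idea is the same.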
98.
SELF-CALIBRATING INERTIAL MEASUREMENT SYSTEM AND METHOD
An inertial measurement system comprising at least one sensor cluster comprising a plurality of inertial sensors for sampling at least one of acceleration and angular velocity of said at least one sensor cluster with respect to each axis in a plurality of axes of a reference frame, and for producing individual outputs associated with said at least one of acceleration and angular velocity, at least three of said inertial sensors performing said sampling with respect to each same respective axis; and a processing engine for receiving said individual outputs, combining said individual outputs to yield respective combined outputs, and detecting, according to a decision rule, which of said individual outputs diverge from at least one of their inter-comparison and their respective combined output, said processing engine configured to dynamically self-calibrate a parameter that includes the individual scale factor of those of said inertial sensors whose individual outputs were detected to diverge.
G01C 25/00 - Manufacture, calibration, cleaning, or repair of instruments or devices referred to in the other groups of this subclass
G01C 21/16 - Navigation; Navigational instruments not provided for in the preceding groups, using measurement of speed or acceleration performed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
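One way to picture the combine/detect/recalibrate cycle of the abstract above: with three or more sensors per axis, a robust combination (a median here) serves as the reference, and any sensor diverging from it gets its scale factor adjusted. This is a sketch under stated assumptions; the median combiner, the relative-error decision rule, and the ratio-based adjustment are illustrative stand-ins for the patented decision rule.

```python
import statistics

def self_calibrate_axis(samples, scale_factors, threshold=0.05):
    """One self-calibration step for redundant sensors on a single axis.

    samples       -- raw outputs of the (>= 3) sensors measuring this axis
    scale_factors -- current per-sensor scale factors (adjusted in place)
    threshold     -- relative divergence beyond which a sensor is recalibrated

    Calibrated outputs are combined into a median; any sensor whose
    calibrated output diverges from that combined output has its
    individual scale factor reset so it agrees with the combination.
    """
    calibrated = [s * k for s, k in zip(samples, scale_factors)]
    combined = statistics.median(calibrated)
    for i, c in enumerate(calibrated):
        if combined != 0 and abs(c - combined) / abs(combined) > threshold:
            scale_factors[i] = combined / samples[i]   # agree on the next sample
    return combined, scale_factors
```

With at least three sensors per axis the median is unaffected by a single drifting sensor, which is why the recalibration can run continuously ("dynamically") without an external reference.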
99.
Fault tolerant LCD display using redundant drivers, select lines, data lines, and switches
A display device comprising: a plurality of pixels, each pixel including at least one sub-pixel; each sub-pixel comprising: a drivable visual segment, being operative to exhibit at least a first visible state and a second visible state; a first electrical potential setting section coupled with the drivable visual segment and with a first select terminal and a first data terminal, the first electrical potential setting section being operative to drive the drivable visual segment, at least from the first visible state to the second visible state; a second electrical potential setting section coupled with the drivable visual segment and with a second select terminal and a second data terminal, the second electrical potential setting section being operative to drive the drivable visual segment, independently from the first electrical potential setting section, at least from the first visible state to said second visible state.
G09G 3/36 - Control arrangements or circuits, of interest only in connection with display devices other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, by controlling light from an independent source, using liquid crystals
G02F 1/1368 - Active-matrix addressed cells in which the switching element is a three-electrode device
G02F 1/1362 - Active-matrix addressed cells
A map data structure, and a method and system for obtaining geographic information. Raw map data is converted into the map data structure by generating an attribute layer for each geographic attribute according to a defined line pattern and line width, and by defining a bit-array of fixed bit-length for each element of a grid representing geographic coordinates, where a bit value of "1" is assigned if the respective attribute is present at the grid element location and a bit value of "0" is assigned if the respective attribute is absent. A map image depicting selected geographic attributes may be generated and displayed from the map data structure by sequentially scanning each bit in the bit-array, in descending order of priority, for each grid element of the selected location; if the binary value of a scanned bit is "1", the attribute layer associated with that bit is displayed according to its assigned color value, provided the attribute is selected for display.
G01C 21/32 - Structuring or formatting of map data
G01C 21/20 - Instruments for performing navigational calculations
G06F 16/909 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using geographical or spatial information, e.g. location
G06F 16/29 - Geographical information databases
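The bit-array grid and priority-ordered scan described in the abstract above map naturally onto per-cell bit masks. A minimal sketch, assuming layer names are listed highest-priority first and bit i corresponds to layer i; the class name `MapGrid` and its methods are illustrative, not the disclosed structure.

```python
class MapGrid:
    """Sketch of the described bit-array map structure.

    Each grid element holds a fixed-length bit-array; bit i is 1 if
    attribute layer i is present at that location. Layers are ordered
    by descending display priority, so rendering a cell scans its bits
    from highest priority down and paints the first layer that is both
    present ("1") and selected for display.
    """

    def __init__(self, width, height, layers):
        self.layers = layers                         # names, highest priority first
        self.colors = {}                             # layer name -> color value
        self.cells = [[0] * width for _ in range(height)]

    def set_attribute(self, x, y, layer):
        """Mark an attribute as present at grid element (x, y)."""
        self.cells[y][x] |= 1 << self.layers.index(layer)

    def render_cell(self, x, y, selected):
        """Color of cell (x, y) given the set of layers selected for display."""
        bits = self.cells[y][x]
        for i, layer in enumerate(self.layers):      # descending priority scan
            if bits & (1 << i) and layer in selected:
                return self.colors.get(layer)
        return None                                  # no selected attribute here
```

Because presence is one bit per attribute, cell storage is fixed-length regardless of how many attributes overlap, and deselecting a layer is a pure rendering decision with no change to the stored data.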