A surgical robotic system comprising: a surgical robotic arm having a plurality of robotic arm links and a plurality of joints operable to move according to multiple degrees of freedom; a proximity sensor coupled to the surgical robotic arm, the proximity sensor comprising a plurality of sensing pads operable to detect a movement of a nearby controlling object prior to contact with the surgical robotic arm; and a processor configured to determine a desired position of the surgical robotic arm based on the detected movement of the nearby controlling object and drive a movement of more than one of the plurality of robotic arm links or the plurality of joints to achieve the desired position of the surgical robotic arm.
A method performed by a video controller. The method receives a first video stream captured by an endoscope of a surgical system, and receives a second video stream that comprises surgical data. The method displays the second video stream superimposed above an area of the first video stream, and determines that the second video stream is to cease being superimposed. Responsive to determining that the second video stream is to cease being superimposed, the method continues to display the first video stream.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
3.
IMMERSIVE THREE-DIMENSIONAL DISPLAY FOR ROBOTIC SURGERY
An immersive display for use in a robotic surgical system includes a support arm, a housing mounted to the support arm and configured to engage with a face of the user, at least two eyepiece assemblies disposed in the housing and configured to provide a three-dimensional display, and at least one sensor, wherein the sensor enables operation of the robotic surgical system, and wherein the support arm is actuatable to move the housing for ergonomic positioning.
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
A61B 34/35 - Surgical robots for telesurgery
A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06T 19/00 - Manipulating 3D models or images for computer graphics
H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
A surgical robotic system is disclosed to include an operating table, a plurality of robotic arms and surgical instruments, a user console, and a control tower. The plurality of robotic arms are mounted on the operating table and can be folded and stowed under the table for storage. The user console has one or more user interface devices, which function as master devices to control the plurality of surgical instruments.
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61G 13/08 - Adjustable operating tables; Controls therefor the table being divided into different adjustable sections
A61G 13/10 - Operating tables; Accessories therefor - Parts, details or accessories
5.
MULTI-PANEL GRAPHICAL USER INTERFACE FOR A ROBOTIC SURGICAL SYSTEM
A method for a robotic surgical system includes displaying a graphical user interface on a display to a user, wherein the graphical user interface includes a plurality of reconfigurable display panels, receiving a user input at one or more user input devices, wherein the user input indicates a selection of at least one software application relating to the robotic surgical system, and rendering content from the at least one selected software application among the plurality of reconfigurable display panels.
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each extracted video segment, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
A61B 34/35 - Surgical robots for telesurgery
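The annotate-then-train pipeline in the abstract above can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the force-level labels and helper names are hypothetical, and a real system would feed the pairs into an actual model-training step.

```python
# Hypothetical force levels for one tool-tissue interaction (assumption,
# not taken from the patent text).
FORCE_LEVELS = ["no_force", "light", "medium", "heavy"]

def build_training_pairs(video_segments):
    """Flatten annotated video segments into (frame, label_index) pairs.

    `video_segments` is a list of segments; each segment is a list of
    (frame, force_level) tuples produced by the annotation step.
    """
    pairs = []
    for segment in video_segments:
        for frame, level in segment:
            if level not in FORCE_LEVELS:
                raise ValueError(f"unknown force level: {level}")
            pairs.append((frame, FORCE_LEVELS.index(level)))
    return pairs

# Two annotated segments extracted from training videos (toy data).
segments = [
    [("frame_0", "light"), ("frame_1", "medium")],
    [("frame_2", "heavy")],
]
pairs = build_training_pairs(segments)
```

The resulting pairs would then be the input to whatever supervised learner the system uses for the target interaction.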
Embodiments described herein provide various examples of monitoring adverse events in the background while displaying a higher-resolution surgical video on a lower-resolution display device. In one aspect, a process for detecting adverse events during a surgical procedure can begin by receiving a surgical video. The process then displays a first portion of the video images of the surgical video on a screen to assist a surgeon performing the surgical procedure. While displaying the first portion of the video images, the process uses a set of deep-learning models to monitor a second portion of the video images not being displayed on the screen, wherein each deep-learning model is constructed to detect a given adverse event among a set of adverse events. In response to detecting an adverse event in the second portion of the video images, the process notifies the surgeon of the detected adverse event to prompt an appropriate action.
H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 1/313 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical incisions, e.g. laparoscopes
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
H04N 19/59 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
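The display-one-portion, monitor-the-rest idea in the abstract above can be sketched in a few lines. This is a toy illustration under loud assumptions: frames are 2D lists of intensities, the split is a simple column cut, and the "deep-learning models" are stand-in predicates keyed by event name.

```python
def monitor_offscreen(frame_rows, display_width, detectors):
    """Split each frame row into a displayed part and an off-screen part,
    then run each detector (name -> predicate) on the off-screen part.

    Returns the displayed portion and the names of detected adverse events.
    """
    displayed = [row[:display_width] for row in frame_rows]
    offscreen = [row[display_width:] for row in frame_rows]
    detected = [name for name, predicate in detectors.items()
                if predicate(offscreen)]
    return displayed, detected

# Hypothetical stand-in detector: flag "bleeding" if any off-screen
# intensity exceeds 200 (a real system would use a trained model here).
detectors = {"bleeding": lambda part: any(p > 200 for row in part for p in row)}
frame = [[10, 20, 30, 250], [15, 25, 35, 40]]
shown, events = monitor_offscreen(frame, 2, detectors)
```

A detected event name would then drive the surgeon notification described in the abstract.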
8.
INTERLOCK MECHANISMS TO DISENGAGE AND ENGAGE A TELEOPERATION MODE
A method for engaging and disengaging a surgical instrument of a surgical robotic system, comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate that each of the following interlock requirements is satisfied: (1) a user is looking toward a display, (2) at least one or more user interface devices of the surgical robotic system are configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining that each of the interlock requirements is satisfied, transitioning the surgical robotic system into a teleoperation mode; and in response to determining that fewer than all of the interlock requirements are satisfied, transitioning the surgical robotic system out of the teleoperation mode.
A computerized method for estimating joint friction in a joint of a robotic wrist of an end effector. Sensor measurements of force or torque in a transmission that mechanically couples a robotic wrist to an actuator are produced. Joint friction in a joint of the robotic wrist that is driven by the actuator is computed by applying the sensor measurements of force or torque to a closed form mathematical expression that relates transmission force or torque variables to a joint friction variable. A tracking error of the end effector is also computed, using a closed form mathematical expression that relates the joint friction variable to the tracking error. Other aspects are also described and claimed.
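The abstract above does not give the closed-form expressions, so the sketch below assumes a simple cable-over-pulley model in which friction torque equals the joint pulley radius times the drop in measured cable tension across the joint, and a proportional stiffness model for the tracking error. Both formulas are illustrative assumptions, not the claimed expressions.

```python
def joint_friction_torque(tension_in, tension_out, pulley_radius):
    """Estimate friction torque (N*m) from measured cable tensions (N) on
    either side of the wrist joint and the joint pulley radius (m).
    Assumed model: the tension drop across the joint is due to friction."""
    return pulley_radius * (tension_in - tension_out)

def tracking_error(friction_torque, joint_stiffness):
    """Hypothetical closed-form link from friction torque (N*m) to a
    steady-state tracking error (rad), assuming the controller behaves
    like a spring of the given stiffness (N*m/rad)."""
    return friction_torque / joint_stiffness

# Example: 12 N in, 10 N out, 5 mm pulley radius.
tau_f = joint_friction_torque(12.0, 10.0, 0.005)
err = tracking_error(tau_f, joint_stiffness=2.0)
```

The point of the sketch is the shape of the computation (measured transmission quantities in, friction and tracking error out), not the specific model.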
For control of a surgical instrument in a surgical robotic system, multiple actuators establish a static pretension by actuating in opposition to each other in torque control. The static pretension reduces or removes the compliance and elasticity, reducing the backlash width. To drive the tool, the actuators are then moved in cooperation with each other in position mode control so that the movement maintains the static pretension while providing precise control.
A foot pedal assembly for controlling a robotic surgical system. The foot pedal assembly including a foot pedal base, a foot pedal and a sensor. The foot pedal moves relative to the foot pedal base and has a contact surface extending from a distal end to a proximal end of the foot pedal. The contact surface is to come into contact with a foot of a user during use of the foot pedal assembly for controlling the robotic surgical system and the distal end is farther away from a heel of the foot than the proximal end during use of the assembly for controlling the robotic surgical system. The sensor is coupled to the contact surface of the foot pedal at a position closer to the proximal end than the distal end, and the sensor is operable to sense a target object positioned a distance over the contact surface.
Engaging and/or homing is provided for a motor control of a surgical tool in a surgical robotic system. Where two or more motors are to control the same motion, the motors may be used to detect engagement even where no physical stop is provided. The motors operate in opposition to each other or in a way that does not attempt the same motion, resulting in one of the motors acting as a stop for the other motor in engagement. A change in motor operation then indicates the engagement. The known angles of engaged motors and the transmission linking the motor drives to the surgical tool indicate the home or current position of the surgical tool.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A virtual representation of an operating room is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device, operated by a local user in the operating room. The virtual representation of the OR is communicated to the virtual reality headset, with three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
An autostereoscopic three-dimensional display system for surgical robotics has an autostereoscopic three-dimensional display configured to receive and display video from a surgical robotics camera, and a first sensor assembly and a second sensor assembly. A processor is configured to detect and track an eye position or a head position of a user relative to the display based on processing output data of the first sensor assembly, and to detect and track a gaze of the user based on processing output data of the second sensor assembly. The processor is further configured to modify or control an operation of the display system based on the detected gaze of the user. A spatial relationship of the display also can be automatically adjusted in relation to the user based on the detected eye or head position of the user to optimize the user's visualization of three-dimensional images on the display.
G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to each of an observer's left and right eyes of the autostereoscopic type
A61B 34/35 - Surgical robots for telesurgery
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
15.
ADJUSTABLE USER CONSOLE FOR A SURGICAL ROBOTIC SYSTEM
A method performed by a surgical robotic system that includes a seat that is arranged for a user to sit and a display column that includes at least one display for displaying a three-dimensional (3D) surgical presentation. The method includes receiving an indication that the user has manually adjusted the seat and in response, determining, while the user is sitting on the seat, a position of the user's eyes, determining a configuration for the display column based on the determined position of the user's eyes, and adjusting the display column by actuating one or more actuators of the display column according to the determined configuration.
A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with the recognized objects, and the robot system data.
G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
G06T 7/50 - Depth or shape recovery
G06T 7/70 - Determining position or orientation of objects or cameras
G16H 40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
G16H 40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
G06F 16/901 - Indexing; Data structures therefor; Storage structures
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
17.
ENERGY TOOL ACTIVATION DETECTION IN SURGICAL VIDEOS USING DEEP LEARNING
Embodiments described herein include a process for detecting energy tool activations. The process can begin by receiving a surgical video of a surgical procedure involving energy tool activations. The process then applies a sequence of sampling windows to the surgical video to generate a sequence of windowed samples of the surgical video. Next, for each windowed sample in the sequence of windowed samples, the process applies a deep-learning model to a sequence of video frames within the windowed sample to generate an activation/non-activation inference and a confidence level associated with the activation/non-activation inference for the windowed sample. As a result, a sequence of activation/non-activation inferences and a sequence of associated confidence levels are generated. The process subsequently identifies a sequence of activation events in the surgical video based on the sequence of activation/non-activation inferences and the sequence of associated confidence levels.
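The windowed-inference step described above can be sketched as follows. This is an illustrative sketch: the model is a stand-in callable returning an (is_active, confidence) pair, and the window size, stride, and confidence threshold are hypothetical parameters, not values from the patent.

```python
def detect_activations(frames, window, stride, model, min_conf=0.8):
    """Slide a window over the frame sequence, query the model per window,
    and merge confident, consecutive activation inferences into events.

    `model(frames_in_window)` returns (is_active, confidence).
    Returns a list of (start_index, end_index) activation events.
    """
    events, current = [], None
    for start in range(0, len(frames) - window + 1, stride):
        active, conf = model(frames[start:start + window])
        if active and conf >= min_conf:
            if current is None:
                current = [start, start + window]   # open a new event
            else:
                current[1] = start + window         # extend the event
        elif current is not None:
            events.append(tuple(current))           # close the event
            current = None
    if current is not None:
        events.append(tuple(current))
    return events

# Stand-in "model": a window is active if its mean intensity exceeds 0.5,
# reported with a fixed 0.9 confidence (a real system uses deep learning).
stub = lambda w: ((sum(w) / len(w)) > 0.5, 0.9)
frames = [0, 0, 1, 1, 1, 0, 0]
events = detect_activations(frames, window=2, stride=1, model=stub)
```

The merged (start, end) pairs correspond to the "sequence of activation events" the abstract derives from the per-window inferences and confidences.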
Embodiments described in this disclosure include a process for collecting energy tool usage data from surgical videos and using such data for post-surgery analysis. The process can begin by receiving a plurality of surgical videos of a surgical procedure involving an energy tool. For each surgical video in the plurality of surgical videos, the process detects a set of activation events in the surgical video, wherein each detected activation event includes an identified starting timestamp and a duration. The process further extracts a set of energy tool usage data based on the set of detected activation events, and then stores the extracted set of energy tool usage data in a database indexed based on a set of energy tool usage metrics. Next, in response to a user search request, the process returns the stored energy tool usage data that matches the search request from the database.
G06V 20/40 - Image or video recognition or understanding; Scene-specific elements in video content
G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
G16H 70/20 - ICT specially adapted for the handling or processing of medical references relating to practices or guidelines
G16H 50/70 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
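The extract-index-search flow in the abstract above can be sketched with an in-memory dictionary standing in for the database. The metric names and helper functions are hypothetical illustrations, not the patent's schema.

```python
def extract_usage_metrics(events):
    """Derive usage metrics from detected activation events.

    `events` is a list of (start_timestamp_s, duration_s) pairs.
    """
    durations = [d for _, d in events]
    return {
        "activation_count": len(durations),
        "total_on_time_s": sum(durations),
        "max_duration_s": max(durations) if durations else 0,
    }

db = {}  # in-memory stand-in for the indexed database

def store(video_id, events):
    """Index the extracted metrics under the video identifier."""
    db[video_id] = extract_usage_metrics(events)

def search(metric, threshold):
    """Return video ids whose stored metric exceeds the threshold."""
    return [vid for vid, m in db.items() if m[metric] > threshold]

store("v1", [(10, 2), (30, 5)])   # two activations, 7 s total
store("v2", [(5, 1)])             # one activation, 1 s total
```

A real system would back this with a persistent, metric-indexed store, but the query shape is the same.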
19.
SURGICAL ROBOTIC USER INPUT APPARATUS HAVING OPTICAL FIBER-BASED INTRINSIC SENSORS
A surgical robotic user input apparatus has a fiber optic cable with a handheld user input device attached at one end, and a connector attached at another end. Multiple intrinsic sensors, such as fiber Bragg grating sensors, are in the fiber optic cable. The intrinsic sensors are used to detect a pose of the handheld user input device. Other embodiments are also described and claimed.
G01L 1/24 - Measuring force or stress, in general by measuring variations of optical properties of material when it is stressed, e.g. by photoelastic stress analysis
G01D 5/353 - Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using optical means, i.e. using infrared, visible or ultraviolet light with attenuation or whole or partial obturation of beams of light the beams of light being detected by photocells by influencing the transmission properties of an optical fibre
20.
CONTROL MODES AND PROCESSES FOR POSITIONING OF A ROBOTIC MANIPULATOR
A method for controlling a robotic arm in a robotic surgical system includes defining a reference plane at a predetermined reference location for a robotic arm, where the robotic arm includes a plurality of joints, and driving at least one of the plurality of joints to guide the robotic arm through a series of predetermined poses substantially constrained within the reference plane.
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
A method performed by a surgical system. The method determines one or more characteristics of an end effector of an ultrasonic instrument and determines that the end effector is at least partially submerged within a liquid based on the determined one or more characteristics. In response, the method displays a notification on a display of the surgical system indicating that the end effector is at least partially submerged within the liquid.
A method performed by a surgical system. The method determines a temperature of an end effector of an ultrasonic instrument based on one or more characteristics of the end effector. The method determines that the end effector is in contact with an object, and, in response to determining that the end effector is in contact with the object and that the temperature is greater than a threshold temperature, presents a notification indicating that the end effector is too hot to be in contact with the object.
Embodiments described herein provide various examples of a system for extracting an actual procedure duration composed of actual surgical tool-tissue interactions from an overall procedure duration of a surgical procedure on a patient. In one aspect, the system is configured to obtain the actual procedure duration by: obtaining an overall procedure duration of the surgical procedure; receiving a set of operating room (OR) data from a set of OR data sources collected during the surgical procedure, wherein the set of OR data includes an endoscope video captured during the surgical procedure; analyzing the set of OR data to detect a set of non-surgical events during the surgical procedure that do not involve surgical tool-tissue interactions; extracting a set of durations corresponding to the set of non-surgical events; and determining the actual procedure duration by subtracting the set of extracted durations from the overall procedure duration.
G16H 40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
G16H 40/40 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
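The duration arithmetic in the entry above reduces to subtracting the detected non-surgical time from the overall time. The sketch below is illustrative; the event labels and minutes are invented examples, and a real system would derive the events from the OR data sources as described.

```python
def actual_procedure_duration(overall_minutes, non_surgical_events):
    """Subtract detected non-surgical event durations from the overall
    procedure duration to estimate actual tool-tissue interaction time.

    `non_surgical_events` is a list of (label, duration_minutes) pairs.
    """
    idle = sum(duration for _, duration in non_surgical_events)
    if idle > overall_minutes:
        raise ValueError("non-surgical time exceeds overall duration")
    return overall_minutes - idle

# Invented example events detected from OR data (e.g. the endoscope video).
events = [("instrument exchange", 6), ("camera cleaning", 4), ("pause", 5)]
actual = actual_procedure_duration(120, events)
```

Here a 120-minute overall procedure with 15 minutes of non-surgical events yields a 105-minute actual procedure duration.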
26.
SURGICAL INSTRUMENT RECOGNITION FROM SURGICAL VIDEOS
A machine learning model has two stages. In a first stage, features from one or more frames of a surgical video are extracted, wherein the features include presence of a surgical instrument and type of the surgical instrument. A second stage analyzes the surgical video based on the extracted features to recognize a video segment, wherein the recognized video segment includes a detected presence of the surgical instrument, and where the video segment is recognized by a multi-stage temporal convolution network (MS-TCN) or a vision transformer. Other aspects are also described and claimed.
G06V 20/40 - Image or video recognition or understanding; Scene-specific elements in video content
G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
27.
SYSTEMS AND METHODS FOR CONTROLLING GRIP FORCE OF JAWS WHEN TRANSITIONING BETWEEN POSITION CONTROL MODE AND FORCE MODE
Disclosed are systems and methods for achieving a smooth transition in grip force when the wrist jaws transition between position mode and force mode. In position mode, the desired jaw angle is above a threshold corresponding to the angle at which both jaws are just simultaneously in contact with an object held between the jaws or, if there is no object, at which the jaws begin to touch each other. A feedback loop may determine that the jaws are transitioning between the modes based on changes of the desired jaw angle. The feedback loop may analyze the commanded grip force and the measured grip force to determine whether to adjust the commanded grip force during the transition. If so, the feedback loop may adjust the commanded grip force to reduce changes in the measured grip force that would otherwise follow from the desired jaw angle.
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
G16H 40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, steering therapy or monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
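One simple way to realize the smoothing the entry above describes is a first-order feedback step that moves the commanded grip force a fraction of the way toward the measured grip force on each control cycle. The gain and signal values below are hypothetical illustrations, not the claimed control law.

```python
def smooth_transition(commanded, measured, gain=0.5):
    """One feedback step: nudge the commanded grip force toward the
    measured grip force to avoid a step change during a mode transition."""
    return commanded + gain * (measured - commanded)

def run_transition(commanded, measured, steps, gain=0.5):
    """Run several feedback steps and record the commanded-force history."""
    history = [commanded]
    for _ in range(steps):
        commanded = smooth_transition(commanded, measured, gain)
        history.append(commanded)
    return history

# Example: commanded force ramps smoothly from 0 N toward a measured 8 N.
trace = run_transition(0.0, 8.0, steps=3)
```

Each step halves the remaining gap, so the commanded force converges on the measured force without an abrupt jump.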
28.
ATTACHMENT MECHANISM FOR DOCKING CANNULAS TO SURGICAL ROBOTIC ARMS
A cannula sterile adapter for attachment of a cannula to a robotic surgical system, the adapter comprising: a rigid barrier portion having a cannula interface defining an opening dimensioned to receive a cannula lug, a first cannula interface structure extending from the cannula interface, and a second cannula interface structure, wherein the first cannula interface structure and the second cannula interface structure are dimensioned to interface with alignment structures of a cannula lug; and a flexible barrier portion molded to the rigid barrier portion, the flexible barrier portion defining a cavity around the opening of the rigid barrier portion that is dimensioned to receive a cannula lug inserted therein, the cavity having a first side defined by the first cannula interface structure and a second side along which the second cannula interface structure is positioned, and wherein the second cannula interface structure is entirely surrounded by the flexible barrier portion.
A robotic surgical system and method are disclosed for handling real-time and non-real-time traffic. In one embodiment, a surgical robotic system is provided comprising at least one robotic arm coupled to an operating table; and a control computer comprising a processor and a hardware interface, wherein the processor is configured to: receive a notification about real-time data from the operating table at the hardware interface; process the real-time data immediately upon receiving the notification; and poll the hardware interface for non-real time data from the operating table only when not processing the real-time data. Other embodiments are provided.
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
G16H 40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices for local operation
30.
BELT TERMINATION AND TENSIONING IN A PULLEY ARRANGEMENT FOR A ROBOTIC ARM
In one variation, a pulley arrangement includes a base pulley portion rotatable within a driving plane, an adjustable pulley portion coupled to the base pulley portion wherein the adjustable pulley portion is rotatable relative to the base pulley portion within the driving plane, and a driving member including an end coupled to the adjustable pulley portion wherein at least a portion of the driving member is wrapped at least partially around the adjustable pulley portion. In another variation, a pulley arrangement includes a base pulley portion rotatable around an axis, an adjustable pulley portion coupled to the base pulley portion and movable in a first direction parallel to the axis, and a sliding block engaged with the adjustable pulley portion, wherein the sliding block moves in a second direction different from the first direction, in response to compression of the adjustable pulley portion against the base pulley portion.
F16H 19/06 - Gearings comprising essentially only toothed gears or friction members, not capable of conveying indefinitely-continuing rotary motion, for interconverting rotary motion and reciprocating motion, comprising an endless flexible member
B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements
F16H 55/52 - Pulleys or friction discs of adjustable construction
31.
ACTUATION COMBINER MECHANISM AND MOTOR FOR A SURGICAL TOOL
A surgical tool for a surgical robotic system, the surgical tool comprising: a surgical tool grasper having a jaw operable to perform a surgical procedure; a handle coupled to the surgical tool grasper and having a lever operable to actuate the jaw; and an actuation combiner mechanism coupled to the lever and operable to combine an actuation force output of the lever with an actuation force output of a motor to control the operation of the jaw or the lever.
A surgical tool for a surgical robotic system, the surgical tool comprising: a surgical tool grasper having a jaw operable to perform a surgical procedure; a handle coupled to the surgical tool grasper and having a lever operable to actuate the jaw; an actuation combiner mechanism coupled to the lever and operable to combine an actuation force output of the lever with an actuation force output of a motor into an output link to control the jaw or the lever; and one or more processors configured to analyze a characteristic associated with the actuation force output of the lever or the motor to optimize the surgical procedure.
A surgical tool for a surgical robotic system, the surgical tool comprising: a surgical tool grasper having a jaw operable to perform a surgical procedure; a handle coupled to the surgical tool grasper and having a lever operable to actuate the jaw; and an actuation combiner mechanism coupled to the lever and operable to combine a first actuation force input of a lever input link from the lever with a second actuation force input of a mechanical input link from a mechanical actuator into an output link to control the operation of the jaw or the lever. Other aspects are also described and claimed.
Embodiments described herein provide systems and techniques for tracking workflow inside an operating room (OR) by identifying and tracking target objects, such as a patient bed or a surgical table in the OR. In one aspect, a process for identifying and tracking a target object in an OR begins by receiving a depth image among a sequence of depth images captured by a depth camera installed in the OR. The process then generates a three-dimensional (3D) point cloud based on the depth image. Next, the process identifies a set of potential target points in the 3D point cloud that potentially belongs to the target object based on one or more target object criteria. The process next extracts one or more object clusters from the set of potential target points using a data-point clustering technique. The process subsequently identifies the target object from the extracted one or more object clusters.
G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
G06T 7/70 - Determining position or orientation of objects or cameras
G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
G16H 40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices, for the management or administration of healthcare resources or facilities, e.g. for managing hospital staff or operating theatres
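The filter-then-cluster pipeline in the abstract above can be illustrated with a minimal Python sketch. The height criterion, distance threshold, and minimum cluster size below are illustrative assumptions, not values from the application; the single-linkage pass stands in for whatever data-point clustering technique is actually used:

```python
import numpy as np

def extract_target_clusters(points, min_height=0.6, max_height=1.2,
                            eps=0.15, min_pts=5):
    """Keep candidate points satisfying a height criterion, then group them
    with a simple single-linkage (DBSCAN-like) pass over point distances.
    Returns each surviving cluster as an array of 3D points."""
    mask = (points[:, 2] >= min_height) & (points[:, 2] <= max_height)
    candidates = points[mask]
    clusters, unvisited = [], set(range(len(candidates)))
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            # neighbours of point i that have not been claimed yet
            near = [j for j in unvisited
                    if np.linalg.norm(candidates[i] - candidates[j]) <= eps]
            for j in near:
                unvisited.discard(j)
            cluster.extend(near)
            frontier.extend(near)
        if len(cluster) >= min_pts:
            clusters.append(candidates[cluster])
    return clusters
```

The largest cluster (or the one best matching a table-sized bounding box) would then be taken as the target object.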
A surgical tool for a surgical robotic system, the surgical tool comprising: a surgical tool grasper operable to perform a surgical procedure; and a handle coupled to the surgical tool grasper and having a lever operable to actuate the surgical tool grasper to perform the surgical procedure, the lever configured to move about a first pivot point and coupled to a bi-stable latch assembly configured to move about a second pivot point, and wherein a position of the bi-stable latch assembly relative to a boundary line intersecting the first pivot point and the second pivot point causes the bi-stable latch assembly to latch the lever in a closed position or unlatch the lever.
Embodiments described herein provide systems and techniques for tracking and deidentifying persons in a captured operating room (OR) video. In one aspect, a process for deidentifying OR personnel in an OR video begins by simultaneously receiving a color image captured by an RGB camera and a depth image captured by a depth camera installed in the vicinity of the RGB camera. The process then generates a 3D point cloud based on the depth image. Next, the process applies a human-body detector to the 3D point cloud to detect a set of 3D bodies in the 3D point cloud, wherein each detected 3D body corresponds to a detected person in the OR. The process next projects each detected 3D body into a 2D body outline in the color image to represent the same detected person in the color image. The process subsequently de-identifies the detected people in the color image based on the projected 2D body outlines.
G06V 10/77 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
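The projection-and-redaction step described above can be sketched with a standard pinhole camera model. The intrinsics are invented for illustration, and blacking out a bounding box is a simplification of the outline-based de-identification in the abstract:

```python
import numpy as np

def deidentify(color_img, bodies_3d, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Project each detected 3D body (an Nx3 array of camera-frame points)
    into the color image with a pinhole model, then black out the resulting
    2D bounding box so the person is no longer identifiable."""
    out = color_img.copy()
    h, w = out.shape[:2]
    for pts in bodies_3d:
        z = pts[:, 2]
        u = fx * pts[:, 0] / z + cx   # pinhole projection, x -> image column
        v = fy * pts[:, 1] / z + cy   # pinhole projection, y -> image row
        u0, u1 = int(max(u.min(), 0)), int(min(u.max(), w - 1))
        v0, v1 = int(max(v.min(), 0)), int(min(v.max(), h - 1))
        out[v0:v1 + 1, u0:u1 + 1] = 0  # redact the projected region
    return out
```

A production system would redact the projected body outline (a polygon mask) rather than the full bounding box, but the projection math is the same.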
A surgical tool for a surgical robotic system, the surgical tool comprising: a surgical tool grasper having a jaw operable to perform a surgical procedure; a handle coupled to the surgical tool grasper and having a lever operable to actuate the jaw; and an actuation combiner mechanism coupled to the lever and operable to combine a first actuation force input of a lever input link from the lever with a second actuation force input of a mechanical input link from a mechanical actuator into an output link to control the operation of the jaw or the lever.
A braking assembly for a strain wave gearing of a surgical robotic manipulator, the braking assembly including a first braking member fixedly coupled to an input portion of a strain wave gearing of a surgical robotic manipulator; and a second braking member fixedly coupled to an output portion of the strain wave gearing, and wherein during a braking operation the first braking member contacts the second braking member to mechanically brake the input portion to the output portion.
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements
B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
Generally, a system for use in a robotic surgical system may be used to determine an attachment state between a tool driver, sterile adapter, and surgical tool of the system. The system may include sensors used to generate attachment data corresponding to the attachment state. The attachment state may be used to control operation of the tool driver and surgical tool. In some variations, one or more of the attachment states may be visually output to an operator using one or more of the tool driver, sterile adapter, and surgical tool. In some variations, the tool driver and surgical tool may include electronic communication devices configured to be in close proximity when the surgical tool is attached to the sterile adapter and tool driver.
A61B 46/10 - Surgical drapes specially adapted for instruments
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
B25J 15/04 - Gripping heads with provision for the remote detachment or exchange of the head or parts thereof
A61B 34/35 - Surgical robots for telesurgery
B29C 45/00 - Injection moulding, i.e. forcing the required volume of moulding material through a nozzle into a closed mould; Apparatus therefor
B29C 65/02 - Joining of preformed parts; Apparatus therefor, by heating, with or without pressing
A robotic surgical system includes at least one robotic arm comprising at least one movable joint and an actuator configured to drive the at least one movable joint, and a controller configured to generate a first signal, the first signal comprising a first oscillating waveform having a first frequency and being modulated by a second oscillating waveform having a second frequency, wherein the second frequency is higher than the first frequency. The actuator is configured to drive the at least one movable joint based on the first signal to at least partially compensate for friction in the at least one movable joint.
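The drive signal described above, a low-frequency waveform modulated by a higher-frequency one, can be sketched as follows. The sinusoidal shapes, frequencies, and modulation depth are illustrative assumptions; the abstract does not specify the waveforms:

```python
import numpy as np

def friction_comp_signal(t, f1=2.0, f2=80.0, amp=1.0, depth=0.5):
    """First (low-frequency, f1) oscillating waveform amplitude-modulated by
    a second, higher-frequency (f2 > f1) waveform. The high-frequency
    component acts as a dither that helps break static friction in the
    joint; t may be a scalar or a NumPy array of sample times."""
    base = amp * np.sin(2 * np.pi * f1 * t)            # first waveform
    modulator = 1.0 + depth * np.sin(2 * np.pi * f2 * t)  # second waveform
    return base * modulator
```

Feeding this signal to the joint actuator superimposes the dither on the commanded motion, which is a common way to partially compensate joint friction.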
The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. An actuator or a motor of a tool driver is configured to operate a joint of a tool. One or more processors are configured to receive an initial joint command for the joint of the tool, determine a joint torque based on motor torque of the motor or actuator as well as motor to joint torque mapping, calculate a tip force based on an effective length associated with the joint and based on the joint torque, compare the tip force to a predetermined threshold, calculate an admittance control compensation term in response to the tip force exceeding the predetermined threshold, and generate a command for the motor or actuator based on the admittance control compensation term and the initial joint command.
A61B 34/35 - Surgical robots for telesurgery
B25J 9/12 - Programme-controlled manipulators characterised by positioning means for manipulator elements, electric
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
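The torque-mapping and threshold logic in the abstract above can be illustrated with a minimal sketch. The linear motor-to-joint mapping, effective length, threshold, and gain are all invented for illustration; a real admittance controller would be dynamic rather than a single proportional back-off:

```python
def tip_force_command(joint_cmd, motor_torque, motor_to_joint=2.0,
                      effective_length=0.4, threshold=5.0,
                      admittance_gain=0.01):
    """Map motor torque to joint torque, estimate tip force from the joint's
    effective length (F = tau / L), and soften the joint command with an
    admittance compensation term when the force exceeds the threshold."""
    joint_torque = motor_to_joint * motor_torque   # motor-to-joint mapping
    tip_force = joint_torque / effective_length    # force at the tool tip
    if tip_force <= threshold:
        return joint_cmd                           # no compensation needed
    compensation = admittance_gain * (tip_force - threshold)
    return joint_cmd - compensation                # back off the command
```

With these illustrative numbers, a motor torque of 0.5 yields a tip force of 2.5 (below threshold, command passes through), while a torque of 2.0 yields a tip force of 10 and the command is reduced.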
42.
STERILE ADAPTER DRIVE DISKS FOR USE IN A ROBOTIC SURGICAL SYSTEM
Generally, a sterile adapter for use in robotic surgery may include a frame configured to be interposed between a tool driver and a surgical tool, a plate assembly coupled to the frame, and at least one rotatable coupler supported by the plate assembly and configured to communicate torque from an output drive of the tool driver to an input drive of the surgical tool.
A61B 46/10 - Surgical drapes specially adapted for instruments
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
B25J 15/04 - Gripping heads with provision for the remote detachment or exchange of the head or parts thereof
A61B 34/35 - Surgical robots for telesurgery
B29C 45/00 - Injection moulding, i.e. forcing the required volume of moulding material through a nozzle into a closed mould; Apparatus therefor
B29C 65/02 - Joining of preformed parts; Apparatus therefor, by heating, with or without pressing
Communication apparatus and devices for surgical robotic systems are described. The communication apparatus can include a user console in communication with a communication device having a surgical tool. The communication device can include a microphone to convert a sound input into an acoustic input signal. The communication device can transmit the acoustic input signal to the user console for reproduction as a sound output for a remote operator. The surgical tool can include an endoscope having several microphones mounted on a housing. The surgical tool can be a sterile barrier having a microphone and a drape. The microphone(s) of the surgical tools can face a surrounding environment such that a tableside staff is a source of the sound input that causes the sound output, and a surgeon and the tableside staff can communicate in a noisy environment. Other embodiments are also described and claimed.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
G10K 11/178 - Methods or devices for protecting against, or for damping, noise or other acoustic waves in general using interference effects; Masking sound by electro-acoustically regenerating the original acoustic waves in anti-phase
H04R 1/02 - Casings; Cabinets; Mountings therein
H04R 1/40 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
A61B 1/313 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, for introducing through surgical incisions, e.g. laparoscopes
A61B 46/10 - Surgical drapes specially adapted for instruments
A61B 1/04 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, combined with photographic or television appliances
44.
METHOD AND SYSTEM FOR ESTIMATING TEMPERATURE OF AN ULTRASONIC INSTRUMENT
A method performed by a surgical system. The method determines that an ultrasonic instrument is in a low-power state. The method determines a resonance frequency of an end effector of the ultrasonic instrument and determines a temperature of the end effector based on the resonance frequency. A notification is displayed on a display of the surgical system based on the temperature.
A61B 18/04 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
45.
METHOD AND SYSTEM FOR MODEL-BASED TEMPERATURE ESTIMATION OF AN ULTRASONIC INSTRUMENT
A method performed by a surgical system that includes an ultrasonic instrument with an end effector. The method determines a change in resonance frequency of the end effector while the ultrasonic instrument is either in 1) a high-power state in which the ultrasonic instrument draws a first current that causes the end effector to produce heat or 2) a low-power state in which the ultrasonic instrument draws a second current, less than the first, that does not cause the end effector to produce heat. The method determines a temperature of the end effector by applying the change in resonance frequency to a hysteresis model that includes a hysteretic relationship between changes in resonance frequency of the end effector and corresponding temperatures of the end effector, and outputs a notification based on the temperature.
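One simple way to realise such a hysteresis model is a pair of calibration branches, one for heating and one for cooling, selected by the instrument's power state and interpolated on the measured frequency shift. The calibration numbers below are invented purely for illustration:

```python
import numpy as np

# Illustrative calibration: resonance-frequency shift (Hz, increasingly
# negative as the end effector heats) vs temperature (deg C), with separate
# heating and cooling branches to capture the hysteretic relationship.
FREQ_SHIFT = np.array([0.0, -50.0, -100.0, -150.0, -200.0])
TEMP_HEATING = np.array([20.0, 80.0, 140.0, 200.0, 260.0])
TEMP_COOLING = np.array([20.0, 60.0, 110.0, 170.0, 240.0])

def estimate_temperature(freq_shift_hz, heating):
    """Interpolate the measured shift on the branch matching the power state
    (heating branch for the high-power state, cooling branch otherwise).
    np.interp needs an ascending x-axis, so both axes are negated."""
    branch = TEMP_HEATING if heating else TEMP_COOLING
    return float(np.interp(-freq_shift_hz, -FREQ_SHIFT, branch))
```

The same shift of -100 Hz then maps to different temperatures depending on whether the effector is heating or cooling, which is exactly what the hysteresis model is meant to capture.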
An annotation system facilitates collection of labels for images, video, or other content items relevant to training machine learning models associated with surgical applications or other medical applications. The annotation system enables an administrator to configure annotation jobs associated with training a machine learning model. The job configuration controls presentation of content items to various participating annotators via an annotation application and collection of the labels via a user interface of the annotation application. The annotation application enables the participating annotators to provide inputs in a simple and efficient manner, such as by providing gesture-based inputs or selecting graphical elements associated with different possible labels.
G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
G16H 30/00 - ICT specially adapted for the handling or processing of medical images
G16H 10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
An annotation system facilitates collection of labels for images, video, or other content items relevant to training machine learning models associated with surgical applications or other medical applications. The annotation system enables an administrator to configure annotation jobs associated with training a machine learning model. The job configuration controls presentation of content items to various participating annotators via an annotation application and collection of the labels via a user interface of the annotation application. The annotation application enables the participating annotators to provide inputs in a simple and efficient manner, such as by providing gesture-based inputs or selecting graphical elements associated with different possible labels.
G06V 10/778 - Active pattern-learning, e.g. online learning of image or video features
G06V 20/70 - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING Scene-specific elements Labelling scene content, e.g. deriving syntactic or semantic representations
G06V 10/94 - Hardware or software architectures specially adapted for image or video understanding
G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
G06V 10/774 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation; Bootstrap methods, e.g. bagging or boosting
48.
METHOD AND SYSTEM FOR ESTIMATING TEMPERATURE OF AN ULTRASONIC INSTRUMENT
A method performed by a surgical system. The method determines that an ultrasonic instrument is in a low-power state. The method determines a resonance frequency of an end effector of the ultrasonic instrument and determines a temperature of the end effector based on the resonance frequency. A notification is displayed on a display of the surgical system based on the temperature.
A method performed by a surgical system that includes an ultrasonic instrument with an end effector. The method determines a change in resonance frequency of the end effector while the ultrasonic instrument is either in 1) a high-power state in which the ultrasonic instrument draws a first current that causes the end effector to produce heat or 2) a low-power state in which the ultrasonic instrument draws a second current, less than the first, that does not cause the end effector to produce heat. The method determines a temperature of the end effector by applying the change in resonance frequency to a hysteresis model that includes a hysteretic relationship between changes in resonance frequency of the end effector and corresponding temperatures of the end effector, and outputs a notification based on the temperature.
In some embodiments, an apparatus can include a robotic arm cart for transporting, delivering, and securing robotic arms to a surgical table having a table top. The arm cart can include an arm container and a base. The arm container can be configured to receive and contain one or more robotic arms. The arm cart can include a first coupling member configured to engage with a second coupling member associated with a surgical table such that, when the first coupling member is engaged with the second coupling member, the one or more robotic arms can be releasably coupled with the surgical table. The arm cart can provide for movement of the one or more robotic arms in at least one of a lateral, longitudinal, or vertical direction relative to the table top prior to the securement of the one or more robotic arms to the surgical table.
B62B 3/02 - Hand carts having more than one axis carrying transport wheels; Steering devices therefor; Equipment therefor, involving parts being adjustable, collapsible, attachable, detachable or convertible
An analysis system trains a machine learning model to detect stapling events from a video of a surgical procedure. The machine learning model detects times when stapling events occur as well as one or more characteristics of each stapling event such as length of staples, clamping time, or other characteristics. The machine learning model is trained on videos of surgical procedures identifying when stapling events occurred through a learning process. The machine learning model may be applied to an input video to detect a sequence of stapler events. Stapler event sequences may furthermore be analyzed and/or aggregated to generate various analytical data relating to the surgical procedures for applications such as inventory management, performance evaluation, or predicting patient outcomes.
Disclosed are various systems and techniques for performing video-based surgeon technical-skill assessments and classifications. In one aspect, a process for classifying a surgeon's technical skill in performing a surgery is disclosed. During operation, the process receives a tool-motion track comprising a sequence of detected tool motions of a surgeon performing a surgery with a surgical tool. The process then generates a sequence of multi-channel feature matrices to mathematically represent the tool-motion track. Next, the process performs a one-dimensional (1D) convolution operation on the sequence of multi-channel feature matrices to generate a sequence of context-aware multi-channel feature representations of the tool-motion track. The sequence of context-aware multi-channel feature representations is subsequently processed by a transformer model to generate the skill classification, wherein the transformer model is trained to identify and focus on a subset of tool motions in the sequence of detected tool motions that are most relevant to the skill classification.
G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
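The 1D convolution step in the skill-classification pipeline above, which turns per-step tool-motion features into context-aware representations, can be sketched in NumPy. The kernel, feature dimensions, and 'same' padding scheme are illustrative assumptions; the real system would learn the kernels and feed the output to the transformer:

```python
import numpy as np

def conv1d_context(features, kernel):
    """features: (T, C) multi-channel feature matrix for a tool-motion track
    (T time steps, C channels); kernel: (K,) temporal window with K odd.
    Applies the same 1D convolution along the time axis of every channel,
    with zero ('same') padding, so each step aggregates its neighbours."""
    T, C = features.shape
    K = len(kernel)
    pad = K // 2
    padded = np.pad(features, ((pad, pad), (0, 0)))   # zero-pad in time only
    out = np.empty_like(features, dtype=float)
    for t in range(T):
        window = padded[t:t + K]    # (K, C) temporal neighbourhood of step t
        out[t] = kernel @ window    # weighted sum over time, per channel
    return out
```

Stacking several such convolutions (with learned kernels and nonlinearities) yields the sequence of context-aware multi-channel feature representations that the transformer model then attends over.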
53.
VIDEO-BASED ANALYSIS OF STAPLING EVENTS DURING A SURGICAL PROCEDURE USING MACHINE LEARNING
An analysis system trains a machine learning model to detect stapling events from a video of a surgical procedure. The machine learning model detects times when stapling events occur as well as one or more characteristics of each stapling event such as length of staples, clamping time, or other characteristics. The machine learning model is trained on videos of surgical procedures identifying when stapling events occurred through a learning process. The machine learning model may be applied to an input video to detect a sequence of stapler events. Stapler event sequences may furthermore be analyzed and/or aggregated to generate various analytical data relating to the surgical procedures for applications such as inventory management, performance evaluation, or predicting patient outcomes.
G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics, for computer-aided diagnosis, e.g. based on medical expert systems
G16H 30/00 - ICT specially adapted for the handling or processing of medical images
G16H 40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices
Surgical systems including a user console for controlling a surgical robotic tool are described. A witness sensor and a reference sensor can be mounted on the user console to measure an electromagnetic field distortion near a location, and to measure deformation of the location, respectively. Distortion in the electromagnetic field can be detected based on the measurements from the witness sensor and the reference sensor. An alert can be generated, or teleoperation of the surgical tool can be adjusted or paused, when a user interface device used to control the surgical tool is within a range of the distortion. The distortion can be from a known source, such as from actuation of a haptic motor of the user interface device, and the user console can adjust the actuation to reduce the likelihood that the distortion will disrupt surgical tool control. Other embodiments are described and claimed.
A61B 34/35 - Surgical robots for telesurgery
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
55.
SURGICAL ROBOTIC TOOL MULTI-MOTOR ACTUATOR AND CONTROLLER
A first input coupling and a second input coupling are coupled to rotatably drive an output coupling at the same time. In one embodiment, the output coupling rotates a robotic surgery endoscope about a longitudinal axis of the output coupling. A first motor drives the first input coupling while being assisted by a second motor that is driving the second input coupling. A first compensator produces a first motor input based on a position error and in accordance with a position control law, and a second compensator produces a second motor input based on the position error and in accordance with an impedance control law. In another embodiment, the second compensator receives a measured torque of the first motor. Other embodiments are also described and claimed.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 1/05 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, combined with photographic or television appliances, characterised in that the image sensor, e.g. camera, is placed in the distal end portion
A61B 1/313 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, for introducing through surgical incisions, e.g. laparoscopes
A61B 17/29 - Forceps for use in minimally invasive surgery
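The two compensators in the multi-motor abstract above can be sketched as simple static control laws: a PD position law for the first motor and an impedance-style law, with an assist term driven by the first motor's measured torque, for the second. All gains are illustrative assumptions; real compensators would be dynamic:

```python
def dual_motor_inputs(pos_err, vel, torque_m1, kp=8.0, kd=0.5,
                      stiffness=2.0, damping=0.3, assist=0.25):
    """First motor input: PD position control law on the shared output
    coupling's position error. Second motor input: impedance control law on
    the same error, plus an assist term proportional to the first motor's
    measured torque so the second motor shares the load."""
    u1 = kp * pos_err - kd * vel                       # position control law
    u2 = stiffness * pos_err - damping * vel + assist * torque_m1
    return u1, u2
```

Both inputs drive couplings that rotate the same output, so the second motor assists the first, e.g. when rolling an endoscope about its longitudinal axis.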
This patent disclosure provides various verification techniques to ensure that anonymized surgical procedure videos are indeed free of any personally-identifiable information (PII). In a particular aspect, a process for verifying that an anonymized surgical procedure video is free of PII is disclosed. This process can begin by receiving a surgical video corresponding to a surgery. The process next removes personally-identifiable information (PII) from the surgical video to generate an anonymized surgical video. Next, the process selects a set of verification video segments from the anonymized surgical procedure video. The process subsequently determines whether each segment in the set of verification video segments is free of PII. If so, the process replaces the surgical video with the anonymized surgical video for storage. If not, the process performs additional PII removal steps on the anonymized surgical video to generate an updated anonymized surgical procedure video.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
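The verify-then-reprocess flow in the abstract above can be sketched as a loop: sample a set of verification segments, check each for PII, and re-run removal until the sample comes back clean. The callables and the sampling rule are illustrative assumptions:

```python
def verify_anonymized(segments, is_free_of_pii, remove_pii, max_rounds=3):
    """Verify-then-reprocess loop. `segments` is the anonymized video as
    a list of segments; `is_free_of_pii` and `remove_pii` are injected
    callables standing in for the patent's PII checks and removal steps.
    The sampling rule (first five segments) is an illustrative assumption.
    """
    for _ in range(max_rounds):
        sample = segments[:5]                 # verification video segments
        if all(is_free_of_pii(seg) for seg in sample):
            return segments                   # safe to store in place of the original
        segments = [remove_pii(seg) for seg in segments]
    raise RuntimeError("PII still present after %d removal rounds" % max_rounds)
```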
57.
APPARATUS, SYSTEMS, AND METHODS FOR INTRAOPERATIVE INSTRUMENT TRACKING AND INFORMATION VISUALIZATION
Systems and methods for intraoperative tracking and visualization are disclosed. A current minimally invasive surgical (MIS) instrument pose may be determined based on a live intraoperative input video stream comprising a current image frame captured by a MIS camera. In addition, an instrument activation state and at least one parameter value associated with the instrument may also be determined. Intraoperative graphic visualization enhancements may be determined based on the activation state of the instrument, and/or a comparison of parameter values with corresponding parametric thresholds. The visualization enhancements may be applied to a current graphics frame. The current graphics frame may also include visualization enhancements related to proximate anatomical structures with proximity determined from the instrument pose and an anatomical model. The current graphics frame may be blended with the current input image frame to obtain an output blended image frame, which may form part of an output video stream.
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
A61B 34/10 - Planification, simulation ou modélisation assistées par ordinateur d’opérations chirurgicales
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
A61B 90/92 - Moyens d’identification pour les patients ou les instruments, p.ex. étiquettes utilisant des codes de couleurs
A61B 90/50 - Supports pour instruments chirurgicaux, p.ex. bras articulés
G06V 10/82 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant les réseaux neuronaux
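The enhancement decision and final blend described in entry 57 reduce to two steps per frame: draw graphics only when the activation state or a parameter-vs-threshold comparison calls for it, then blend the graphics frame over the camera image. A single-pixel sketch (the alpha-blend rule and all names are assumptions):

```python
def enhance_and_blend(image_px, graphics_px, alpha, instrument_active,
                      param_value, threshold):
    """Per-pixel sketch of the enhancement decision and final blend.
    Graphics are drawn when the instrument is active or a parameter
    exceeds its threshold; otherwise the camera pixel passes through.
    The alpha-blend and single-pixel scope are simplifications."""
    if not (instrument_active or param_value > threshold):
        return image_px
    return alpha * graphics_px + (1 - alpha) * image_px
```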
58.
APPARATUS, SYSTEMS, AND METHODS FOR INTRAOPERATIVE INSTRUMENT TRACKING AND INFORMATION VISUALIZATION
Systems and methods for intraoperative tracking and visualization are disclosed. A current minimally invasive surgical (MIS) instrument pose may be determined based on a live intraoperative input video stream comprising a current image frame captured by a MIS camera. In addition, an instrument activation state and at least one parameter value associated with the instrument may also be determined. Intraoperative graphic visualization enhancements may be determined based on the activation state of the instrument, and/or a comparison of parameter values with corresponding parametric thresholds. The visualization enhancements may be applied to a current graphics frame. The current graphics frame may also include visualization enhancements related to proximate anatomical structures with proximity determined from the instrument pose and an anatomical model. The current graphics frame may be blended with the current input image frame to obtain an output blended image frame, which may form part of an output video stream.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
A61B 17/00 - Instruments, dispositifs ou procédés chirurgicaux, p.ex. tourniquets
A61B 34/10 - Planification, simulation ou modélisation assistées par ordinateur d’opérations chirurgicales
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
59.
MOBILE VIRTUAL REALITY SYSTEM FOR SURGICAL ROBOTIC SYSTEMS
Mobile virtual reality system for simulation, training or demonstration of a surgical robotic system can include a virtual reality processor. The processor can generate a virtual surgical robot and render the virtual surgical robot on a display. The virtual surgical robot can include one or more virtual surgical tools. One or more handheld user input devices (UIDs) can sense a hand input from a hand. A foot input device can sense a foot input from a foot. The virtual reality processor can be configured to control a movement or action of the virtual surgical robot based on the hand input, and change which of the virtual surgical tools is controlled by the one or more handheld UIDs based on the foot input. Other embodiments and aspects are disclosed and claimed.
This disclosure provides techniques for synchronizing the playback of two recorded videos of the same surgical procedure. In one aspect, a process for generating a composite video from two recorded videos of a surgical procedure is disclosed. This process begins by receiving first and second surgical videos of the same surgical procedure. The process then performs phase segmentation on each of the first and second surgical videos to segment them into a first set of video segments and a second set of video segments, respectively, corresponding to a sequence of predefined phases. Next, the process time-aligns each video segment of a given predefined phase in the first video with a corresponding video segment of the given predefined phase in the second video. The process next displays the time-aligned first and second surgical videos for comparative viewing.
H04N 7/169 - Systèmes fonctionnant dans le domaine temporel du signal de télévision
G11B 27/32 - Indexation; Adressage; Minutage ou synchronisation; Mesure de l'avancement d'une bande en utilisant une information détectable sur le support d'enregistrement en utilisant des signaux d'information enregistrés par le même procédé que pour l'enregistrement principal sur des pistes auxiliaires séparées du même support d'enregistrement ou d'un support auxiliaire
G16H 70/20 - TIC spécialement adaptées au maniement ou au traitement de références médicales concernant des pratiques ou des directives
G06V 20/40 - RECONNAISSANCE OU COMPRÉHENSION D’IMAGES OU DE VIDÉOS Éléments spécifiques à la scène dans le contenu vidéo
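The time-alignment step above can be approximated per phase by resampling both segments of a given phase to a common length so they play back in lockstep. A nearest-index stretch is used here as an illustrative stand-in for whatever alignment the patent employs:

```python
def time_align(seg_a, seg_b):
    """Resample two segments (lists of frames) of the same surgical
    phase to a common length by nearest-index stretching. An
    illustrative stand-in for the patent's time-alignment step;
    assumes both segments are non-empty."""
    n = max(len(seg_a), len(seg_b))

    def stretch(seg):
        return [seg[i * len(seg) // n] for i in range(n)]

    return stretch(seg_a), stretch(seg_b)
```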
61.
Machine-learning-based visual-haptic system for robotic surgical platforms
Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each extracted video segment, the process annotates each video image in the segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
A61B 1/00 - Instruments pour procéder à l'examen médical de l'intérieur des cavités ou des conduits du corps par inspection visuelle ou photographique, p.ex. endoscopes; Dispositions pour l'éclairage dans ces instruments
A61B 34/00 - Chirurgie assistée par ordinateur; Manipulateurs ou robots spécialement adaptés à l’utilisation en chirurgie
A61B 34/35 - Robots chirurgicaux pour la téléchirurgie
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
A61B 1/04 - Instruments pour procéder à l'examen médical de l'intérieur des cavités ou des conduits du corps par inspection visuelle ou photographique, p.ex. endoscopes; Dispositions pour l'éclairage dans ces instruments combinés avec des dispositifs photographiques ou de télévision
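Annotating each video image with one of a set of predefined force levels, as in entry 61, presupposes a quantization from measured force to level. A sketch with assumed level names and newton thresholds (neither is from the patent):

```python
import bisect

# Illustrative assumptions, not from the patent: level names and thresholds.
FORCE_LEVELS = ["none", "light", "medium", "high"]
THRESHOLDS_N = [0.5, 2.0, 5.0]   # newtons

def force_level(force_n):
    """Quantize a measured force into one of the predefined force
    levels used to annotate each video image."""
    return FORCE_LEVELS[bisect.bisect_right(THRESHOLDS_N, force_n)]
```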
62.
METHOD AND SYSTEM FOR AUTOMATICALLY TRACKING AND MANAGING INVENTORY OF SURGICAL TOOLS IN OPERATING ROOMS
Embodiments described herein provide various examples of automatically processing surgical videos to detect surgical tools and tool-related events, and extract surgical-tool usage information. In one aspect, a process for automatically tracking usages of robotic surgery tools is disclosed. This process can begin by receiving a surgical video captured during a robotic surgery. The process then processes the surgical video to detect a surgical tool in the surgical video. Next, the process determines whether the detected surgical tool has been engaged in the robotic surgery. If so, the process further determines whether the detected surgical tool is engaged for a first time in the robotic surgery. If the detected surgical tool is engaged for the first time, the process subsequently increments a total-engagement count of the detected surgical tool. Otherwise, the process continues monitoring the detected surgical tool in the surgical video.
G16H 40/20 - TIC spécialement adaptées à la gestion ou à l’administration de ressources ou d’établissements de santé; TIC spécialement adaptées à la gestion ou au fonctionnement d’équipement ou de dispositifs médicaux pour la gestion ou l’administration de ressources ou d’établissements de soins de santé, p.ex. pour la gestion du personnel hospitalier ou de salles d’opération
G06Q 10/08 - Logistique, p.ex. entreposage, chargement ou distribution; Gestion d’inventaires ou de stocks
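The counting rule in entry 62 (increment a tool's total-engagement count only on its first detected engagement, otherwise continue monitoring) can be sketched as follows; class and field names are illustrative:

```python
class ToolUsageTracker:
    """Per-procedure total-engagement counting: a tool's count is
    incremented only the first time it is seen engaged; subsequent
    detections just continue monitoring."""

    def __init__(self):
        self._engaged_once = set()  # tools already counted this procedure
        self.counts = {}            # tool name -> total-engagement count

    def observe(self, tool, is_engaged):
        if is_engaged and tool not in self._engaged_once:
            self._engaged_once.add(tool)
            self.counts[tool] = self.counts.get(tool, 0) + 1
```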
63.
REDUNDANT ROBOT POWER AND COMMUNICATION ARCHITECTURE
An electronic circuit for a surgical robotic system includes a central power node, a first voltage bus that electrically couples a first power source to the node, a second voltage bus that electrically couples a second power source to the node, and several robotic arms, each of which is electrically coupled to the node via an output circuit breaker and is arranged to draw power from the node. Each bus is arranged to provide power from a respective power source to the node, and each bus has an input circuit breaker that is arranged to limit a first output current flow from the node and into the bus. Each output circuit breaker is arranged to limit a second output current flow from the node and into a respective arm, and to open in response to a fault occurring within the respective arm while the other breakers remain closed.
H02H 7/00 - Circuits de protection de sécurité spécialement adaptés pour des machines ou appareils électriques de types particuliers ou pour la protection sectionnelle de systèmes de câble ou ligne, et effectuant une commutation automatique dans le cas d'un chan
H02H 7/085 - Circuits de protection de sécurité spécialement adaptés pour des machines ou appareils électriques de types particuliers ou pour la protection sectionnelle de systèmes de câble ou ligne, et effectuant une commutation automatique dans le cas d'un chan pour moteurs dynamo-électriques contre une charge excessive
64.
GRAPHICAL USER GUIDANCE FOR A ROBOTIC SURGICAL SYSTEM
Graphical user guidance for a robotic surgical system is provided. In one embodiment, a graphical user interface for a robotic surgical system comprises a first region and a second region. The first region is used to display an endoscopic view of a surgical site inside a patient taken by an endoscopic camera of the robotic surgical system, and the second region is used to display user feedback information. The graphical user interface overlays a guidance message on top of the endoscopic view of the surgical site in the first region to provide user instructions for interacting with a user input device to engage a robotic arm of the robotic surgical system. Other embodiments are provided.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
G16H 40/67 - TIC spécialement adaptées à la gestion ou à l’administration de ressources ou d’établissements de santé; TIC spécialement adaptées à la gestion ou au fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement à distance
A61B 34/35 - Robots chirurgicaux pour la téléchirurgie
65.
Systems and methods for sensing of and docking with a trocar
A surgical robotic system has a tool drive coupled to a distal end of a robotic arm that has a plurality of actuators. The tool drive has a docking interface to receive a trocar. The system also includes one or more sensors that are operable to visually sense a surface feature of the trocar. One or more processors determine a position and orientation of the trocar, based on the visually sensed surface feature. In response, the processor controls the actuators to orient the docking interface to the determined orientation of the trocar and to guide the robotic arm toward the determined position of the trocar. Other aspects are also described and claimed.
A61B 34/35 - Robots chirurgicaux pour la téléchirurgie
A61B 90/50 - Supports pour instruments chirurgicaux, p.ex. bras articulés
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
A61B 34/10 - Planification, simulation ou modélisation assistées par ordinateur d’opérations chirurgicales
A61B 17/00 - Instruments, dispositifs ou procédés chirurgicaux, p.ex. tourniquets
66.
REAL-TIME SURGICAL TOOL PRESENCE/ABSENCE DETECTION IN SURGICAL VIDEOS
Embodiments described herein provide various techniques and systems for building machine-learning surgical tool presence/absence detection models for processing surgical videos and predicting whether a surgical tool is present or absent in each video frame of a surgical video. In one aspect, a process for ensuring patient safety during a laparoscopic or robotic surgery involving an energy tool is disclosed. The process can begin by receiving a real-time control signal indicating an operating state of an energy tool during the surgery. Next, the process receives real-time endoscope video images of the surgery. The process simultaneously applies a machine-learning surgical tool presence/absence detection model to the real-time endoscope video images to generate real-time decisions on a location of the energy tool in the real-time endoscope video images. The process then checks the real-time control signal against the real-time decisions to identify an unsafe event and takes an appropriate action when an unsafe event is identified.
A61B 18/12 - Instruments, dispositifs ou procédés chirurgicaux pour transférer des formes non mécaniques d'énergie vers le corps ou à partir de celui-ci par chauffage en faisant passer des courants à travers les tissus à chauffer, p.ex. des courants à haute fréquence
G16H 20/40 - TIC spécialement adaptées aux thérapies ou aux plans d’amélioration de la santé, p.ex. pour manier les prescriptions, orienter la thérapie ou surveiller l’observance par les patients concernant des thérapies mécaniques, la radiothérapie ou des thérapies invasives, p.ex. la chirurgie, la thérapie laser, la dialyse ou l’acuponcture
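The cross-check in entry 66 flags an unsafe event when the control signal and the vision model disagree. Reduced to its core, with the vision output modeled as a boolean (a simplification, since the patent produces per-frame location decisions):

```python
def unsafe_energy_event(tool_energized, tool_in_view):
    """True when the control signal reports the energy tool as active
    but the vision model finds no tool in the endoscope image. The
    boolean vision output is an illustrative simplification of the
    patent's per-frame location decisions."""
    return tool_energized and not tool_in_view
```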
Disclosed are various face-detection and human de-identification systems and techniques based on deep learning. In one aspect, a process for de-identifying people captured in an operating room (OR) video is disclosed. This process can begin by receiving a sequence of video frames from an OR video. Next, the process applies a first machine-learning face detector based on a first deep-learning model to each video frame in the sequence of video frames to generate a first set of detected faces. The process further applies a second machine-learning face detector to the sequence of video frames to generate a second set of detected faces, wherein the second machine-learning face detector is constructed based on a second deep-learning model different from the first deep-learning model. The process subsequently de-identifies the received sequence of video frames by blurring out both the first set of detected faces and the second set of detected faces.
G06T 5/20 - Amélioration ou restauration d'image en utilisant des opérateurs locaux
G06V 40/16 - Visages humains, p.ex. parties du visage, croquis ou expressions
G06V 10/82 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant les réseaux neuronaux
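The ensemble strategy above (two independent face detectors, with the union of their detections blurred out) can be sketched with the detectors and the blur operation as injected callables, all stand-ins for the two deep-learning models and the blurring step:

```python
def deidentify(frames, detector_a, detector_b, blur):
    """Run two independent face detectors over each frame and blur the
    union of their detections. The detectors and blur are injected
    callables standing in for the two deep-learning models."""
    out = []
    for frame in frames:
        boxes = set(detector_a(frame)) | set(detector_b(frame))
        out.append(blur(frame, sorted(boxes)))
    return out
```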
Embodiments described herein provide various techniques and systems for building machine-learning surgical tool presence/absence detection models for processing surgical videos and predicting whether a surgical tool is present or absent in each video frame of a surgical video. In one aspect, a process for ensuring patient safety during a laparoscopic or robotic surgery involving an energy tool is disclosed. The process can begin by receiving a real-time control signal indicating an operating state of an energy tool during the surgery. Next, the process receives real-time endoscope video images of the surgery. The process simultaneously applies a machine-learning surgical tool presence/absence detection model to the real-time endoscope video images to generate real-time decisions on a location of the energy tool in the real-time endoscope video images. The process then checks the real-time control signal against the real-time decisions to identify an unsafe event and takes an appropriate action when an unsafe event is identified.
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
A61B 18/12 - Instruments, dispositifs ou procédés chirurgicaux pour transférer des formes non mécaniques d'énergie vers le corps ou à partir de celui-ci par chauffage en faisant passer des courants à travers les tissus à chauffer, p.ex. des courants à haute fréquence
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
A61B 1/00 - Instruments pour procéder à l'examen médical de l'intérieur des cavités ou des conduits du corps par inspection visuelle ou photographique, p.ex. endoscopes; Dispositions pour l'éclairage dans ces instruments
A61B 18/00 - Instruments, dispositifs ou procédés chirurgicaux pour transférer des formes non mécaniques d'énergie vers le corps ou à partir de celui-ci
69.
ROBUST OPERATING ROOM VIDEO ANONYMIZATION BASED ON ENSEMBLE DEEP LEARNING
Disclosed are various face-detection and human de-identification systems and techniques based on deep learning. In one aspect, a process for de-identifying people captured in an operating room (OR) video is disclosed. This process can begin by receiving a sequence of video frames from an OR video. Next, the process applies a first machine-learning face detector based on a first deep-learning model to each video frame in the sequence of video frames to generate a first set of detected faces. The process further applies a second machine-learning face detector to the sequence of video frames to generate a second set of detected faces, wherein the second machine-learning face detector is constructed based on a second deep-learning model different from the first deep-learning model. The process subsequently de-identifies the received sequence of video frames by blurring out both the first set of detected faces and the second set of detected faces.
Disclosed are various systems and techniques for tracking surgical tools in a surgical video. In one aspect, the system begins by receiving one or more established tracks for one or more previously-detected surgical tools in the surgical video. The system then processes a current frame of the surgical video to detect one or more objects using a first deep-learning model. Next, for each detected object in the one or more detected objects, the system further performs the following steps to assign the detected object to the correct track: (1) computing a semantic similarity between the detected object and each of the one or more established tracks; (2) computing a spatial similarity between the detected object and the latest predicted location for each of the one or more established tracks; and (3) attempting to assign the detected object to one of the one or more established tracks based on the computed semantic and spatial similarities.
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
G16H 20/40 - TIC spécialement adaptées aux thérapies ou aux plans d’amélioration de la santé, p.ex. pour manier les prescriptions, orienter la thérapie ou surveiller l’observance par les patients concernant des thérapies mécaniques, la radiothérapie ou des thérapies invasives, p.ex. la chirurgie, la thérapie laser, la dialyse ou l’acuponcture
G16H 30/20 - TIC spécialement adaptées au maniement ou au traitement d’images médicales pour le maniement d’images médicales, p.ex. DICOM, HL7 ou PACS
G16H 30/40 - TIC spécialement adaptées au maniement ou au traitement d’images médicales pour le traitement d’images médicales, p.ex. l’édition
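Step (3) of the tracking process above combines the semantic and spatial similarities into one assignment decision. A minimal weighted-sum sketch; the weight, threshold, and dict-based track representation are assumptions, since the abstract does not specify how the two similarities are combined:

```python
def assign_to_track(detection, tracks, semantic_sim, spatial_sim,
                    weight=0.5, min_score=0.6):
    """Score each established track by a weighted combination of
    semantic and spatial similarity and assign the detection to the
    best-scoring track if it clears a threshold. Returning None means
    the detection should start a new track."""
    best, best_score = None, min_score
    for track in tracks:
        score = (weight * semantic_sim(detection, track)
                 + (1 - weight) * spatial_sim(detection, track["last_location"]))
        if score > best_score:
            best, best_score = track, score
    return best
```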
71.
TRACKING MULTIPLE SURGICAL TOOLS IN A SURGICAL VIDEO
Disclosed are various systems and techniques for tracking surgical tools in a surgical video. In one aspect, the system begins by receiving one or more established tracks for one or more previously-detected surgical tools in the surgical video. The system then processes a current frame of the surgical video to detect one or more objects using a first deep-learning model. Next, for each detected object in the one or more detected objects, the system further performs the following steps to assign the detected object to the correct track: (1) computing a semantic similarity between the detected object and each of the one or more established tracks; (2) computing a spatial similarity between the detected object and the latest predicted location for each of the one or more established tracks; and (3) attempting to assign the detected object to one of the one or more established tracks based on the computed semantic and spatial similarities.
G06T 7/246 - Analyse du mouvement utilisant des procédés basés sur les caractéristiques, p.ex. le suivi des coins ou des segments
G06V 10/26 - Segmentation de formes dans le champ d’image; Découpage ou fusion d’éléments d’image visant à établir la région de motif, p.ex. techniques de regroupement; Détection d’occlusion
G06V 10/82 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant les réseaux neuronaux
G06V 10/764 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant la classification, p.ex. des objets vidéo
G06V 10/74 - Appariement de motifs d’image ou de vidéo; Mesures de proximité dans les espaces de caractéristiques
G06V 10/77 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant l’intégration et la réduction de données, p.ex. analyse en composantes principales [PCA] ou analyse en composantes indépendantes [ ICA] ou cartes auto-organisatrices [SOM]; Séparation aveugle de source
G06V 10/774 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant l’intégration et la réduction de données, p.ex. analyse en composantes principales [PCA] ou analyse en composantes indépendantes [ ICA] ou cartes auto-organisatrices [SOM]; Séparation aveugle de source méthodes de Bootstrap, p.ex. "bagging” ou “boosting”
A61B 34/20 - Systèmes de navigation chirurgicale; Dispositifs pour le suivi ou le guidage d'instruments chirurgicaux, p.ex. pour la stéréotaxie sans cadre
72.
Surgical Robotic System and Method for Transitioning Control to a Secondary Robot Controller
A robotic surgical system and method are disclosed for transitioning control to a secondary robotic arm controller. In one embodiment, a robotic surgical system comprises a user console comprising a display device and a user input device; a robotic arm configured to be coupled to an operating table; a primary robotic arm controller configured to move the robotic arm in response to a signal received from the user input device at the user console; and a secondary robotic arm controller configured to move the robotic arm in response to a signal received from a user input device remote from the user console. Control over movement of the robotic arm is transitioned from the primary robotic arm controller to the secondary robotic arm controller in response to a failure in the primary robotic arm controller. Other embodiments are provided.
A foot pedal assembly for controlling a robotic surgical system. The foot pedal assembly includes a foot pedal base, a foot pedal, and a sensor. The foot pedal moves relative to the foot pedal base and has a contact surface extending from a distal end to a proximal end of the foot pedal. The contact surface comes into contact with a foot of a user during use of the foot pedal assembly for controlling the robotic surgical system, and the distal end is farther from a heel of the foot than the proximal end during such use. The sensor is coupled to the contact surface of the foot pedal at a position closer to the proximal end than the distal end, and the sensor is operable to sense a target object positioned a distance over the contact surface.
A surgical robotic system has a robotic grasper, a user interface device (UID), and one or more processors communicatively coupled to the UID and the robotic grasper. The system detects a directive to engage or re-engage a teleoperation mode, determines that the system is in a non-teleoperation mode, receives a sequence of user actions through the UID, determines that the UID matches a jaw angle or a grip force of the robotic grasper, and transitions into the teleoperation mode. Other embodiments are also described and claimed.
G16H 40/67 - TIC spécialement adaptées à la gestion ou à l’administration de ressources ou d’établissements de santé; TIC spécialement adaptées à la gestion ou au fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement à distance
75.
SURGEON DISENGAGEMENT DETECTION DURING TERMINATION OF TELEOPERATION
A method for disengagement detection of a surgical instrument of a surgical robotic system, the method comprising: determining whether a user's head is unstable prior to disengagement of a teleoperation mode; determining whether a pressure release has occurred relative to at least one of a first user input device or a second user input device for controlling a surgical instrument of the surgical robotic system during the teleoperation mode; and in response to determining the user's head is unstable or determining the pressure release has occurred, determining whether a distance change between the first user input device and the second user input device indicates the user is performing an unintended action prior to disengagement of the teleoperation mode.
A surgical stapler for a surgical robotic system, the surgical stapler including a jaw coupled to a base, the jaw having a first anvil that moves relative to a second anvil between an open position and a closed position; and a force sensor operable to detect a force applied to the jaw.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
77.
Apparatus, systems, and methods for intraoperative visualization
Errors in a blended stream that would result in non-display or obscuring of a live video stream from a medical device may be automatically detected, and a failover stream corresponding to the live video stream may be displayed to medical personnel. For example, one or more second input streams that are being blended may contain no data or invalid data, which may result in the blended stream not displaying (or obscuring) the live video stream if the blended stream were displayed. Switching from blending to a failover buffer may occur within the time to process a single video image frame. Upon detection, prior to display, that the blended stream would not display the live video stream, display of the live video stream from the failover buffer may be initiated. Other aspects are also described and claimed.
A61B 1/00 - Instruments pour procéder à l'examen médical de l'intérieur des cavités ou des conduits du corps par inspection visuelle ou photographique, p.ex. endoscopes; Dispositions pour l'éclairage dans ces instruments
H04N 19/59 - Procédés ou dispositions pour le codage, le décodage, la compression ou la décompression de signaux vidéo numériques utilisant le codage prédictif mettant en œuvre un sous-échantillonnage spatial ou une interpolation spatiale, p.ex. modification de la taille de l’image ou de la résolution
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
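The per-frame failover decision in entry 77 can be sketched as: if the blended frame is absent or would not actually show the live video, emit the frame from the failover buffer instead. The `shows_live` validity flag is an illustrative stand-in for the patent's pre-display error detection:

```python
def select_output(blended_frame, failover_frame):
    """Per-frame failover: emit the blended frame only when it is
    present and would actually show the live video; otherwise fall back
    to the failover buffer. The `shows_live` flag is an illustrative
    stand-in for the patent's pre-display error detection."""
    if blended_frame is None or not blended_frame.get("shows_live", False):
        return failover_frame
    return blended_frame
```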
An energy tool for a surgical robotic system, the energy tool comprising: a jaw coupled to a base, the jaw having a first anvil that moves relative to a second anvil between an open position and a closed position; and at least one of a force sensor, a temperature sensor and an acoustic sensor coupled to the jaw.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
A61B 5/00 - Mesure servant à établir un diagnostic ; Identification des individus
A surgical stapler for a surgical robotic system, the surgical stapler including a jaw coupled to a base, the jaw having a first anvil that moves relative to a second anvil between an open position and a closed position; and a force sensor operable to detect a force applied to the jaw.
The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. One example system for detecting a hardstop for a surgical tool includes a wrist connected to and driven by a plurality of cables of a tool driver, a plurality of sensors configured to detect forces associated with the plurality of cables, and one or more processors. The one or more processors are configured to perform a comparison of the forces associated with the plurality of cables, select a highest-tension cable from the plurality of cables based on the comparison, set a force assigned to the highest-tension cable to a predetermined value, calculate a variable torque threshold for the wrist based on a sum of the predetermined value for the highest-tension cable and the detected forces for the remaining cables in the plurality of cables, receive a joint torque value for the wrist, perform a comparison of the received joint torque value to the variable wrist torque threshold, and identify a hardstop based on that comparison.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
B25J 3/00 - Manipulateurs de type à commande asservie, c. à d. manipulateurs dans lesquels l'unité de commande et l'unité commandée exécutent des mouvements correspondants dans l'espace
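The hardstop rule in the entry above can be sketched in a few lines: the highest-tension cable's reading is replaced by a predetermined cap, the remaining readings are summed into a variable threshold, and the measured joint torque is compared against it. A minimal Python sketch, with function names and the cap value as assumptions, not the disclosed implementation:

```python
def variable_torque_threshold(cable_forces, cap=40.0):
    """Wrist torque threshold from per-cable tension readings."""
    highest = max(cable_forces)
    rest = list(cable_forces)
    rest.remove(highest)        # drop one instance of the highest tension
    return cap + sum(rest)      # fixed cap stands in for the max-tension cable

def is_hardstop(joint_torque, cable_forces, cap=40.0):
    """Flag a hardstop when the measured joint torque exceeds the threshold."""
    return joint_torque > variable_torque_threshold(cable_forces, cap)
```

Capping the highest-tension cable keeps a single saturated reading from inflating the threshold and masking a real hardstop.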
Errors in a blended stream that would result in non-display or obscuring of a live video stream from a medical device may be automatically detected, and a failover stream corresponding to the first live video stream may be displayed to medical personnel. For example, one or more second input streams that are being blended may contain no data or invalid data, which may result in the blended stream not displaying (or obscuring) the live video stream (if the blended stream were displayed). Switching from blending to a failover buffer may occur within the time to process a single video image frame. Upon detection (prior to display) that the blended stream would not display the live video stream, display of the live video stream from the failover buffer may be initiated. Other aspects are also described and claimed.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
A61B 34/00 - Chirurgie assistée par ordinateur; Manipulateurs ou robots spécialement adaptés à l’utilisation en chirurgie
G16H 30/20 - TIC spécialement adaptées au maniement ou au traitement d’images médicales pour le maniement d’images médicales, p.ex. DICOM, HL7 ou PACS
G16H 30/40 - TIC spécialement adaptées au maniement ou au traitement d’images médicales pour le traitement d’images médicales, p.ex. l’édition
G16H 20/40 - TIC spécialement adaptées aux thérapies ou aux plans d’amélioration de la santé, p.ex. pour manier les prescriptions, orienter la thérapie ou surveiller l’observance par les patients concernant des thérapies mécaniques, la radiothérapie ou des thérapies invasives, p.ex. la chirurgie, la thérapie laser, la dialyse ou l’acuponcture
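The failover behavior above can be sketched as a per-frame selection step: the live frame is continuously retained in a failover buffer, the blended frame is validated before display, and an invalid blend causes an immediate switch to the buffered live stream. A minimal sketch; the validity test and all names are assumptions:

```python
def frame_is_valid(blended_frame):
    """Assumed validity test: a frame with no pixel data cannot be shown."""
    return blended_frame is not None and len(blended_frame) > 0

def select_output(blended_frame, live_frame, failover_buffer):
    """Choose the frame to display; fall back to the live stream if needed."""
    failover_buffer.append(live_frame)   # keep the live feed retained
    if frame_is_valid(blended_frame):
        return blended_frame
    return failover_buffer[-1]           # switch within one frame time
```

Because the check happens before display, an invalid blend never reaches the screen; the fallback is the most recent live frame.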
82.
METHOD AND SYSTEM FOR CONTROLLING AND DISPLAYING VIDEO STREAMS
A method performed by a video controller. The method receives a first video stream captured by an endoscope of a surgical system, and receives a second video stream that comprises surgical data. The method displays the second video stream superimposed above an area of the first video stream, and determines that the second video stream is to cease being superimposed. Responsive to determining that the second video stream is to cease being superimposed, the method continues to display the first video stream.
A61B 1/00 - Instruments pour procéder à l'examen médical de l'intérieur des cavités ou des conduits du corps par inspection visuelle ou photographique, p.ex. endoscopes; Dispositions pour l'éclairage dans ces instruments
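The superimposition method of entry 82 amounts to a compositing decision per frame: while the overlay is active, the surgical-data stream is pasted into an area of the endoscope frame; once the overlay ceases, the endoscope frame passes through unchanged. A toy sketch with frames modeled as nested lists of pixels; all names are illustrative:

```python
def compose(endo_frame, overlay_frame, origin, active):
    """Return the displayed frame, with overlay_frame superimposed at origin."""
    if not active:
        return endo_frame                  # overlay ceased: live feed only
    out = [row[:] for row in endo_frame]   # copy to avoid mutating the input
    r0, c0 = origin
    for r, row in enumerate(overlay_frame):
        for c, px in enumerate(row):
            out[r0 + r][c0 + c] = px
    return out
```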
The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. An end effector of the surgical tool is coupled to a tool driver. An actuator is driven by a motor of the tool driver and configured to drive a degree of freedom of the end effector. One or more processors are configured to receive a position command describing a desired position for the end effector, translate the desired position to a command for a joint associated with the end effector, calculate a compensation term to compensate for a source of hysteresis, such as backlash and/or compliance, and send a motor command for the motor coupled with the actuator based on the compensation term and the command for the end effector.
A61B 34/35 - Robots chirurgicaux pour la téléchirurgie
B25J 9/12 - Manipulateurs à commande programmée caractérisés par des moyens pour régler la position des éléments manipulateurs électriques
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
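The compensation step described above can be sketched with the simplest backlash model: a direction-dependent offset that takes up half the dead band before the joint command is mapped to a motor command. The gear ratio, backlash width, and the linear model are assumed values, not the disclosed compensation:

```python
def motor_command(joint_cmd, prev_cmd, backlash=0.02, gear_ratio=100.0):
    """Motor position command with a direction-dependent backlash offset."""
    direction = 0.0
    if joint_cmd > prev_cmd:
        direction = 1.0
    elif joint_cmd < prev_cmd:
        direction = -1.0
    compensation = direction * backlash / 2.0  # take up half the dead band
    return gear_ratio * (joint_cmd + compensation)
```

The offset flips sign with the direction of motion, which is what removes hysteresis: the same joint command produces the same end-effector position whether approached from above or below.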
An energy tool for a surgical robotic system, the energy tool comprising: a jaw coupled to a base, the jaw having a first anvil that moves relative to a second anvil between an open position and a closed position; and at least one of a force sensor, a temperature sensor and an acoustic sensor coupled to the jaw.
Robotic arms and surgical robotic systems incorporating such arms are described. A robotic arm includes a roll joint connected to a prismatic link by a pitch joint and a tool drive connected to the prismatic link by another pitch joint. The prismatic link includes several prismatic sublinks that are connected by a prismatic joint. A surgical tool supported by the tool drive can insert into a patient along an insertion axis through a remote center of motion of the robotic arm. Movement of the robotic arm can be controlled to telescopically move the prismatic sublinks relative to each other by the prismatic joint while maintaining the remote center of motion fixed. Other embodiments are also described and claimed.
The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. One example system for detecting a hardstop for a surgical tool includes a wrist connected to and driven by a plurality of cables of a tool driver, a plurality of sensors configured to detect forces associated with the plurality of cables, and one or more processors configured to perform a comparison of the forces associated with the plurality of cables, select a highest tension cable from the plurality of cables based on the comparison of the forces associated with the plurality of cables, set a force assigned to the highest tension cable to a predetermined value, calculate a variable torque threshold for the wrist based on a sum of the predetermined value for the highest tension cable and detected forces for the remaining cables in the plurality of cables, receive a joint torque value for the wrist, perform a comparison of the received joint torque value for the wrist to the variable wrist torque threshold, and identify a hardstop based on the comparison of the received joint torque value for the wrist to the variable wrist torque threshold.
A61B 90/00 - Instruments, outillage ou accessoires spécialement adaptés à la chirurgie ou au diagnostic non couverts par l'un des groupes , p.ex. pour le traitement de la luxation ou pour la protection de bords de blessures
A61B 34/35 - Robots chirurgicaux pour la téléchirurgie
A61B 34/00 - Chirurgie assistée par ordinateur; Manipulateurs ou robots spécialement adaptés à l’utilisation en chirurgie
87.
PROXIMITY SENSORS FOR SURGICAL ROBOTIC ARM MANIPULATION
A surgical robotic system including a surgical table, a surgical robotic manipulator coupled to the surgical table and comprising a plurality of links coupled together by a plurality of joints that are operable to move with respect to one another to move the surgical robotic manipulator, at least one of the plurality of links or the plurality of joints having a portion that faces another of the plurality of links or the plurality of joints, a proximity sensing assembly coupled to the portion of the at least one of the plurality of links or the plurality of joints, the proximity sensing assembly operable to detect an object prior to the surgical robotic manipulator colliding with the object and to output a corresponding detection signal, and a processor operable to receive the corresponding detection signal and cause the manipulator or the object to engage in a collision avoidance operation.
For control about a remote center of motion (RCM) of a surgical robotic system, possible configurations of a robotic manipulator are searched to find the configuration providing the greatest overlap of the workspace of the surgical instrument with the target anatomy. The force at the RCM may be measured, such as with one or more sensors on the cannula or in an adaptor connecting the robotic manipulator to the cannula. The measured force is used to determine a change in the RCM to minimize the force exerted on the patient at the RCM. Given this change, the configuration of the robotic manipulator may be dynamically updated. Various aspects of this RCM control may be used alone or in combination, such as to optimize the alignment of the workspace to the target anatomy, to minimize force at the RCM, and/or to dynamically control the robotic manipulator configuration based on workspace alignment and force measurement.
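Two of the RCM-control aspects above reduce to short computations: a search over candidate configurations scored by workspace overlap, and a small corrective shift of the RCM along the measured force. A minimal sketch, with the scoring function, step gain, and all names assumed for illustration:

```python
def best_configuration(configs, overlap_of):
    """Pick the manipulator configuration whose instrument workspace
    best overlaps the target anatomy, per a caller-supplied score."""
    return max(configs, key=overlap_of)

def update_rcm(rcm_xyz, force_xyz, gain=0.001):
    """Shift the RCM a small step along the measured force to relieve
    the load exerted on the patient at the remote center."""
    return tuple(p + gain * f for p, f in zip(rcm_xyz, force_xyz))
```

Iterating `update_rcm` with fresh force measurements drives the residual force at the RCM toward zero, after which the configuration search can be re-run.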
A method for engaging and disengaging a surgical instrument of a surgical robotic system including receiving a sequence of user inputs from one or more user interface devices of the surgical robotic system; determining, by one or more processors communicatively coupled to the user interface devices and the surgical instrument, whether the sequence of user inputs indicates an intentional engagement or disengagement of a teleoperation mode in which the surgical instrument is controlled by user inputs received from the user interface devices; in response to determining an intentional engagement, transitioning the surgical robotic system into the teleoperation mode; and in response to determining an intentional disengagement, transitioning the surgical robotic system out of the teleoperation mode such that the user interface devices are prevented from controlling the surgical instrument.
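The intent check above can be sketched as matching the trailing user inputs against a required pattern, so stray inputs can never engage the instruments. The specific pattern (a squeeze followed by a double-tap) is purely an assumption for illustration:

```python
ENGAGE_SEQUENCE = ("squeeze", "tap", "tap")   # illustrative pattern

def is_intentional(inputs):
    """True when the trailing inputs match the engagement pattern."""
    n = len(ENGAGE_SEQUENCE)
    return tuple(inputs[-n:]) == ENGAGE_SEQUENCE

def step(teleop_on, inputs):
    """Toggle teleoperation mode only on an intentional sequence."""
    if is_intentional(inputs):
        return not teleop_on
    return teleop_on
```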
G06F 3/033 - Dispositifs de pointage déplacés ou positionnés par l'utilisateur; Leurs accessoires
G06F 3/0346 - Dispositifs de pointage déplacés ou positionnés par l'utilisateur; Leurs accessoires avec détection de l’orientation ou du mouvement libre du dispositif dans un espace en trois dimensions [3D], p.ex. souris 3D, dispositifs de pointage à six degrés de liberté [6-DOF] utilisant des capteurs gyroscopiques, accéléromètres ou d’inclinaison
G06F 3/0354 - Dispositifs de pointage déplacés ou positionnés par l'utilisateur; Leurs accessoires avec détection des mouvements relatifs en deux dimensions [2D] entre le dispositif de pointage ou une partie agissante dudit dispositif, et un plan ou une surface, p.ex. souris 2D, boules traçantes, crayons ou palets
A system and computerized method for detection of engagement of a surgical tool to a tool drive of a robotic arm of a surgical robotic system. The method may include activating an actuator of the tool drive to rotate a drive disk to be mechanically engaged with a tool disk in the surgical tool. One or more motor operating parameters of the actuator that is causing the rotation of the drive disk are monitored while activating the actuator. The method detects when the drive disk becomes mechanically engaged with the tool disk, based on the one or more monitored motor operating parameters. Other embodiments are also described and claimed.
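The engagement detection above relies on motor operating parameters: while the drive disk spins freely its load is low, and a sustained rise marks the moment it mechanically engages the tool disk. A sketch using motor current as the monitored parameter; the threshold, hold count, and names are assumptions:

```python
def detect_engagement(current_samples, threshold=0.5, hold=3):
    """Return the sample index where engagement is detected, or None."""
    run = 0
    for i, amps in enumerate(current_samples):
        run = run + 1 if amps > threshold else 0
        if run >= hold:              # require a sustained rise, not a spike
            return i - hold + 1
    return None
```

Requiring several consecutive samples above threshold rejects transient spikes, e.g. from the disks' features briefly clipping each other before full engagement.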
The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. An example computer-implemented method for evaluating calibrations of a surgical tool includes fixating a joint of the surgical tool at a first angle, the joint being driven by an actuator, measuring an actuator position corresponding to the first angle, accessing a calibrated offset corresponding to the first angle, determining an expected joint angle based on the measured actuator position and the calibrated offset, and reporting a first difference between the expected joint angle and the first angle.
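The calibration check above can be sketched directly: with the joint fixated at a known angle, reconstruct the expected joint angle from the measured actuator position and the stored calibrated offset, then report the residual. The linear transmission model and names are assumptions:

```python
def expected_joint_angle(actuator_pos, offset, ratio=1.0):
    """Joint angle implied by the actuator position under a linear model."""
    return ratio * actuator_pos + offset

def calibration_error(fixated_angle, actuator_pos, offset, ratio=1.0):
    """Difference between the reconstructed and the true (fixated) angle."""
    return expected_joint_angle(actuator_pos, offset, ratio) - fixated_angle
```

A residual beyond tolerance indicates a stale or faulty calibrated offset for that joint.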
Disclosed are various user-presence/absence detection techniques based on deep learning. These user-presence/absence detection techniques can include building/training a deep-learning model including a user-presence/absence classifier based on training images of a user-seating area of a surgeon console under various clinically-relevant conditions. The trained user-presence/absence classifier can then be used during teleoperation/surgical procedures to monitor/track users in the user-seating area of the surgeon console, and continuously classify captured real-time video images of the user-seating area into either a user-presence classification or a user-absence classification. In some embodiments, the disclosed techniques can be used to detect a user-switching event at the surgeon console when a second user is detected to have entered the user-seating area after a first user is detected to have exited the user-seating area. If the second user is identified as a new user, the disclosed techniques can trigger a recalibration procedure to recalibrate surgeon-console settings for the new user.
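The user-switching logic above sits on top of the per-frame classifier: a switch is flagged only when an absence interval separates two presence intervals. Abstracting the deep-learning classifier as a stream of boolean labels, the event detector is a small state machine (a sketch; real systems would also debounce noisy labels):

```python
def detect_user_switch(labels):
    """True if the label stream shows presence -> absence -> presence."""
    phase = 0                       # 0: awaiting presence, 1: present, 2: absent
    for present in labels:
        if phase == 0 and present:
            phase = 1
        elif phase == 1 and not present:
            phase = 2
        elif phase == 2 and present:
            return True             # a (possibly new) user re-entered the seat
    return False
```

On a detected switch, the system would identify the returning user and, if new, trigger the recalibration of surgeon-console settings.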
A computerized method for estimating joint friction in a joint of a robotic wrist of an end effector. Sensor measurements of force or torque in a transmission that mechanically couples a robotic wrist to an actuator, are produced. Joint friction in a joint of the robotic wrist that is driven by the actuator is computed by applying the sensor measurements of force or torque to a closed form mathematical expression that relates transmission force or torque variables to a joint friction variable. A tracking error of the end effector is also computed, using a closed form mathematical expression that relates the joint friction variable to the tracking error. Other aspects are also described and claimed.
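A heavily simplified sketch of the idea above: with an antagonistic cable pair of measured tensions driving a pulley of radius r, the transmitted torque is r*(t1 - t2), and the shortfall relative to the torque the actuator should deliver is attributed to joint friction; the tracking error is then the deflection that friction implies against a joint stiffness. The actual closed-form expressions are in the disclosure; the model and all constants here are illustrative:

```python
def joint_friction(t1, t2, actuator_torque, r=0.005):
    """Friction torque as the gap between commanded and transmitted torque."""
    transmitted = r * (t1 - t2)       # net torque from the cable pair
    return actuator_torque - transmitted

def tracking_error(friction, stiffness=50.0):
    """Static deflection implied by the friction against joint stiffness."""
    return friction / stiffness
```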
For teleoperation of a surgical robotic system, the control accounts for a limited degree of freedom (DOF) of a tool. A projection from the greater DOF of the user input commands to the lesser DOF of the tool is included within or as part of the inverse kinematics. The projection identifies feasible motion in the end-effector domain. This projection allows for a general solution that works for tools having different degrees of freedom and will converge on a solution.
For teleoperation of a surgical robotic system, the control accounts for a limited degree of freedom (DOF) of a tool. A projection from the greater DOF of the user input commands to the lesser DOF of the tool is included within or as part of the inverse kinematics. The projection identifies feasible motion in the end-effector domain. This projection allows for a general solution that works for tools having different degrees of freedom and will converge on a solution.
For teleoperation of a surgical robotic system, the user command for the pose of the end effector is projected into a subspace reachable by the end effector. For example, a user command with six DOF is projected to a five DOF subspace. The six DOF user interface device may be used to more intuitively control, based on the projection, the end effector with the limited DOF relative to the user interface device.
For teleoperation of a surgical robotic system, the user command for the pose of the end effector is projected into a subspace reachable by the end effector. For example, a user command with six DOF is projected to a five DOF subspace. The six DOF user interface device may be used to more intuitively control, based on the projection, the end effector with the limited DOF relative to the user interface device.
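The six-to-five DOF projection described in the entries above can be sketched as an orthogonal projection: the commanded 6-vector loses its component along each unreachable direction, leaving the nearest command the tool can actually execute. Here the unreachable directions are assumed to be given as orthonormal unit vectors; a real system would derive them from the tool's kinematics:

```python
def project(cmd6, unreachable_dirs):
    """Orthogonal projection of a 6-vector onto the reachable subspace."""
    v = list(cmd6)
    for u in unreachable_dirs:       # assumed unit-norm, mutually orthogonal
        dot = sum(vi * ui for vi, ui in zip(v, u))
        v = [vi - dot * ui for vi, ui in zip(v, u)]
    return v
```

For a tool that cannot roll about its own axis, the roll component of the user command is simply removed, while the other five components pass through to the inverse kinematics unchanged.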
For a scalable filtering infrastructure, a library of filters each usable at different control rates is provided by defining filters in a continuous time mode despite eventual use for digital filtering. For implementation, a filter is selected and discretized for the desired control rate. The discretized filter is then deployed as a discrete time realization for convolution. In a distributed system with multiple control rates, the library may be used to more rapidly and conveniently generate the desired filters.
G16H 40/63 - TIC spécialement adaptées à la gestion ou à l’administration de ressources ou d’établissements de santé; TIC spécialement adaptées à la gestion ou au fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement local
For a scalable filtering infrastructure, a library of filters each usable at different control rates is provided by defining filters in a continuous time mode despite eventual use for digital filtering. For implementation, a filter is selected and discretized for the desired control rate. The discretized filter is then deployed as a discrete time realization for convolution. In a distributed system with multiple control rates, the library may be used to more rapidly and conveniently generate the desired filters.
A61B 34/00 - Chirurgie assistée par ordinateur; Manipulateurs ou robots spécialement adaptés à l’utilisation en chirurgie
G16H 40/40 - TIC spécialement adaptées à la gestion ou à l’administration de ressources ou d’établissements de santé; TIC spécialement adaptées à la gestion ou au fonctionnement d’équipement ou de dispositifs médicaux pour la gestion d’équipement ou de dispositifs médicaux, p.ex. pour planifier la maintenance ou les mises à jour
G16H 40/67 - TIC spécialement adaptées à la gestion ou à l’administration de ressources ou d’établissements de santé; TIC spécialement adaptées à la gestion ou au fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement d’équipement ou de dispositifs médicaux pour le fonctionnement à distance
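The scalable-filtering idea above can be sketched for a first-order low-pass: the filter is defined once in continuous time by its cutoff wc (rad/s), then discretized for whichever control rate a node runs at. This sketch uses the bilinear (Tustin) transform; the disclosure does not specify a discretization method, so that choice is an assumption:

```python
def lowpass_coeffs(wc, rate_hz):
    """Discretize H(s) = wc/(s + wc) at rate_hz via the Tustin transform.
    Returns (b0, b1, a1) for y[n] = b0*x[n] + b1*x[n-1] - a1*y[n-1]."""
    k = 2.0 * rate_hz                # Tustin: s -> k*(z - 1)/(z + 1)
    b0 = wc / (k + wc)
    b1 = b0
    a1 = (wc - k) / (k + wc)
    return b0, b1, a1

def run_filter(coeffs, xs):
    """Discrete-time realization: direct convolution of the difference eq."""
    b0, b1, a1 = coeffs
    y_prev = x_prev = 0.0
    out = []
    for x in xs:
        y = b0 * x + b1 * x_prev - a1 * y_prev
        out.append(y)
        x_prev, y_prev = x, y
    return out
```

The same continuous-time definition yields correct coefficients at any control rate, which is the point of keeping the library rate-independent: a node at 100 Hz and another at 1 kHz call `lowpass_coeffs` with the same `wc` and get matched behavior.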
100.
SURGICAL ROBOTIC SYSTEM HAVING ANTHROPOMETRY-BASED USER CONSOLE
Surgical robotic systems including a user console for controlling a robotic arm or a surgical robotic tool are described. The user console includes components designed to automatically adapt to anthropometric characteristics of a user. A processor of the surgical robotic system is configured to receive anthropometric inputs corresponding to the anthropometric characteristics and to generate an initial console configuration of the user console based on the inputs using a machine learning model. Actuators automatically adjust a seat, a display, or one or more pedals of the user console to the initial console configuration. The initial console configuration establishes a comfortable relative position between the user and the console components. Other embodiments are described and claimed.
A47C 3/20 - Chaises ou tabourets à siège réglable dans le sens vertical
A47C 7/56 - Eléments ou parties constitutives de sièges se rabattant, p.ex. de fauteuils de théâtre
A47C 7/72 - Adaptations pour incorporer des lampes, des postes radios, des bars, des téléphones, des dispositifs de ventilation, de chauffage ou de refroidissement, ou analogues
A47C 1/02 - Chaises ou fauteuils de relaxation ou liseuses
G16H 20/40 - TIC spécialement adaptées aux thérapies ou aux plans d’amélioration de la santé, p.ex. pour manier les prescriptions, orienter la thérapie ou surveiller l’observance par les patients concernant des thérapies mécaniques, la radiothérapie ou des thérapies invasives, p.ex. la chirurgie, la thérapie laser, la dialyse ou l’acuponcture
A61B 34/00 - Chirurgie assistée par ordinateur; Manipulateurs ou robots spécialement adaptés à l’utilisation en chirurgie
B25J 13/06 - Postes de commande, p.ex. pupitres, tableaux de contrôle
A61B 17/29 - Pinces pour la chirurgie faiblement invasive
A61B 5/107 - Mesure de dimensions corporelles, p.ex. la taille du corps entier ou de parties de celui-ci