Patent application number | Description | Published |
20080294288 | Autonomous Mobile Robot - A mobile robot is equipped with a range finder and a stereo vision system. The mobile robot is capable of autonomously navigating through urban terrain, generating a map based on data from the range finder and transmitting the map to the operator, as part of several reconnaissance operations selectable by the operator. The mobile robot employs a Hough transform technique to identify linear features in its environment, and then aligns itself with the identified linear features in order to navigate through the urban terrain; while at the same time, a scaled vector field histogram technique is applied to the combination of range finder and stereo vision data to detect and avoid obstacles the mobile robot encounters when navigating autonomously. Also, the missions performed by the mobile robot may include limitation parameters based on distance or time elapsed, to ensure completion of the autonomous operations. | 11-27-2008 |
20110208357 | Autonomous Mobile Robot - A mobile robot is equipped with a range finder and a stereo vision system. The mobile robot is capable of autonomously navigating through urban terrain, generating a map based on data from the range finder and transmitting the map to the operator, as part of several reconnaissance operations selectable by the operator. The mobile robot employs a Hough transform technique to identify linear features in its environment, and then aligns itself with the identified linear features in order to navigate through the urban terrain; while at the same time, a scaled vector field histogram technique is applied to the combination of range finder and stereo vision data to detect and avoid obstacles the mobile robot encounters when navigating autonomously. Also, the missions performed by the mobile robot may include limitation parameters based on distance or time elapsed, to ensure completion of the autonomous operations. | 08-25-2011 |
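The Hough-transform alignment described in the two applications above can be illustrated with a minimal sketch: vote range points into a discretized (theta, rho) space, take the strongest bin as the dominant linear feature, and steer parallel to it. The voting resolutions and the heading-error convention below are illustrative assumptions, not the patented implementation.

```python
import math
from collections import Counter

def dominant_line_angle(points, theta_steps=180, rho_res=0.05):
    """Hough voting over (theta, rho); returns the normal angle (radians)
    of the strongest linear feature in a set of 2D range points."""
    votes = Counter()
    for x, y in points:
        for i in range(theta_steps):
            theta = math.pi * i / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(i, round(rho / rho_res))] += 1
    (i, _rho_bin), _count = votes.most_common(1)[0]
    return math.pi * i / theta_steps

def heading_error(robot_heading, points):
    """Signed correction (radians) that turns the robot parallel to the
    dominant line; lines are direction-ambiguous, so fold to [-pi/2, pi/2)."""
    line_dir = dominant_line_angle(points) + math.pi / 2
    return (line_dir - robot_heading + math.pi / 2) % math.pi - math.pi / 2
```

For a vertical wall of range returns at x = 2, the dominant normal angle is 0 and a robot heading slightly off-parallel gets the opposite small correction.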
20100017046 | COLLABORATIVE ENGAGEMENT FOR TARGET IDENTIFICATION AND TRACKING - A collaborative engagement system comprises: at least two unmanned vehicles comprising an unmanned air vehicle including sensors configured to locate a target and an unmanned ground vehicle including sensors configured to locate and track a target; and a controller facilitating control of, and communication and exchange of data to and among the unmanned vehicles, the controller facilitating data exchange via a common protocol. The collaborative engagement system controls the unmanned vehicles to maintain line-of-sight between a predetermined target and at least one of the unmanned vehicles. | 01-21-2010 |
20100066587 | Method and System for Controlling a Remote Vehicle - A system for controlling a remote vehicle comprises: a LIDAR sensor, a stereo vision camera, and a UWB radar sensor; a sensory processor configured to process data from one or more of the LIDAR sensor, the stereo vision camera, and the UWB radar sensor; and a remote vehicle primary processor configured to receive data from the sensory processor and utilize the data to perform an obstacle avoidance behavior. | 03-18-2010 |
20110054717 | Remote Vehicle - A system for providing enhanced operator control of a remote vehicle driving at increased speeds comprises: a head-mounted display configured to be worn by the operator and track a position of the operator's head; a head-aimed camera mounted to the remote vehicle via a pan/tilt mechanism and configured to pan and tilt in accordance with the position of the operator's head, the head-aimed camera transmitting video to be displayed to the operator via the head-mounted display; and a computer running a behavior engine, the computer receiving input from the operator and one or more sensors, and being configured to utilize the behavior engine, operator input, sensor input, and one or more autonomous and/or semi-autonomous behaviors to assist the operator in driving the remote vehicle. The remote vehicle includes releasably mounted wheels and high-friction tracks. | 03-03-2011 |
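The head-aimed camera behavior above reduces to slaving the pan/tilt mechanism to the tracked head pose within the mechanism's travel limits. The ranges below are illustrative assumptions; the application does not specify them.

```python
def head_to_pan_tilt(head_yaw, head_pitch,
                     pan_range=(-170, 170), tilt_range=(-60, 60)):
    """Map tracked head yaw/pitch (degrees) to pan/tilt commands,
    clamped to the mechanism's travel limits (hypothetical values)."""
    pan = max(pan_range[0], min(pan_range[1], head_yaw))
    tilt = max(tilt_range[0], min(tilt_range[1], head_pitch))
    return pan, tilt
```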
20120035786 | Weight Shifting System for Remote Vehicle - The present teachings provide a system and a method to shift a center of gravity of an unmanned ground vehicle, the system configured to determine, by a movement sensor, a present turn angle of the vehicle, determine, by a processor, a desired turn angle of the vehicle according to a turn command received from a remote control device, determine, by the processor, a difference between the present turn angle and the desired turn angle, and control, by the processor, a weight shifting system of the vehicle to relocate a weight movably attached to the weight shifting system based on the difference between the present turn angle and the desired turn angle. | 02-09-2012 |
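A minimal sketch of the weight-shifting loop described above, assuming a simple proportional mapping from turn-angle error to lateral weight position. The gain and travel limit are illustrative; the application only states that the weight is relocated based on the difference between the present and desired turn angles.

```python
def weight_offset(present_angle, desired_angle, gain=0.01, max_offset=0.2):
    """Lateral weight position (metres) from the turn-angle error (degrees),
    clamped to the weight-shifting system's travel. Gains are hypothetical."""
    error = desired_angle - present_angle
    return max(-max_offset, min(max_offset, gain * error))
```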
20120290152 | Collaborative Engagement for Target Identification and Tracking - A method for controlling unmanned vehicles to maintain line-of-sight between a predetermined target and at least one unmanned vehicle. The method comprises: providing an unmanned air vehicle including sensors configured to locate a target and an unmanned ground vehicle including sensors configured to locate and track the target; communicating and exchanging data to and among the unmanned vehicles; controlling the unmanned air vehicle and the unmanned ground vehicle to maintain line-of-sight between a predetermined target and at least one of the unmanned vehicles; geolocating the predetermined target with the unmanned air vehicle using information regarding a position of the unmanned air vehicle and information regarding a position of the target relative to the unmanned air vehicle; and transmitting information defining the geolocation of the predetermined target to the unmanned ground vehicle so that the unmanned ground vehicle can perform path planning based on the geolocation. | 11-15-2012 |
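The geolocation step above can be illustrated as a flat-ground ray projection: combine the UAV's known position with the target's measured direction and intersect with the ground plane. This is a hypothetical sketch; the application does not specify the sensor model, and the bearing/depression parameterization is an assumption.

```python
import math

def geolocate(uav_pos, uav_yaw, bearing, depression):
    """Project a sensor ray from the UAV onto the ground plane (z = 0).

    uav_pos: (x, y, altitude); uav_yaw: UAV heading; bearing: target
    azimuth relative to that heading; depression: angle below horizontal.
    All angles in radians. Assumes locally flat terrain."""
    x, y, alt = uav_pos
    ground_range = alt / math.tan(depression)  # horizontal distance to target
    heading = uav_yaw + bearing
    return (x + ground_range * math.cos(heading),
            y + ground_range * math.sin(heading))
```

The resulting ground coordinates are what would be transmitted to the unmanned ground vehicle for path planning.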
20150231784 | ROBOT CONTROLLER LEARNING SYSTEM - A threshold learning control system for learning a controller of a robot. The system includes a threshold learning module, a regime classifier, and an exploratory controller, each receiving sensory inputs from a sensor system of the robot. The regime classifier determines a control regime based on the received sensory inputs and communicates the control regime to the threshold learning module. The exploratory controller also receives control parameters from the threshold learning module. A control arbiter receives commands from the exploratory controller and limits from the threshold learning module. The control arbiter issues modified commands based on the received limits to the robot controller. | 08-20-2015 |
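The control arbiter's limit enforcement can be sketched as a per-channel clamp on the exploratory controller's commands. The channel names and the (lo, hi) limit format here are assumptions for illustration, not the application's interface.

```python
def arbitrate(command, limits):
    """Clamp each commanded channel to the [lo, hi] limit supplied by the
    threshold learning module; channels without a limit pass through."""
    modified = {}
    for channel, value in command.items():
        lo, hi = limits.get(channel, (float("-inf"), float("inf")))
        modified[channel] = max(lo, min(hi, value))
    return modified
```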
20090002217 | Touchpad-enabled remote controller and user interaction methods - The handheld case of the remote control unit includes at least one touchpad and other sensors, such as acceleration sensors, case perimeter sensors, pressure sensors, and RF signal sensors. These sensors provide a rich array of sensory inputs that are classified by a pattern recognizer to generate control commands for both the consumer electronic equipment and the remote control unit itself. A power management system to conserve unit battery power is also responsive to the pattern recognizer to allow intelligent power management control. The control system uses the display of the consumer electronic equipment to provide instructions to the user, and the behavior of the remote control system uses what is displayed on the display as context information for pattern recognition. | 01-01-2009 |
20090262073 | TOUCH SENSITIVE REMOTE CONTROL SYSTEM THAT DETECTS HAND SIZE CHARACTERISTICS OF USER AND ADAPTS MAPPING TO SCREEN DISPLAY - Sensors around the periphery of the remote control unit detect contact with the user's hand. A trained model-based pattern classification system analyzes the periphery sensor data and makes a probabilistic prediction of the user's hand size. The hand size is then used to control a mapping system that defines how gestures by the user's thumb upon a touchpad of the remote control unit are mapped to the control region upon a separate display screen. | 10-22-2009 |
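The hand-size-adaptive mapping described above can be sketched as scaling the thumb's reachable pad area onto the full on-screen control region. The linear reach model and the normalized hand-size parameter are assumptions for illustration; the application describes a trained probabilistic classifier rather than this fixed formula.

```python
def map_thumb_to_screen(touch, hand_size, screen_region):
    """Map a normalized touchpad point to the screen control region,
    assuming a smaller hand can only reach a fraction of the pad.

    touch: (x, y) in [0, 1]^2; hand_size: predicted size in [0, 1];
    screen_region: (x0, y0, width, height). Hypothetical model."""
    reach = 0.5 + 0.5 * hand_size  # fraction of the pad a thumb covers
    x, y = touch
    sx = min(x / reach, 1.0)
    sy = min(y / reach, 1.0)
    x0, y0, w, h = screen_region
    return (x0 + sx * w, y0 + sy * h)
```

With a large hand the pad maps one-to-one onto the region; with a small hand the reachable half of the pad is stretched to cover the whole region.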
20110018817 | TOUCHPAD-ENABLED REMOTE CONTROLLER AND USER INTERACTION METHODS - The handheld case of the remote control unit includes at least one touchpad and other sensors, such as acceleration sensors, case perimeter sensors, pressure sensors, and RF signal sensors. These sensors provide a rich array of sensory inputs that are classified by a pattern recognizer to generate control commands for both the consumer electronic equipment and the remote control unit itself. A power management system to conserve unit battery power is also responsive to the pattern recognizer to allow intelligent power management control. The control system uses the display of the consumer electronic equipment to provide instructions to the user, and the behavior of the remote control system uses what is displayed on the display as context information for pattern recognition. | 01-27-2011 |