Entries
Document | Title | Date |
20080201016 | Robot and Method of Registering a Robot - A robot has a controllable arm which carries an instrument or tool. The robot is provided with a camera to obtain an image of a work piece, including images of markers and an indicator present on the work piece. The robot processes the images to determine the position of the markers within a spatial frame of reference. The robot is controlled to effect predetermined movements of the instrument or tool relative to the work piece. The processor is further configured to determine the position of the indicator and to respond to movement of the indicator within the spatial frame of reference of the robot when the markers are concealed, so as to determine a new position of the indicator and thus the new position of the work piece. Subsequently, the robot is controlled to effect predetermined movements relative to the new position of the work piece. | 08-21-2008 |
20080201017 | Medical tele-robotic system - A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient. | 08-21-2008 |
20080215184 | Method for searching target object and following motion thereof through stereo vision processing and home intelligent service robot using the same - A home intelligent service robot that recognizes a user and follows the motion of the user, and a method thereof, are provided. The home intelligent service robot includes a driver, a vision processor, and a robot controller. The driver moves the intelligent service robot according to an input moving instruction. The vision processor captures images through at least two cameras in response to a capturing instruction for following a target object, minimizes the amount of information in the captured images, and discriminates objects in the images into the target object and obstacles. When instruction information is collected from outside, the robot controller provides the vision processor with the capturing instruction for following the target object in the direction from which the instruction information was collected, and controls the intelligent service robot to move so as to follow the target object while avoiding obstacles, based on the discrimination information from the vision processor. | 09-04-2008 |
20080215185 | Unmanned ground robotic vehicle having an alternatively extendible and retractable sensing appendage - An unmanned robotic vehicle is capable of sensing an environment at a location remote from the immediate area of the vehicle frame. The unmanned robotic vehicle includes a retractable appendage with a sensing element. The sensing element can include a camera, chemical sensor, optical sensor, force sensor, or the like. | 09-04-2008 |
20080221734 | Categorical Color Perception System - The present invention relates to a categorical color perception system which automatically judges categorical colors and aims to judge a categorical color name correctly under various ambient lights. A test color measured in an experiment is input to an input layer portion corresponding to test color components | 09-11-2008 |
20080228320 | Robot - To provide a robot whose degree of freedom of design is not limited, which has a simple structure, and which reduces the load on an actuator of the neck part, the present invention provides a robot including at least a head part, a body part, and a neck link which connects the head part and the body part, wherein a surrounding-object distance measurement means is provided adjacent to the neck link in an upper portion of the body part between the head part and the body part, and a distance scanning field of the surrounding-object distance measurement means is provided parallel to a horizontal plane. | 09-18-2008 |
20080234866 | MASTER-SLAVE MANIPULATOR SYSTEM - In a master-slave manipulator system, the manipulation device can be manipulated intuitively even when clutch manipulation is performed. A master-slave manipulator system includes: a mode switching device for switching between a master-slave mode, in which the slave manipulator is controlled, and an observation device visual field tracking clutch mode, in which transmission of an operation command to the slave manipulator from the manipulation device is cut off to move the manipulation device to an optional position and orientation; a switching unit control section that reads a signal of the mode switching device to forward a mode signal to the manipulation device control section; and a visual field transform section that forwards a third control command to the manipulator control section and forwards a fourth control command to the visual field change control section, on the basis of an operation command read by the manipulation device control section at the time of the observation device visual field tracking clutch mode, so as to make the direction of motion of an image of the slave manipulator displayed on the display device agree with the direction of manipulation of the manipulation device. | 09-25-2008 |
20080249663 | Robot control information generator and robot - A robot control information generator generates control information for operating a robot equipped with a camera and a hand to grasp an object based on a two-dimensional code on the object. The two-dimensional code includes position-identifying patterns and an information pattern; the position within the two-dimensional code of each of the position-identifying patterns is specified beforehand, and the information pattern is generated by encoding information. The robot control information generator comprises an image input unit, a pattern detection unit, a position/posture calculation unit, a decoding device, and a control information-generating unit which generates the control information based on the information decoded by the decoding device and the position/posture information calculated by the position/posture calculation unit. | 10-09-2008 |
20080281470 | AUTONOMOUS COVERAGE ROBOT SENSING - An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal. | 11-13-2008 |
20080300723 | Teaching position correcting device - A teaching position correcting device which can easily correct, with high precision, teaching positions after shifting at least one of a robot and an object worked by the robot. Calibration is carried out using a vision sensor (i.e., CCD camera) that is mounted on a work tool. The vision sensor measures three-dimensional positions of at least three reference marks not aligned in a straight line on the object. The vision sensor is optionally detached from the work tool, and at least one of the robot and the object is shifted. After the shifting, calibration (this can be omitted when the vision sensor is not detached) and measuring of the three-dimensional positions of the reference marks are carried out again. A change in the relative positional relationship between the robot and the object is obtained using the results of measuring the three-dimensional positions of the reference marks before and after the shifting, respectively. To compensate for this change, the teaching position data that was valid before the shifting is corrected. The robot can have a measuring robot mechanical unit having a vision sensor, and a separate working robot mechanical unit that works the object. In this case, positions of the working robot mechanical unit before and after the shifting, respectively, are also measured. | 12-04-2008 |
20080312771 | Robots with Occlusion Avoidance Functionality - A method for controlling a robot having at least one visual sensor. A target for a motion of the robot is defined. A motion control signal adapted for the robot reaching the target is calculated. A collision avoidance control signal based on the closest points of segments of the robot and a virtual object between the visual sensing means and the target is calculated. The motion control signal and the collision avoidance control signal are weighted and combined. The weight of the motion control signal is higher when a calculated collision risk is lower. The motion of the robot is controlled according to the combined signal so that no segment of the robot enters the space defined by the virtual object. | 12-18-2008 |
20090018699 | WORK POSITIONING DEVICE - Regarding predetermined positioning criteria (M | 01-15-2009 |
20090024251 | Method and apparatus for estimating pose of mobile robot using particle filter - A method and apparatus for estimating the pose of a mobile robot using a particle filter is provided. The apparatus includes an odometer which detects a variation in the pose of a mobile robot, a feature-processing module which extracts at least one feature from an upward image captured by the mobile robot, and a particle filter module which determines current poses and weights of a plurality of particles by applying the mobile robot pose variation detected by the odometer and the feature extracted by the feature-processing module to previous poses and weights of the particles. | 01-22-2009 |
20090055023 | Telepresence robot with a printer - A remote controlled robot system that includes a robot and a remote controlled station. The robot includes a camera and a printer coupled to a mobile platform. The remote control station may display one or more graphical user interfaces with data fields. The graphical user interfaces allow a user to enter information into the data fields. The information is then transmitted to the robot and printed by the robot printer. The information may include a medical prescription and the name of the patient. Providing a robot printer allows the user to directly provide a medical prescription while remotely observing and interacting with the patient. | 02-26-2009 |
20090055024 | Robotic arm and control system - A robotic arm and control system includes a robotic arm which moves in response to one or more command signals. One or more “active” fiducials are located on the arm, each of which emits its own light. A 3D camera having an associated field-of-view is positioned such that at least one fiducial and a target object to be manipulated are in the FOV. To determine their spatial positions, the arm fiducials are activated and the target object is preferably illuminated with a scanning laser; the camera produces output signals which vary with the spatial locations of the fiducials and target object. A controller receives the output signals and uses the spatial position information as feedback to continuously guide the arm towards the target object. Multiple active fiducials may be employed, each having respective characteristics with which they can be differentiated. | 02-26-2009 |
20090082905 | METHOD AND APPARATUS FOR TRANSFORMING COORDINATE SYSTEMS IN A TELEMANIPULATION SYSTEM - In a telemanipulation system for manipulating objects located in a workspace at a remote worksite by an operator from an operator's station, such as in a remote surgical system, the remote worksite has a manipulator with an end effector for manipulating an object at the workspace, such as a body cavity; a controller including a hand control at the operator's station for remote control of the manipulator; an image capture device, such as a camera; and an image output device for reproducing a viewable real-time image. In the improvement, a position sensor associated with the image capture device senses position relative to the end effector, and a processor transforms the viewable real-time image into a perspective image with correlated manipulation of the end effector by the hand controller, such that the operator can manipulate the end effector and the manipulator as if viewing the workspace in true presence. Image transformation according to the invention includes translation, rotation and perspective correction. | 03-26-2009 |
20090105881 | Medical Tele-Robotic System - A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient. | 04-23-2009 |
20090105882 | Medical Tele-Robotic System - A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient. | 04-23-2009 |
20090118864 | METHOD AND SYSTEM FOR FINDING A TOOL CENTER POINT FOR A ROBOT USING AN EXTERNAL CAMERA - Disclosed is a method and system for finding a relationship between a tool-frame of a tool attached at a wrist of a robot and robot kinematics of the robot using an external camera. The position and orientation of the wrist of the robot define a wrist-frame for the robot that is known. The relationship of the tool-frame and/or the Tool Center Point (TCP) of the tool is initially unknown. For an embodiment, the camera captures an image of the tool. An appropriate point on the image is designated as the TCP of the tool. The robot is moved such that the wrist is placed into a plurality of poses. Each pose of the plurality of poses is constrained such that the TCP point on the image falls within a specified geometric constraint (e.g. a point or a line). A TCP of the tool relative to the wrist frame of the robot is calculated as a function of the specified geometric constraint and as a function of the position and orientation of the wrist for each pose of the plurality of poses. An embodiment may define the tool-frame relative to the wrist frame as the calculated TCP relative to the wrist frame. Other embodiments may further refine the calibration of the tool-frame to account for tool orientation and possibly for a tool operation direction. An embodiment may calibrate the camera using a simplified extrinsic technique that obtains the extrinsic parameters of the calibration, but not other calibration parameters. | 05-07-2009 |
20090118865 | ROBOT - Described herein is a robot having a camera mount that is movable along a curved surface of an upper part of the head, on a front side thereof, and two units of cameras that are mounted in the camera mount. The cameras are disposed such that the cameras are laterally separated from each other at an interval substantially equivalent to a lateral width of the head, respective extremities of the cameras are substantially flush with the front end of the head, and the cameras are positioned in close proximity of the upper end of the robot. | 05-07-2009 |
20090138123 | Robotic CBRNE Automated Deployment, Detection, and Reporting System - A system having a plurality of networked detectors for detecting chemical, biological, radiological, nuclear, or explosive agents is disclosed. The system includes a plurality of remote units, wherein each remote unit includes a robotic base, a lift, a sensor module, a transceiver, a navigation system, and a power source. The remote units communicate with a base station, which receives data from the remote units and determines, based on the data, whether an alarm condition exists. | 05-28-2009 |
20090143912 | SYSTEM AND METHOD FOR GRAPHICALLY ALLOCATING ROBOT'S WORKING SPACE - A system and method for graphically allocating a robot's working space are provided. The system includes an image extractor, a task-allocating server and a robot. A graphic user interface (GUI) of the task-allocating server includes a robot's working scene area, a space attribute allocating area and a robot's task area. Through the GUI, a user assigns a “wall” attribute to one space area in the robot's working scene area, or a “charging station” attribute to another. The user can also directly assign the robot to execute a specific task at a certain area. Hence, the user or remote controller, through his or her recognition of the environment, enables the robot to provide safer and more effective service. | 06-04-2009 |
20090143913 | IMAGE-BASED SELF-DIAGNOSIS APPARATUS AND METHOD FOR ROBOT - An image-based self-diagnosis apparatus and method for a robot determine the abnormality of a driving unit of a mobile robot by using a camera and report it to the user in real time. The image-based self-diagnosis method may include: capturing a reference image at the current location and storing the captured reference image; capturing a comparison image after making at least one of linear moves and rotational moves by a preset amount, and storing the captured comparison image; and determining the abnormality of the mobile robot by comparing the stored reference image and the comparison image with each other. | 06-04-2009 |
20090157228 | USER INTERFACE DEVICE OF REMOTE CONTROL SYSTEM FOR ROBOT DEVICE AND METHOD USING THE SAME - A user interface device of a remote control system for a robot, and a method using the same, are provided. The user interface device includes: a radio frequency (RF) unit for receiving, from a remote control robot, camera data and data from at least one distance-detecting sensor; a display unit having a main screen and at least one auxiliary screen; and a controller having an environment evaluation module for determining whether the received camera data are in a normal condition, and a screen display mode change module for displaying the camera data on the main screen if the received camera data are in a normal condition, and displaying the sensor data on the main screen if the received camera data are in an abnormal condition. | 06-18-2009 |
20090177323 | COMPANION ROBOT FOR PERSONAL INTERACTION - A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident. | 07-09-2009 |
20090198380 | METHODS FOR REAL-TIME AND NEAR REAL-TIME INTERACTIONS WITH ROBOTS THAT SERVICE A FACILITY - In accordance with aspects of the present invention, a service robot and methods for controlling such a robot are provided. In particular, the robot is configured to sense the presence of a person and to take a next action in response to sensing the presence of the person. As examples, the robot could leave the area, await commands from the person, or enter an idle or sleep state or mode until the person leaves. | 08-06-2009 |
20090198381 | METHODS FOR REPURPOSING TEMPORAL-SPATIAL INFORMATION COLLECTED BY SERVICE ROBOTS - Robots and methods implemented therein implement an active repurposing of temporal-spatial information. A robot can be configured to analyze the information to improve the effectiveness and efficiency of the primary service function that generated the information originally. A robot can be configured to use the information to create a three dimensional (3D) model of the facility, which can be used for a number of functions such as creating virtual tours of the environment, or porting the environment into video games. A robot can be configured to use the information to recognize and classify objects in the facility so that the ensuing catalog can be used to locate selected objects later, or to provide a global catalog of all items, such as is needed for insurance documentation of facility effects. | 08-06-2009 |
20090204260 | Door Opener Arrangement for Use with an Industrial Robot - A door opener arrangement for a robot coating device, used for detecting a position of a part ( | 08-13-2009 |
20090210092 | Method for self-localization of robot based on object recognition and environment information around recognized object - A method for self-localization of a robot, the robot including a camera unit, a database storing a map around a robot traveling path, and a position arithmetic unit estimating the position of the robot, includes acquiring, in the camera unit, an image around the robot. Further, the method includes recognizing, in the position arithmetic unit, individual objects in the image acquired by the camera unit, to generate position values on a camera coordinate system of local feature points of the individual objects and of local feature points of a surrounding environment including the individual objects; and estimating, in the position arithmetic unit, the position of the robot on the basis of the map and those position values. | 08-20-2009 |
20090240371 | Remote presence system mounted to operating room hardware - A robot system that includes a remote station and a robot face. The robot face includes a camera that is coupled to a monitor of the remote station and a monitor that is coupled to a camera of the remote station. The robot face and remote station also have speakers and microphones that are coupled together. The robot face may be coupled to a boom. The boom can extend from the ceiling of a medical facility. Alternatively, the robot face may be attached to a medical table with an attachment mechanism. The robot face and remote station allows medical personnel to provide medical consultation through the system. | 09-24-2009 |
20090240372 | EXTERNAL SYSTEM FOR ROBOTIC ACCURACY ENHANCEMENT - The metrology system (the system) actively determines the six-degree-of-freedom (6-DOF) pose of a motion device such as, but not limited to, an industrial robot employing an end-of-arm tool (EOAT). The system uses laser pointing devices without any inherent ranging capability, in conjunction with EOAT-mounted targets, to actively determine the pose of the EOAT at distinct work positions of at least one motion device. | 09-24-2009 |
20090265035 | Robotic Device Tester - A system, method, and device may include software and hardware which simplify and quicken configuration of the system for testing a device, enhance testing procedures which may be performed, and provide data via which to easily discern a cause and nature of an error which may result during testing. A camera may capture still images of a display screen of a tested device and another camera may capture video images of the tested device and a partner device. A wizard may be used to generate a configuration file based on one previously generated for a similar device. A mount for a tested device may be structured so that: it is suitable for mounting thereon a plurality of differently structured devices; and adjustments in a vertical direction and a horizontal direction in a plane and adjustments of an angle of the device relative to the plane may be easily made. | 10-22-2009 |
20090265036 | ROBOT OPERATOR CONTROL UNIT CONFIGURATION SYSTEM AND METHOD - A unified framework is provided for building common functionality into diverse operator control units. A set of tools is provided for creating controller configurations for varied robot types. Preferred controllers do one or more the following: allow uploading of configuration files from a target robot, adhere to common user interface styles and standards, share common functionality, allow extendibility for unique functionality, provide flexibility for rapid prototype design, and allow dynamic communication protocol switching. Configuration files may be uploaded from robots to configure their operator control units. The files may include scene graph control definitions; instrument graphics; control protocols; or mappings of control functions to scene graphics or control inputs. | 10-22-2009 |
20090271038 | SYSTEM AND METHOD FOR MOTION CONTROL OF HUMANOID ROBOT - A system and method for motion control of a humanoid robot are provided. The system includes a remote controller for recognizing three-dimensional image information including two-dimensional information and distance information of a user, determining first and second reference points on the basis of the three-dimensional image information, calculating variation in angle of a joint on the basis of three-dimensional coordinates of the first and second reference points, and transmitting a joint control signal through a wired/wireless network. The system also includes a robot for checking joint control data from the joint control signal received from the remote controller and varying an angle of the joint to move according to the user's motion. | 10-29-2009 |
20090281662 | Simulator for visual inspection apparatus - A simulator for a visual inspection apparatus is provided. The apparatus is equipped with a robot having an arm and a camera attached to a tip end of the arm, the camera inspecting a point being inspected on a workpiece. Using 3D profile data of the workpiece, lens information of the cameras, and operational data of the robot, imaging is simulated for a plurality of points being inspected on the workpiece. To allow the camera to image the points being inspected, a position and an attitude of the tip end of the robot arm are obtained. Based on the obtained position and attitude, it is determined whether or not the imaging is possible. When the imaging is possible, installation-allowed positions of the robot are decided and output as candidate positions for actually installing the robot. | 11-12-2009 |
20090287353 | SYSTEM AND METHOD FOR CONTROLLING A BIPEDAL ROBOT VIA A COMMUNICATION DEVICE - A system for controlling a bipedal robot via a communication device. The system acquires a mapping data and a current location of the bipedal robot via a Global Positioning System (GPS), determines a route on the mapping data, and directs movement of the bipedal robot until it reaches a preset destination. A method for controlling the robot and a storage device containing computer instructions for execution of the method are also provided. | 11-19-2009 |
20090312871 | SYSTEM AND METHOD FOR CALCULATING LOCATION USING A COMBINATION OF ODOMETRY AND LANDMARKS - Disclosed is a system and method for calculating a location in real time using a combination of odometry and artificial landmarks. The system comprises: a landmark detection unit that detects an image coordinate value of an artificial landmark, corresponding to a location in a two-dimensional image coordinate system with respect to a mobile robot, from an image obtained by photographing a specific space where the artificial landmarks are provided; a landmark identification unit that compares a predicted image value of the artificial landmark, obtained by converting the landmark's location coordinate value in the actual three-dimensional spatial coordinate system of the mobile robot into an image coordinate value in the two-dimensional image coordinate system, with the image coordinate value detected by the landmark detection unit, to detect the location coordinate value of the artificial landmark; a first location calculation unit that calculates a current location coordinate value of the mobile robot using a predetermined location calculation algorithm, based on the image coordinate value detected by the landmark detection unit and the location coordinate value detected by the landmark identification unit; a second location calculation unit that calculates a current location coordinate value of the mobile robot using a predetermined location calculation algorithm, based on odometry information of the mobile robot; and a main control unit that updates the current location coordinate value of the mobile robot, using the value calculated by the first location calculation unit when that value exists, or using the value obtained from the second location calculation unit when it does not. | 12-17-2009 |
20090319083 | Robot Confinement - A method of confining a robot in a work space includes providing a portable barrier signal transmitting device including a primary emitter emitting a confinement beam primarily along an axis defining a directed barrier. A mobile robot including a detector, a drive motor and a control unit controlling the drive motor is caused to avoid the directed barrier upon detection by the detector on the robot. The detector on the robot has an omnidirectional field of view parallel to the plane of movement of the robot. The detector receives confinement light beams substantially in a plane at the height of the field of view while blocking or rejecting confinement light beams substantially above or substantially below the plane at the height of the field of view. | 12-24-2009 |
20100004784 | Apparatus and method for effectively transmitting image through stereo vision processing in intelligent service robot system - A data transmission apparatus of an intelligent robot system and a method thereof are provided. The data transmission apparatus includes a vision processor, a communicating unit, and a controller. The vision processor collects images captured through a camera, and performs image processing on the collected images to minimize the quantity of information about unnecessary regions in the collected images. The communicating unit communicates with the robot server, transmits the processed image data from the vision processor to the robot server, and receives corresponding result data from the robot server. The controller controls the image processing and the transmission of the processed image data in the vision processor, and a corresponding operation of the robot terminal performed according to the result data received from the robot server. | 01-07-2010 |
20100010672 | Docking system for a tele-presence robot - A remote controlled robot system that includes a mobile robot with a robot camera and a battery plug module, and a remote control station that transmits commands to control the mobile robot. The system also includes a battery charging module that mates with the mobile robot battery plug module, and an alignment system that aligns the battery plug module with the battery charging module. The battery modules may also be aligned with the aid of video images of the battery charging module provided to the remote station by a camera located within the battery plug module. | 01-14-2010 |
20100017035 | ASSEMBLY OF A MILKING ROBOT WITH A MILKING ROBOT FEEDING PLACE, AND A DEVICE FOR GRIPPING AND DISPLACING MATERIAL - The invention provides a device for gripping and displacing material, provided with a gripper, comprising a controller for the gripper and a sensor for forming an image of an observation area, which sensor is connected to the controller, wherein the sensor comprises a source of modulated electromagnetic radiation, a receiver device for radiation reflected by an object, comprising a matrix of receivers, an optical device for projecting the reflected radiation onto the receiver device, and a sensor image processor to determine, for each receiver, a phase difference between the electromagnetic radiation emitted and the electromagnetic radiation reflected in order to calculate a distance from the receiver to the object. A device equipped with such a sensor is capable of functioning in a very reliable, safe and multifunctional manner, because it is capable of processing spatial images during operation. The invention also provides an assembly of the device and a feeding place, in particular of a milking robot and a milking robot feeding place. | 01-21-2010 |
20100049367 | METHOD OF CONTROLLING ROBOT FOR BRIDGE INSPECTION - The present invention relates to a method of controlling a robot for bridge inspection. In the present invention, whether a defect image is being received from a robot device is determined. As a result of the determination, when the defect image is being received, a current location of the robot device is stored. Whether a predetermined period of time has elapsed after the storage of the current location is determined. When the predetermined period of time has elapsed, a control command for moving the robot device to a prestored location is output. Whether a defect image at the same location as the prestored location is being received is determined. When the defect image at the same location is being received, a defect image at a previous time is compared with a defect image at the current time. A result of the comparison is displayed. | 02-25-2010 |
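The control flow in the bridge-inspection entry above (store the location when a defect image arrives, wait a predetermined period, then command a return to the prestored location for re-imaging) can be sketched as a single decision step. The dictionary fields, command strings, and the 10-second default wait are hypothetical illustrations, not from the patent:

```python
def control_step(robot, defect_received, now, wait=10.0):
    """One decision step for a bridge-inspection robot controller.

    robot: dict with 'location' (current pose), 'stored' (location at
    which a defect was first seen, or None) and 'stored_at' (time of
    storage, or None).  Returns a command string.
    """
    if defect_received and robot['stored'] is None:
        # Defect image arriving: remember where the robot is now.
        robot['stored'] = robot['location']
        robot['stored_at'] = now
        return 'store_location'
    if robot['stored'] is not None and now - robot['stored_at'] >= wait:
        # Predetermined period elapsed: go back and re-image the defect.
        return 'move_to_prestored_location'
    return 'continue_inspection'
```

A later step would compare the defect image taken at the prestored location with the earlier one and display the result.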
20100049368 | ROBOT - An exemplary robot includes an information collecting module, a controlling system and a driving module. The information collecting module comprises a voice identifying device, a detecting device and a motion sensing device. The information collecting module is configured for identifying the identities of robot users, detecting distances between the robot and surrounding objects, and sensing motion states of the robot. The controlling system is configured for generating a controlling signal and sending the controlling signal to the driving module. The driving module is configured for receiving the controlling signal, driving the robot to move, and adjusting the movement of the robot based on the controlling signal. | 02-25-2010 |
20100063629 | SYSTEM AND METHOD FOR RECIRCULATING PARTS - This invention relates to a system and method for feeding and recirculating parts for vision-based pickup. The system and method have a feeder that automatically recirculates parts that are not picked by a robot. The system has a feeder bowl, ramp and interchangeable picking plate, all of which may be vibrated to both feed parts and cause recirculation. | 03-11-2010 |
20100070078 | Apparatus and method for building map - An apparatus and method for building a map are provided. According to the apparatus and method, a path is generated on the basis of the degrees of uncertainty of features extracted from an image obtained while a mobile robot explores unknown surroundings, and the mobile robot travels along the generated path. The path based on the degrees of uncertainty of the features is generated and this may increase the accuracy of a feature map of the mobile robot or accuracy in self localization. | 03-18-2010 |
20100070079 | MOBILE VIDEOCONFERENCING ROBOT SYSTEM WITH NETWORK ADAPTIVE DRIVING - A remote control station that controls a robot through a network. The remote control station transmits a robot control command that includes information to move the robot. The remote control station monitors at least one network parameter and scales the robot control command as a function of the network parameter. For example, the remote control station can monitor network latency and scale the robot control command to slow down the robot with an increase in the latency of the network. Such an approach can reduce the amount of overshoot or overcorrection by a user driving the robot. | 03-18-2010 |
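The network-adaptive driving entry above gives one concrete example: slow the robot down as network latency rises, to reduce operator overshoot. The patent only says the command is scaled as a function of a monitored network parameter; the linear scaling rule and parameter names below are assumptions for illustration:

```python
def scale_drive_command(speed, latency_s, nominal_latency_s=0.1):
    """Scale a requested drive speed down as network latency grows.

    At or below the nominal latency the command passes through
    unchanged; beyond it, speed is reduced in inverse proportion to
    the measured latency (an assumed, illustrative scaling law).
    """
    factor = min(1.0, nominal_latency_s / max(latency_s, nominal_latency_s))
    return speed * factor
```

With a 0.1 s nominal latency, a measured latency of 0.2 s would halve the commanded speed, giving the operator more time to react to the delayed video feed.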
20100076600 | MOBILE ROBOT FOR TELECOMMUNICATION - A mobile robot provides telecommunication service between a remote user at a remote terminal and a local user in proximity to the mobile robot. The remote user can connect to the mobile robot via the Internet using a peer-to-peer VoIP protocol, and control the mobile robot to navigate about the mobile robot's environment. The mobile robot includes a microphone, a video camera and a speaker for providing telecommunication functionality between the remote user and the local user. Also, a hand-held RC unit permits the local user to navigate the mobile robot locally or to engage privacy mode for the mobile robot. When NAT or a firewall obstructs connection from the remote terminal to the mobile robot, an Internet server facilitates connection using methods such as STUN, TURN, or relaying. | 03-25-2010 |
20100100240 | TELEPRESENCE ROBOT WITH A CAMERA BOOM - A remote controlled robot with a head that supports a monitor and is coupled to a mobile platform. The mobile robot also includes an auxiliary camera coupled to the mobile platform by a boom. The mobile robot is controlled by a remote control station. By way of example, the robot can be remotely moved about an operating room. The auxiliary camera extends from the boom so that it provides a relatively close view of a patient or other item in the room. An assistant in the operating room may move the boom and the camera. The boom may be connected to a robot head that can be remotely moved by the remote control station. | 04-22-2010 |
20100100241 | AUTONOMOUS FOOD AND BEVERAGE DISTRIBUTION MACHINE - The invention proposes an autonomous mobile robotic device in the form of an integrated machine for producing beverages or liquid comestibles. | 04-22-2010 |
20100114374 | Apparatus and method for extracting feature information of object and apparatus and method for creating feature map - Technology for creating a feature map for localizing a mobile robot and extracting feature information of surroundings is provided. According to one aspect, feature information including a reflection function is extracted from information acquired using a 3D distance sensor and used as a basis for creating a feature map. Thus, a feature map that is less sensitive to change in the surrounding environment can be created, and a success rate of feature matching can be increased. | 05-06-2010 |
20100131102 | SERVER CONNECTIVITY CONTROL FOR TELE-PRESENCE ROBOT - A robot system with a robot that has a camera and a remote control station that can connect to the robot. The connection can include a plurality of privileges. The system further includes a server that controls which privileges are provided to the remote control station. The privileges may include the ability to control the robot, join a multi-cast session and receive audio/video from the robot. The privileges can be established and edited through a manager control station. The server may contain a database that defines groups of remote control stations that can be connected to groups of robots. The database can be edited to vary the stations and robots within a group. The system may also allow for connectivity with a remote control station within a user-programmable time window. | 05-27-2010 |
20100131103 | SERVER CONNECTIVITY CONTROL FOR TELE-PRESENCE ROBOT - A robot system with a robot that has a camera and a remote control station that can connect to the robot. The connection can include a plurality of privileges. The system further includes a server that controls which privileges are provided to the remote control station. The privileges may include the ability to control the robot, join a multi-cast session and receive audio/video from the robot. The privileges can be established and edited through a manager control station. The server may contain a database that defines groups of remote control stations that can be connected to groups of robots. The database can be edited to vary the stations and robots within a group. The system may also allow for connectivity with a remote control station within a user-programmable time window. | 05-27-2010 |
20100138042 | ROBOT SYSTEM - Teaching images are acquired at a plurality of separate teaching points on a running route extending from a running start position to a goal position, respectively, under a first light environmental condition and a light environmental condition different from the first light environmental condition, and the teaching images are stored. A present teaching image serving as a target for a robot body in a running direction at present is selected from the stored teaching images. A driving mechanism is controlled so as to increase the matching degree between the present teaching image and an actual image taken by a camera. | 06-03-2010 |
20100152897 | METHOD & APPARATUS FOR CONTROLLING THE ATTITUDE OF A CAMERA ASSOCIATED WITH A ROBOTIC DEVICE - A robot movement control device is connected to a communications network in a remote location relative to a robotic device that is also connected to the communications network. The robot movement control device is an electronic device with a video display for displaying a real-time video image sent to it by a camera associated with the robot. A robot movement control mechanism is included in the robot control device, and robot movement control commands, which include speed and directional information, are generated by the movement control mechanism. The control commands are sent by the robot control device over the network to the robot, which uses the commands to adjust its speed and direction of movement. A relationship between the motion of the robot and the attitude of the camera associated with the robot is established and used in conjunction with the detected motion of the robot to automatically adjust the attitude of the video camera associated with the robot. | 06-17-2010 |
20100161129 | SYSTEM AND METHOD FOR ADJUSTING AN IMAGE CAPTURING DEVICE ATTRIBUTE USING AN UNUSED DEGREE-OF-FREEDOM OF A MASTER CONTROL DEVICE - An image capturing device is robotically positioned and oriented in response to operator manipulation of a master control device. An unused degree-of-freedom of the master control device is used to adjust an attribute such as focusing of the image capturing device relative to a continually updated set-point. A deadband is provided to avoid inadvertent adjusting of the image capturing device attribute and haptic feedback is provided back to the master control device so that the operator is notified when adjusting of the attribute is initiated. | 06-24-2010 |
20100168918 | OBTAINING FORCE INFORMATION IN A MINIMALLY INVASIVE SURGICAL PROCEDURE - Methods of and a system for providing force information for a robotic surgical system. The method includes storing first kinematic position information and first actual position information for a first position of an end effector; moving the end effector via the robotic surgical system from the first position to a second position; storing second kinematic position information and second actual position information for the second position; and providing force information regarding force applied to the end effector at the second position utilizing the first actual position information, the second actual position information, the first kinematic position information, and the second kinematic position information. Visual force feedback is also provided via superimposing an estimated position of an end effector without force over an image of the actual position of the end effector. Similarly, tissue elasticity visual displays may be shown. | 07-01-2010 |
20100174409 | Robot slip detection apparatus and method - A technique of detecting a slip of a robot using a particle filter and feature information of a ceiling image is disclosed. A first position of the robot is computed using a plurality of particles, a second position of the robot is computed using the feature information of the ceiling image, and whether a slip has occurred is determined based on a distance between the first position and the second position. | 07-08-2010 |
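The slip-detection entry above reduces to comparing two independent pose estimates: one from the particle filter and one from ceiling-image features, declaring a slip when they diverge. A minimal sketch of that comparison follows; the threshold value and names are illustrative assumptions, not from the patent:

```python
import math


def slip_detected(particle_pose, ceiling_pose, threshold=0.3):
    """Declare a wheel slip when the particle-filter pose estimate and
    the pose derived from ceiling-image features disagree by more than
    a threshold distance (here in the same units as the poses; the
    0.3 default is an illustrative placeholder).
    """
    dx = particle_pose[0] - ceiling_pose[0]
    dy = particle_pose[1] - ceiling_pose[1]
    return math.hypot(dx, dy) > threshold
```

The intuition: odometry-driven particles drift when the wheels slip, while ceiling features do not, so a large disagreement indicates that slip has occurred.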
20100179691 | Robotic Platform - The present invention is a robotic mobile platform vehicle that can be thrown into hostile or hazardous environments to gather information and transmit it to a remotely located control station, and a system comprising the robotic mobile platform. The system of the invention is adapted to provide its operator with significant information without being exposed directly to actual or potential danger. One of the key features of the invention is that at least four imaging assemblies are mounted on the robotic platform and that the system has the processing ability to stitch the views taken by the four imaging devices together into an omni-directional image, allowing simultaneous viewing of a 360 degree field of view surrounding the mobile platform. Another feature is that the system comprises a touch screen GUI and the robotic mobile platform is equipped with processing means and appropriate software. This combination enables the user to steer the robotic platform simply by touching, in one of the displayed images, an object that he wants to investigate. The robotic platform can then either point its sensors towards that object or, if so instructed, compute the direction to the object and travel to it without any further input from the user. | 07-15-2010 |
20100185327 | MOVABLE ROBOT - A technique for fully recognizing the surrounding environment is provided by excluding the unknown regions that arise when parts of the robot's body obstruct its own view during operation. | 07-22-2010 |
20100185328 | Robot and control method thereof - Disclosed herein are a robot that supplies a projector service according to a user's context and a controlling method thereof. The robot includes a user detection unit detecting a user; a user recognition unit recognizing the user; an object recognition unit recognizing an object near the user; a position perception unit perceiving relative positions of the object and the user; a context awareness unit perceiving the user's context based on information on the user, the object and the relative positions between the user and the object; and a projector supplying a projector service corresponding to the user's context. | 07-22-2010 |
20100185329 | VISION AIDED CASE/BULK PALLETIZER SYSTEM - The vision aided case/bulk palletizer system of this invention is a process and apparatus for: providing a camera positioned over the dunnage supply line; initiating a frame grab of the dunnage supply line with the camera; using the frame grab to determine the position of the dunnage; using the frame grab to position the programmable robot over the dunnage; feeding the dunnage from the dunnage supply line to the load building area; and controlling the steps with the single programmable robot, microprocessor and software. This system provides for transfer of the dunnage when the position of the dunnage is skewed by using the frame grab to position the programmable robot over the skewed dunnage. In another embodiment, the camera is used to determine any void in the tier of product during the build of a tier of product, and also provides for error-proofing the transfer of dunnage. | 07-22-2010 |
20100191375 | DOCUMENTATION THROUGH A REMOTE PRESENCE ROBOT - A robotic system that is used in a tele-presence session. For example, the system can be used by medical personnel to examine, diagnose and prescribe medical treatment in the session. The system includes a robot that has a camera and is controlled by a remote station. The system further includes a storage device that stores session content data regarding the session. The data may include a video/audio taping of the session by the robot. The session content data may also include time stamps that allow a user to determine the times that events occurred during the session. The session content data may be stored on a server that is accessible by multiple users. Billing information may be automatically generated using the session content data. | 07-29-2010 |
20100191376 | NETWORK ARCHITECTURE FOR REMOTE ROBOT WITH INTERCHANGEABLE TOOLS - Systems, methods and devices for the remote control of a robot which incorporates interchangeable tool heads. Although applicable to many different industries, the core structure of the system includes a robot with a tool head interface for mechanically, electrically and operatively interconnecting a plurality of interchangeable tool heads to perform various work functions. The robot and tool head may include several levels of digital feedback (local, remote and wide area) depending on the application. The systems include a single umbilical cord to send power, air, and communications signals between the robot and a remote computer. Additionally, all communication (including video) is preferably sent in a digital format. Finally, a GUI running on the remote computer automatically queries and identifies all of the various devices on the network and automatically configures its user options to parallel the installed devices. Systems according to the preferred embodiments find particular application in the pipeline arts. For example, interchangeable tool heads may be designed to facilitate inspection, debris clearing, cleaning, relining, lateral cutting after relining, mapping, and various other common pipeline-related tasks. | 07-29-2010 |
20100234998 | MOVING ROBOT AND OPERATING METHOD THEREOF - A moving robot and its operation method are disclosed. The moving robot includes a moving body/object sensing unit that senses a movement of a human body within a certain distance, a traveling unit that controls a traveling speed and direction, and a controller that outputs a control signal for controlling the traveling speed according to pre-set data to the traveling unit. While the moving robot performs a cleaning operation and moves between locations, when a movement of a human body is sensed by the moving body/object sensing unit, the traveling speed is reduced so that the user can easily control the robot's external operation, and efficiency can be increased by utilizing the moving body/object sensing unit for operations in different modes. | 09-16-2010 |
20100262290 | Data matching apparatus, data matching method and mobile robot - A three-dimensional data matching system is disclosed. Data matching is performed by merging distance information and image information. Therefore, matching accuracy is improved even if a sensor with relatively low sensitivity is used. Matching data generated as a result of matching range data and CAD data is projected onto an image captured by a camera, an effective edge is extracted from the image, and an error of the matching data is corrected based on the effective edge, thereby improving matching accuracy. | 10-14-2010 |
20100268385 | MOVING ROBOT AND OPERATING METHOD FOR SAME - There are provided a moving robot and a method of operating the same. A bottom surface is photographed to sense a moving distance and a moving direction based on the input image data. The amount of light radiated to photograph the bottom surface is sensed to feedback-control the light emission level of a light source unit. The light source unit is controlled when errors are generated in sensing the image data. Therefore, the sensing ratio of the photographed image is improved so that the accuracy of calculating the position of the moving robot is improved. | 10-21-2010 |
20100274390 | METHOD AND SYSTEM FOR THE HIGH-PRECISION POSITIONING OF AT LEAST ONE OBJECT IN A FINAL LOCATION IN SPACE - The invention relates to a method and a system for the high-precision positioning of at least one object in a final location in space. | 10-28-2010 |
20100274391 | DETERMINING THE POSITION OF AN OBJECT - A method for determining the position of at least one object present within a working range of a robot by an evaluation system, wherein an image of at least one part of the working range of the robot is generated by a camera mounted on a robot. The image is generated during a motion of the camera and image data are fed to the evaluation system in real time, together with further data, from which the position and/or orientation of the camera when generating the image can be derived. The data are used for determining the position of the at least one object. | 10-28-2010 |
20100286827 | ROBOT WITH VISION-BASED 3D SHAPE RECOGNITION - The invention relates to a method for processing video signals from a video sensor in order to extract 3D shape information about objects represented in the video signals. | 11-11-2010 |
20100292840 | FLEXIBLE TWO-WHEELED SELF-BALANCING ROBOT SYSTEM AND ITS MOTION CONTROL METHOD - A flexible two-wheeled self-balancing robot system and its motion control method include a main controller. | 11-18-2010 |
20100292841 | ROBOT WITH 3D GRASPING CAPABILITY - A robotic harvester has a mobile platform. A programmable multi-axis robot arm is connected to the platform. The robot arm is mounted to a computer controller. A stereovision camera connected to the computer is mounted on the mobile platform. The camera views the area under the mobile platform and identifies objects in geometric coordinates. The robot arm is directed to the location of the object and a gripper on the robot arm grasps the object. The stem is separated from the object and the object is deposited on a sorting conveyor. The harvester is incrementally moved. A method of harvesting is disclosed. | 11-18-2010 |
20100298977 | MOBILE ROBOT AND PATH PLANNING METHOD THEREOF FOR MANIPULATING TARGET OBJECTS - A mobile robot and a path planning method are provided for the mobile robot to manipulate target objects in a space, wherein the space consists of a periphery area and a central area. With the present method, an initial position is defined and the mobile robot is controlled to move within the periphery area from the initial position. Next, the latest image is captured as the mobile robot moves, and a manipulating order is arranged according to the distances estimated between the mobile robot and each of the target objects in the image. The mobile robot is controlled to move and perform a manipulating action on each of the target objects in the image according to the manipulating order. The steps of obtaining the image, planning the manipulating order, and controlling the mobile robot to perform the manipulating action are repeated until the mobile robot returns to the initial position. | 11-25-2010 |
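The ordering step in the path-planning entry above — arrange a manipulating order by the estimated distance between the robot and each target object in the latest image — can be sketched directly. The 2D-point representation of targets is an assumption for illustration:

```python
import math


def manipulating_order(robot_pos, targets):
    """Order target objects by their estimated distance from the
    mobile robot, nearest first.  Positions are (x, y) pairs; the
    patent does not specify the distance-estimation method, so plain
    Euclidean distance is used here as a stand-in.
    """
    def dist(t):
        return math.hypot(t[0] - robot_pos[0], t[1] - robot_pos[1])
    return sorted(targets, key=dist)
```

The robot would then visit and manipulate targets in the returned order, re-planning from each newly captured image until it returns to the initial position.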
20100298978 | MANIPULATOR WITH CAMERA - Provided is a manipulator with at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided on a link adjacent to the link located at the manipulator tip end. At least one camera for recognizing a work piece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling a rotation angle of the rotating portion is driven according to a rotation angle of the link located at the manipulator tip end, and thus the camera is arranged in a direction perpendicular to the plane where the end effector can move when the end effector performs a grip work. In an assembly work, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane where the end effector can move. | 11-25-2010 |
20100305756 | Image taking system and electronic-circuit-component mounting machine - An image taking system including: (a) a lighting device capable of changing a light emission time to various time length values; (b) an image taking device configured to take an image of a subject portion while light is being emitted by the lighting device; (c) a subject-portion moving device configured to move the subject portion relative to the image taking device, and capable of changing a movement velocity of the subject portion relative to the image taking device, to various velocity values; and (d) a control device configured, during movement of the subject portion by the subject-portion moving device, to cause the lighting device to emit the light for one of the time length values as the light emission time and to cause the image taking device to take the image, and is configured to control the movement velocity, such that an amount of the movement of the subject portion for the above-described one of the time length values is not larger than a predetermined movement amount. | 12-02-2010 |
20100312393 | ROBOT WITH CAMERA - A shutter chance of a camera mounted on a robot arm is optimized to improve assembling efficiency. An image of a point on a finger and a position of an alignment mark is taken, and a position of the camera is measured by image processing. When the position of the camera is at a preset position threshold value or lower, and when a velocity of the camera measured by a velocity sensor is at a preset velocity threshold value or lower, a shutter is released. Furthermore, a logical AND with another condition, that an acceleration of the camera measured as a differential value of the velocity sensor is at a preset acceleration threshold value or lower, is taken for releasing the shutter. Thus, blur of an image for searching for a work is prevented, a position error is reduced, and working efficiency is improved by releasing the shutter earlier. | 12-09-2010 |
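The shutter-release decision in the entry above is a logical AND of three threshold tests on the camera's measured position error, velocity, and acceleration. A minimal sketch; the threshold defaults and parameter names are illustrative placeholders, not values from the patent:

```python
def release_shutter(position_error, velocity, acceleration,
                    pos_max=0.001, vel_max=0.01, acc_max=0.1):
    """Release the shutter only when the measured camera position
    error, velocity, and acceleration are all at or below their
    preset thresholds (logical AND of the three conditions).
    Units and default thresholds are assumed for illustration.
    """
    return (position_error <= pos_max
            and velocity <= vel_max
            and acceleration <= acc_max)
```

Requiring all three conditions lets the shutter fire as early as possible while still guaranteeing the camera has effectively settled, which is how the abstract claims to reduce blur and position error.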
20100324735 | METHOD AND DEVICE FOR FINE POSITIONING OF A TOOL HAVING A HANDLING APPARATUS - The present invention relates to a method and a device for the machining of an object using a tool. | 12-23-2010 |
20100324736 | Robot cleaner, docking station, robot cleaner system including robot cleaner and docking station, and method of controlling robot cleaner - A robot cleaner system is described including a docking station to form a docking area within a predetermined angle range of a front side thereof, to form docking guide areas which do not overlap each other on the left and right sides of the docking area, and to transmit a docking guide signal such that the docking guide areas are distinguished as a first docking guide area and a second docking guide area according to an arrival distance of the docking guide signal. The robot cleaner system also includes a robot cleaner to move to the docking area along a boundary between the first docking guide area and the second docking guide area when the docking guide signal is sensed and to move along the docking area so as to perform docking when reaching the docking area. | 12-23-2010 |
20100324737 | SHAPE DETECTION SYSTEM - A shape detection system includes a distance image sensor that detects an image of a plurality of detection objects and distances to the detection objects, the detection objects being randomly arranged in a container, a sensor controller that detects a position and an orientation of each of the detection objects in the container on the basis of the result of the detection performed by the distance image sensor and a preset algorithm, and a user controller that selects the algorithm to be used by the sensor controller and sets the algorithm for the sensor controller. | 12-23-2010 |
20100332033 | CONTROL OF MEDICAL ROBOTIC SYSTEM MANIPULATOR ABOUT KINEMATIC SINGULARITIES - A medical robotic system includes an entry guide with articulatable instruments extending out of its distal end, an entry guide manipulator providing controllable four degrees-of-freedom movement of the entry guide relative to a remote center, and a controller configured to manage operation of the entry guide manipulator in response to operator manipulation of one or more input devices. As the entry guide manipulator approaches a yaw/roll singularity, the controller modifies its operation to allow continued movement of the entry guide manipulator without commanding excessive joint velocities while maintaining proper orientation of the entry guide. | 12-30-2010 |
20110022231 | Apparatuses, Systems and Methods for Automated Crop Picking - Automated apparatuses and related methods for scanning, spraying, pruning, and harvesting crops from plant canopies. The apparatuses include a support structure comprising a frame, a central vertical shaft, and at least one module support member capable of rotating around a plant canopy. The support member supports a plurality of movable arms, each arm having at least one detector for probing the plant canopy. Embodiments further comprise applicators and/or manipulators for spraying, pruning, and harvesting crops from within the plant canopy. The methods of the present invention include causing the moveable arms attached to the support structure to be extended into the plant canopy, searching for crops, and transmitting and/or storing the search data. Embodiments further comprise detaching crops from the plant canopy and transporting them to a receptacle, applying a controlled amount of material within the plant canopy, or pruning inside of the plant canopy. | 01-27-2011 |
20110040409 | ROBOTIC VEHICLE WITH DRIVE MEANS AND METHOD FOR ACTIVATING DRIVE MEANS - The invention relates to a robotic vehicle. | 02-17-2011 |
20110046784 | ASYMMETRIC STEREO VISION SYSTEM - The different illustrative embodiments provide an apparatus that includes an autonomous vehicle, a modular navigation system, and an asymmetric vision module. The modular navigation system is coupled to the autonomous vehicle. The asymmetric vision module is configured to interact with the modular navigation system. | 02-24-2011 |
20110046785 | METHOD AND DEVICE FOR THE REMOVAL OF A LEAF FROM A CROP - Method and device for the removal of a part of a crop, such as a leaf. | 02-24-2011 |
20110054691 | METHOD AND APPARATUS FOR BIRDS CONTROL USING MOBILE ROBOT - Provided is a method including receiving information on a surrounding situation detected by the mobile robot; detecting birds from the received surrounding situation information; allocating a birds control mission to the mobile robot by extracting a birds control pattern corresponding to the surrounding situation; and verifying a result of the mobile robot performing the allocated birds control mission. By controlling birds in advance to prevent the loss of life and the economic loss that may be caused when birds collide with airplanes at an airport, it is possible to improve the productivity and efficiency of bird-repelling work at an airport and to support a new type of aviation maintenance business model, providing a safer airplane operating model while saving the personnel costs of preventing bird collisions. | 03-03-2011 |
20110071679 | EMBEDDED DIAGNOSTIC, PROGNOSTIC, AND HEALTH MANAGEMENT SYSTEM AND METHOD FOR A HUMANOID ROBOT - A robotic system includes a humanoid robot with multiple compliant joints, each moveable using one or more of the actuators, and having sensors for measuring control and feedback data. A distributed controller controls the joints and other integrated system components over multiple high-speed communication networks. Diagnostic, prognostic, and health management (DPHM) modules are embedded within the robot at the various control levels. Each DPHM module measures, controls, and records DPHM data for the respective control level/connected device in a location that is accessible over the networks or via an external device. A method of controlling the robot includes embedding a plurality of the DPHM modules within multiple control levels of the distributed controller, using the DPHM modules to measure DPHM data within each of the control levels, and recording the DPHM data in a location that is accessible over at least one of the high-speed communication networks. | 03-24-2011 |
20110082585 | METHOD AND APPARATUS FOR SIMULTANEOUS LOCALIZATION AND MAPPING OF MOBILE ROBOT ENVIRONMENT - Techniques that optimize performance of simultaneous localization and mapping (SLAM) processes for mobile devices, typically a mobile robot. In one embodiment, erroneous particles are introduced to the particle filtering process of localization. Monitoring the weights of the erroneous particles relative to the particles maintained for SLAM provides a verification that the robot is localized and detection that it is no longer localized. In another embodiment, cell-based grid mapping of a mobile robot's environment also monitors cells for changes in their probability of occupancy. Cells with a changing occupancy probability are marked as dynamic and updating of such cells to the map is suspended or modified until their individual occupancy probabilities have stabilized. In another embodiment, mapping is suspended when it is determined that the device is acquiring data regarding its physical environment in such a way that use of the data for mapping will incorporate distortions into the map, as for example when the robotic device is tilted. | 04-07-2011 |
20110082586 | HANDLING APPARATUS, CONTROL DEVICE, CONTROL METHOD, AND PROGRAM - A handling apparatus having a belt conveyor. | 04-07-2011 |
20110098859 | ROBOT SYSTEM AND WORKPIECE PICKING METHOD - A robot system includes a robot. A robot control device is configured to control an operation of the robot, and includes a workpiece shape memory configured to store a shape of workpieces. A shape sensor is configured to detect shape information about the workpieces. A target workpiece detector is configured to detect a graspable workpiece based on the shape information detected by the shape sensor. A grasping information memory is configured to store a grasping position indicating which portion of the graspable workpiece is to be grasped by the robot. A grasping operation controller is configured to control the robot to grasp the graspable workpiece detected by the target workpiece detector and to pick the grasped workpiece. A disturbing operation controller is configured to control, if no graspable workpiece is detected by the target workpiece detector, the robot to perform a workpiece disturbing operation. | 04-28-2011 |
20110106312 | System and Method For Multiple View Machine Vision Target Location - A machine vision system for controlling the alignment of an arm in a robotic handling system. The machine vision system includes an optical imager aligned to simultaneously capture an image that contains a view of the side of an object, such as a test tube, along with a view of the top of the object provided by a mirror appropriately positioned on the robotic arm. The machine vision system further includes a microcontroller or similar device for interpreting both portions of the image. For example, the microcontroller may be programmed to determine the location of the object in the reflected portion of the image and transpose that information into the location of the object relative to the robotic arm. The microcontroller may also be programmed to decode information positioned on the object by interpreting visual information contained in the other portion of the captured image. | 05-05-2011 |
20110106313 | BRIDGE INSPECTION ROBOT CAPABLE OF CLIMBING OBSTACLE - Provided is a bridge inspection robot which is capable of climbing over an obstacle, the bridge inspection robot including a climbing-over portion. | 05-05-2011 |
20110137463 | SYSTEMS AND METHODS ASSOCIATED WITH HANDLING AN OBJECT WITH A GRIPPER - A system associated with handling an object with a gripper includes a sensor that is configured to measure spatially distributed data that represents the position of the object that is handled by the gripper. The system further includes a computing unit that is configured to determine the behavior of the object. | 06-09-2011 |
20110153082 | SENSOR SYSTEM FOR DETECTING THE SURFACE STRUCTURES OF SEVERAL PACKAGED ARTICLES - An exemplary embodiment of the invention relates to a sensor system for detecting the surface structures of several packaged articles. An exemplary system comprises at least one laser distance detector that functions according to a triangulation principle and that determines the distance between the laser distance detector and a surface structure of a packaged article. The laser distance detector has at least one analog output via which a distance-proportional analog signal can be emitted. The analog output of at least one laser distance detector is in communication with an evaluation unit via an amplifier circuit. The amplifier circuit encompasses at least one operational amplifier that has two inputs, and the analog signal of the laser distance detector is present at a first input of the at least one operational amplifier. A variable reference voltage is present at the other input of the at least one operational amplifier. The reference voltage may be obtained from the analog signal of the analog output, and this analog signal may be present at the other input of the operational amplifier via a low-pass filter. The output of the at least one operational amplifier may be connected to the evaluation unit, as a result of which the amplifier circuit is configured in such a way that abrupt changes in the analog signal bring about a change in the output signal of the at least one operational amplifier. More gradual changes in the analog signal do not bring about a substantial change in the output signal of the at least one operational amplifier. The evaluation unit may evaluate the output signal of the at least one operational amplifier. | 06-23-2011 |
20110172821 | AUTOMATED TIRE INFLATION SYSTEM - A system and method for automatically inflating tires mounted on a vehicle without requiring the occupants to leave the interior of the vehicle. In one aspect, the present invention is directed to a system for automatically inflating a tire of a vehicle. The system determines the location of a valve stem of the tire and includes a robotic arm for inflating the tire. The robotic arm attaches to the valve stem based on the determined valve stem location and supplies air to the tire from an air supply. Once the valve stem is located, the robotic arm attaches to it, inflates the tire, and detaches upon determining that a predetermined tire air pressure has been attained. | 07-14-2011 |
20110172822 | Companion Robot for Personal Interaction - A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident. | 07-14-2011 |
20110184558 | Robot And Method For Controlling A Robot - The invention relates to a robot. | 07-28-2011 |
20110190936 | Portable Power Tool - The present invention relates to a portable power tool. | 08-04-2011 |
20110196534 | APPARATUS AND METHOD FOR INSPECTION OF UNDERGROUND PIPES - A system for inspecting an underground conduit from within comprises a data acquisition subsystem configured to be placed within the conduit and to move along at least a portion of the conduit to obtain data regarding the conduit. The system comprises a data storage subsystem configured to be placed within the conduit and to move along the conduit. The data storage subsystem receives and stores at least a portion of the data from the data acquisition subsystem for retrieval after the data acquisition subsystem has moved along the conduit. | 08-11-2011 |
20110196535 | LINE INSPECTION ROBOT AND SYSTEM - The present invention relates to an overhead transmission line inspection robot and system for inspecting transmission line components and right of way conditions. The overhead transmission line inspection robot includes a communications and control system adapted to control the robot and transmit information and a drive system for propelling the robot along a shield wire to enable inspection over a large area. The robot further includes a camera adapted to inspect right of way and component conditions; a light detection and ranging (LiDar) sensor adapted to measure conductor position, vegetation, and nearby structures; and a global positioning system adapted to identify the robot's position and speed. | 08-11-2011 |
20110196536 | LINE INSPECTION ROBOT AND SYSTEM - The present invention relates to an overhead transmission line inspection robot and system for inspecting transmission line components and right of way conditions. The line inspection robot includes at least one drive system for propelling the robot along a line, a platform adapted to pivot relative to the at least one drive system, and a control system adapted to control the robot. | 08-11-2011 |
20110208358 | APPARATUS FOR SPLASH ZONE OPERATIONS - System for maintenance and inspection of structures located in hard-to-reach places, using a remote-controlled arm. The arm includes an arrangement for fixing it to the structure, consists of at least two joints, has the ability to change working equipment, carries a camera, and is controlled from a control centre. | 08-25-2011 |
20110218674 | REMOTE PRESENCE SYSTEM INCLUDING A CART THAT SUPPORTS A ROBOT FACE AND AN OVERHEAD CAMERA - A tele-presence system that includes a cart. The cart includes a robot face that has a robot monitor, a robot camera, a robot speaker, a robot microphone, and an overhead camera. The system also includes a remote station that is coupled to the robot face and the overhead camera. The remote station includes a station monitor, a station camera, a station speaker and a station microphone. The remote station can display video images captured by the robot camera and/or overhead camera. By way of example, the cart can be used in an operating room, wherein the overhead camera can be placed in a sterile field and the robot face can be used in a non-sterile field. The user at the remote station can conduct a teleconference through the robot face and also obtain a view of a medical procedure through the overhead camera. | 09-08-2011 |
20110218675 | ROBOT SYSTEM COMPRISING VISUAL SENSOR - A robot system. | 09-08-2011 |
20110245973 | PROTOCOL FOR A REMOTELY CONTROLLED VIDEOCONFERENCING ROBOT - A robotic system that includes a robot and a remote station. The remote station can generate control commands that are transmitted to the robot through a broadband network. The control commands can be interpreted by the robot to induce action such as robot movement or focusing a robot camera. The robot can generate reporting commands that are transmitted to the remote station through the broadband network. The reporting commands can provide positional feedback or system reports on the robot. | 10-06-2011 |
20110245974 | ROBOT DEVICE, METHOD OF CONTROLLING ROBOT DEVICE, AND PROGRAM - There is provided a robot device including an instruction acquisition unit that acquires an order for encouraging a robot device to establish joint attention on a target from a user, a position/posture estimation unit that estimates a position and posture of an optical indication device, which is operated by the user to indicate the target by irradiation of a beam, in response to acquisition of the order, and a target specifying unit that specifies a direction of the target indicated by irradiation of the beam based on an estimation result of the position and posture and specifies the target on an environment map representing a surrounding environment based on a specifying result of the direction. | 10-06-2011 |
20110245975 | MILKING APPARATUS AND PROCESS - The present invention relates to milking apparatus. The milking apparatus comprises sensor apparatus. | 10-06-2011 |
20110282492 | METHOD OF CONTROLLING A ROBOTIC TOOL - A method of controlling a robot system includes the steps of providing a tool supported by a moveable mechanism of the robot system, providing a workpiece supported by a holder, generating an image of the workpiece, extracting a data from the image, the data relating to a feature of the workpiece, generating a continuous three-dimensional path along the workpiece using data extracted from the image, and moving the tool along the path. | 11-17-2011 |
20110288682 | Telepresence Robot System that can be Accessed by a Cellular Phone - A robot system with a robot that has a camera, monitor, a microphone and a speaker. A communication link can be established with the robot through a cellular phone. The link may include an audio only communication. Alternatively, the link may include audio and video communication between the cellular phone and the robot. The phone can transmit its resolution to the robot and cause the robot to transmit captured images at the phone resolution. The user can cause the robot to move through input on the cellular phone. For example, the phone may include an accelerometer that senses movement, and movement commands are then sent to the robot to cause a corresponding robot movement. The phone may have a touch screen that can be manipulated by the user to cause robot movement and/or camera zoom. | 11-24-2011 |
20110301758 | METHOD OF CONTROLLING ROBOT ARM | 12-08-2011 |
20110301759 | GRAPHICAL INTERFACE FOR A REMOTE PRESENCE SYSTEM - A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and microphone of the robot, respectively. The remote station may include a display user interface that has a variety of viewable fields and selectable buttons. | 12-08-2011 |
20120004775 | ROBOT APPARATUS AND CONTROL METHOD THEREFOR - A robot apparatus includes a robot mechanism having a plurality of joints, and actuators that drive joint axes of the robot mechanism. The robot apparatus includes a robot controller that controls the driving of the actuators based on a cost function that is a function of torque reference inputs for the actuators. | 01-05-2012 |
20120016522 | PROCESS AND MACHINE FOR IDENTIFICATION AND WORKING OF DEFECTS ON USED TYRES - An automatic process for the identification and working of defects. | 01-19-2012 |
20120022691 | AUTOMATED POSITIONING OF AN ORGANIC POLARIZED OBJECT - A method, system and apparatus to position an organic polarized object to a predetermined orientation and a predetermined location are provided. In an embodiment, an image of the organic polarized object is captured through an image capture device. The image of the organic polarized object is converted to an image data set. This image data set is further converted to a dimension data set. A first location and a first orientation of the organic polarized object are determined through a processor. A pressure is applied to secure the organic polarized object. The organic polarized object is secured through a robotic end effector and may be moved to a predetermined location and a predetermined orientation. The organic polarized object is adjusted to the predetermined orientation. The organic polarized object is positioned at a predetermined location. The predetermined location and predetermined orientation may be selected by a user. | 01-26-2012 |
20120059517 | OBJECT GRIPPING SYSTEM, OBJECT GRIPPING METHOD, STORAGE MEDIUM AND ROBOT SYSTEM - A system comprises: a measurement unit adapted to measure a position/orientation of at least one target object based on an image obtained by capturing the at least one target object; a selection unit adapted to select at least one grippable target object based on the position/orientation; a determination unit adapted to determine, as an object to be gripped, a grippable target object in a state with a highest priority from the at least one grippable target object based on priorities set in advance for states including gripping positions/directions; a gripping unit adapted to grip the object to be gripped in the state with the highest priority; and a changing unit adapted to change the state of the gripped object, to a state in which the gripped object is assembled to the other object. | 03-08-2012 |
20120065779 | ROBOT - A robot includes a gripping section adapted to grip an object by opening and closing a pair of finger sections, a moving device adapted to relatively move the object and the gripping section, and a control device adapted to control the moving device to move the gripping section relatively toward the object and dispose the pair of finger sections in a periphery of the object, and then control the gripping section to open and close the pair of finger sections in a plane parallel to a mounting surface on which the object is mounted, pinch the object between the pair of finger sections from a lateral side of the object, and grip the object with the gripping section at at least three contact points. | 03-15-2012 |
20120065780 | ROBOT - A robot includes a gripping section having a pair of finger sections and a main body section to which the pair of finger sections are attached. One end of each finger section is rotatably connected around a first rotating shaft disposed at a position separate from the main body section, and the pair of finger sections open and close by swinging their other ends, on a plane parallel to a mounting surface on which an object is mounted, centered on the first rotating shaft, to thereby grip the object. The robot further includes a moving device adapted to relatively move the object and the gripping section, and a control device adapted to control the moving device to move the gripping section relatively toward the object and grip the object with the gripping section at at least three contact points. | 03-15-2012 |
20120072023 | Human-Robot Interface Apparatuses and Methods of Controlling Robots - A method of controlling a robot using a human-robot interface apparatus in two-way wireless communication with the robot includes displaying on a display interface a two-dimensional image, an object recognition support tool library, and an action support tool library. The method further includes receiving a selected object image representing a target object, comparing the selected object image with a plurality of registered object shape patterns, and automatically recognizing a registered object shape pattern associated with the target object if the target object is registered with the human-robot interface. The registered object shape pattern may be displayed on the display interface, and a selected object manipulation pattern selected from the action support tool library may be received. Control signals may be transmitted to the robot from the human-robot interface. Embodiments may also include human-robot apparatuses (HRI) programmed to remotely control a robot. | 03-22-2012 |
20120072024 | TELEROBOTIC SYSTEM WITH DUAL APPLICATION SCREEN PRESENTATION - A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and a microphone of the robot, respectively. The remote station may include a visual display that displays both a first screen field and a second screen field. The first screen field may display a video image provided by a robot camera. The second screen field may display information such as patient records. The information from the second screen field may be moved to the first screen field and also transmitted to the robot for display by a robot monitor. The user at the remote station may annotate the information displayed by the robot monitor to provide a more active video-conferencing experience. | 03-22-2012 |
20120095597 | ROBOT CLEANER - Provided is a robot cleaner, and more particularly a robot cleaner which detects whether a foreign material storage unit is separated. The robot cleaner includes a main body including a suction motor, a foreign material storage unit separably disposed within the main body, the foreign material storage unit storing foreign materials contained in sucked air, a foreign material cover for selectively shielding one side of the foreign material storage unit, and a detection unit for detecting whether the foreign material cover is opened. | 04-19-2012 |
20120109376 | CLEANER AND CONTROLLING METHOD OF THE SAME - Disclosed are a robot cleaner and a method for controlling the same. The robot cleaner may prevent repeated executions of a cleaning operation by recognizing its position through an absolute position recognition unit in a case where the cleaning operation is performed again after being forcibly stopped due to arbitrary causes. The robot cleaner may also correct a position recognition error of a relative position recognition unit by using an image detection unit, and may effectively perform a cleaning operation based on a similarity between an image detected by the image detection unit and an image of a cleaning region. This may improve system efficiency and stability, and enhance a user's convenience. | 05-03-2012 |
20120109377 | AUTOFOCUS AND/OR AUTOSCALING IN TELESURGERY - Robotic, telerobotic, and/or telesurgical devices, systems, and methods take advantage of robotic structures and data to calculate changes in the focus of an image capture device in response to movement of the image capture device, a robotic end effector, or the like. As the size of an image of an object shown in the display device varies with changes in a separation distance between that object and the image capture device used to capture the image, a scale factor between a movement command input may be changed in response to moving an input device or a corresponding master/slave robotic movement command of the system. This may enhance the perceived correlation between the input commands and the robotic movements as they appear in the image presented to the system operator. | 05-03-2012 |
20120109378 | ROBOT REFRIGERATOR AND SYSTEM HAVING THE SAME - Disclosed are a robot refrigerator and a robot refrigerator system. The robot refrigerator can be remotely controlled. The robot refrigerator generates image information from a surrounding image and transmits the generated image information to a wireless communication device. Then, the wireless communication device remotely controls the robot refrigerator, or monitors or remotely controls the robot refrigerator in real time, so that the robot refrigerator can easily avoid an obstacle to thus minimize a movement time of the robot refrigerator. Thus, user convenience and system reliability can be improved. | 05-03-2012 |
20120116588 | ROBOT SYSTEM AND CONTROL METHOD THEREOF - A robot system and a control method thereof in which, when a robot is located in a docking region, the robot calculates a distance by emitting infrared rays and detecting ultrasonic waves oscillated from a charging station, measures a distance from the charging station, and performs docking with the charging station. The distance between the robot and the charging station is precisely measured, thereby performing smooth and correct docking of the robot with the charging station. Further, the robot emits infrared rays only while performing docking with the charging station and thus reduces power consumption required for infrared ray emission, and wakes up a circuit in the charging station based on the infrared rays emitted from the robot and thus reduces power consumption of the charging station. | 05-10-2012 |
20120143373 | AUTOMATED STEERING WHEEL LEVELING SYSTEM AND METHOD - The present invention provides an automated steering wheel leveling system and method. Particularly, the automated steering wheel leveling system includes a machine vision, a plurality of motor cylinders, a motor, and a robot, each operated by a process PC. The machine vision photographs a steering wheel to obtain position information of the steering wheel and determines a stroke of a motor cylinder and a grip position of a gripper using the position information. The plurality of motor cylinders move a plurality of grippers to the steering wheel to secure the steering wheel. The motor rotates the steering wheel in order to adjust a zero-point of the steering wheel. The robot then moves the machine vision, the motor cylinder, and the motor to the steering wheel to align a shaft of the motor with a shaft of the steering wheel. | 06-07-2012 |
20120143374 | ROBOT ACTION BASED ON HUMAN DEMONSTRATION - Embodiments of the invention provide an approach for reproducing a human action with a robot. The approach includes receiving data representing motions and contact forces of the human as the human performs the action. The approach further includes approximating, based on the motions and contact forces data, the center of mass (CoM) trajectory of the human in performing the action. Finally, the approach includes generating a planned robot action for emulating the designated action by solving an inverse kinematics problem having the approximated human CoM trajectory as a hard constraint and the motion capture data as a soft constraint. | 06-07-2012 |
20120143375 | MILKING ROBOT AND METHOD FOR TEAT CUP ATTACHMENT - A milking robot for teat cup attachment includes a robot arm having a gripper for holding at least one teat cup at a time, an image recording device mounted on the robot arm and provided to record at least one image of the teats of a milking animal, and a control device provided to control the robot arm to position the teat cup at a teat of the milking animal based on the at least one image of her teats. Before recording the at least one image of the teats, the image recording device records at least one image of the animal's hind legs; and before the control device controls the robot arm to attach the teat cup to the teat, it controls the robot arm to move the teat cup between the hind legs, from the animal's rear towards her udder, based on the at least one image of the hind legs. The milking robot further includes a pivoting device for pivoting the image recording device with respect to the gripper of the robot arm between the recording of the at least one image of the hind legs and the recording of the at least one image of the teats. | 06-07-2012 |
20120158179 | ROBOT CONTROL APPARATUS - According to an embodiment, a target trajectory that takes into account the hardware constraints of a robot is generated, based on results obtained by calculating, temporally interpolating, and estimating image feature amounts from a captured image. | 06-21-2012 |
20120158180 | OBJECT GRIPPING APPARATUS, METHOD OF CONTROLLING THE SAME AND STORAGE MEDIUM - An object gripping apparatus includes an image capturing unit for capturing a region including a plurality of works, an obtaining unit for obtaining distance information of the region, a measurement unit for measuring three-dimensional positions/orientations of a plurality of gripping-candidate works out of the plurality of works based on the image and distance information, thereby generating three-dimensional position/orientation information, a selection unit for selecting a gripping-target work based on the three-dimensional position/orientation information, a gripping unit for gripping the gripping-target work, and an updating unit for updating the three-dimensional position/orientation information by measuring three-dimensional positions/orientations of the gripping-candidate works at a time interval during gripping of the gripping-target work. When the gripping ends, the next gripping-target work is selected based on the updated three-dimensional position/orientation information of the gripping-candidate works. | 06-21-2012 |
20120165984 | MOBILE ROBOT APPARATUS, DOOR CONTROL APPARATUS, AND DOOR OPENING AND CLOSING METHOD THEREFOR - A mobile robot apparatus includes a video recognition unit for recognizing a position of an opening button mounted around a door through video analysis after acquiring peripheral video information. Further, the mobile robot apparatus includes a mobile controller for performing an operation on the opening button at the position recognized by the video recognition unit to generate an opening selection signal, thereby allowing a door control apparatus to open the door according to the generated opening selection signal. | 06-28-2012 |
20120165985 | MAINTAINING A WIND TURBINE WITH A MAINTENANCE ROBOT - The present invention relates to a wind turbine maintenance system and a method of maintenance therein. A wind turbine maintenance system is provided, for carrying out a maintenance task in a nacelle of a wind turbine, comprising a maintenance robot, further comprising a detection unit, for identifying a fault in a sub-system in the nacelle and generating fault information, a processor unit, adapted to receive fault information from the detection unit and control the maintenance robot to perform a maintenance task, a manipulation arm to perform the maintenance task on the identified sub-system. In another aspect, a method of carrying out a maintenance task in a wind turbine is provided. | 06-28-2012 |
20120165986 | ROBOTIC PICKING OF PARTS FROM A PARTS HOLDING BIN - A robot system. | 06-28-2012 |
20120185093 | ROBOT MOUNTING DEVICE - A robot mounting device includes a pair of spaced-apart arms adapted to retain the robot body of a surveillance robot. The robot mounting device also includes a latching mechanism to secure the robot mounting device to a rifle. The positioning of the robot can be adjusted within the robot mounting device to site a camera in the axle of the robot with respect to the rifle. The rifle can then be oriented to obtain visual imagery of an environment. | 07-19-2012 |
20120185094 | Mobile Human Interface Robot - A mobile robot that includes a drive system, a controller in communication with the drive system, and a volumetric point cloud imaging device supported above the drive system at a height of greater than about one foot above the ground and directed to be capable of obtaining a point cloud from a volume of space that includes a floor plane in a direction of movement of the mobile robot. The controller receives point cloud signals from the imaging device and issues drive commands to the drive system based at least in part on the received point cloud signals. | 07-19-2012 |
20120185095 | Mobile Human Interface Robot - A mobile human interface robot that includes a base defining a vertical center axis and a forward drive direction and a holonomic drive system supported by the base. The drive system has first, second, and third driven drive wheels, each trilaterally spaced about the vertical center axis and having a drive direction perpendicular to a radial axis with respect to the vertical center axis. The robot further includes a controller in communication with the holonomic drive system, a torso supported above the base, and a touch sensor system in communication with the controller. The touch sensor system is responsive to human contact. The controller issues drive commands to the holonomic drive system based on a touch signal received from the touch sensor system. | 07-19-2012 |
20120185096 | Operating a Mobile Robot - A method of operating a mobile robot to traverse a threshold includes detecting a threshold proximate the robot. The robot includes a holonomic drive system having first, second, and third drive elements configured to maneuver the robot omni-directionally. The method further includes moving the first drive element onto the threshold from a first side and moving the second drive element onto the threshold to place both the first and second drive elements on the threshold. The method includes moving the first drive element off a second side of the threshold, opposite to the first side of the threshold, and moving the third drive element onto the threshold, placing both the second and third drive elements on the threshold. The method includes moving both the second and third drive elements off the second side of the threshold. | 07-19-2012 |
20120185097 | CONTROL COMPUTER AND METHOD OF CONTROLLING ROBOTIC ARM - A computer determines a first origin of a first coordinate system of a PCB, and controls a robotic arm to position a probe above the first origin. Furthermore, the computer determines a second origin of a second coordinate system of the robotic arm, and determines displacement values from the first origin to a test point in controlling movements of the robotic arm in the second coordinate system. A graph representing the test point is recognized in an image of the PCB; pixel value differences between the graph center and the image center are determined and converted into displacement correction values for controlling the movements of the robotic arm and determining 3D coordinates of the test point. The robotic arm is moved along a Z-axis of the second coordinate system to precisely position the probe on the test point of the PCB. | 07-19-2012 |
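The pixel-offset correction step described in the entry above can be sketched as follows. This is an illustrative reading, not the patent's implementation: the offset between the recognized graph center and the image center is scaled by an assumed calibration factor (millimeters per pixel) to yield displacement correction values for the arm. All names and values are hypothetical.

```python
def correction_from_pixels(graph_center, image_center, mm_per_pixel):
    """Convert a pixel offset into X/Y displacement corrections (mm)."""
    dx_px = graph_center[0] - image_center[0]
    dy_px = graph_center[1] - image_center[1]
    return (dx_px * mm_per_pixel, dy_px * mm_per_pixel)

# Example: graph center 10 px right of and 4 px below the image center,
# at an assumed scale of 0.05 mm per pixel.
dx, dy = correction_from_pixels((330, 244), (320, 240), 0.05)
```

In practice the scale factor would come from camera calibration, and the correction would be expressed in the arm's second coordinate system before commanding motion.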
20120191246 | MOBILE TELE-PRESENCE SYSTEM WITH A MICROPHONE SYSTEM - A remote controlled robot system that includes a robot and a remote control station. The robot includes a binaural microphone system that is coupled to a speaker system of the remote control station. The binaural microphone system may include a pair of microphones located at opposite sides of a robot head. The location of the microphones roughly coincides with the location of ears on a human body. Such microphone location creates a mobile robot that more effectively simulates the tele-presence of an operator of the system. The robot may include two different microphone systems and the ability to switch between systems. For example, the robot may also include a zoom camera system and a directional microphone. The directional microphone may be utilized to capture sound from a direction that corresponds to an object zoomed upon by the camera system. | 07-26-2012 |
20120197439 | INTERFACING WITH A MOBILE TELEPRESENCE ROBOT - A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device. | 08-02-2012 |
20120209429 | ROBOT APPARATUS, POSITION DETECTING DEVICE, POSITION DETECTING PROGRAM, AND POSITION DETECTING METHOD - A robot apparatus includes: an image pickup device; a goal-image storing unit that stores, according to sensitivity represented by an amount of change of a pixel value at the time when a target aligned with a goal position on an image at a pixel level is displaced by a displacement amount at a sub-pixel level, goal image data in a state in which the target is arranged; and a target detecting unit that calculates a coincidence evaluation value of the target on the basis of comparison of image data including the target and the goal image data stored by the goal-image storing unit and detects positional deviation of the target with respect to the goal position on the basis of the coincidence evaluation value. | 08-16-2012 |
20120209430 | POSITION DETECTION DEVICE FOR ROBOT, ROBOTIC SYSTEM, AND POSITION DETECTION METHOD FOR ROBOT - A position detection device for a horizontal articulated robot includes a camera for imaging the robot or a workpiece as an imaging object, a control section for calculating a location of the imaging object from an image, an acquisition section (I/O) for obtaining the drive amounts of first and second electric motors of the robot, and a storage section for storing the calculated location of the imaging object and the drive amounts so as to correspond to each other. A common trigger signal for detecting the location of the imaging object is input to the camera and the I/O. The camera starts to image the imaging object in response to the input of the trigger signal. The I/O starts to obtain the drive amounts in response to the input of the trigger signal. | 08-16-2012 |
20120209431 | ROBOTIC BASED HEALTH CARE SYSTEM - A robotic system that can be used to treat a patient. The robotic system includes a mobile robot that has a camera. The mobile robot is controlled by a remote station that has a monitor. A physician can use the remote station to move the mobile robot into view of a patient. An image of the patient is transmitted from the robot camera to the remote station monitor. Medical personnel at the robot site can enter patient information into the system through a user interface. The patient information can be stored in a server. The physician can access the information from the remote station. The remote station may provide graphical user interfaces that display the patient information and provide both a medical tool and a patient management plan. | 08-16-2012 |
20120209432 | HYBRID CONTROL DEVICE - A brain-based device (BBD) for moving in a real-world environment has sensors that provide data about the environment, actuators to move the BBD, and a hybrid controller which includes a neural controller having a simulated nervous system being a model of selected areas of the human brain and a non-neural controller based on a computational algorithmic network. The neural controller and non-neural controller interact with one another to control movement of the BBD. | 08-16-2012 |
20120209433 | SOCIAL ROBOT - Social robot formed by an artificial vision system composed of webcam cameras, a voice recognition system formed by three microphones arranged in a triangular configuration, an expression system composed of an LED matrix, formed by a plurality of LEDs and a status LED, and eyelids formed by half-moons connected to gearwheels which engage with respective servomotors via transmission wheels, a speech synthesis system composed of loudspeakers, a system for detecting obstacles which is formed by ultrasound sensors, and a movement system formed by two driving wheels. | 08-16-2012 |
20120215358 | ROBOTIC ARM SYSTEM - A robotic arm for use with a robotic system and methods for making and using the same are described. The arm can have multiple joints and can have one or more articulating end effectors. The arm and end effectors can have safety releases to prevent over-rotation. The arm can have individual cooling. | 08-23-2012 |
20120221144 | Disruptor Guidance System and Methods Based on Scatter Imaging - A system and method for guiding a disruptor robot in disruption of an explosive device. The system includes a source of penetrating radiation, having a coordinated position on the robot with respect to a disruptor coupled to the robot, and at least one detector for detecting radiation produced by the source and scattered by the explosive device. An analyzer produces an image of the explosive device and facilitates identification of a disruption target of the explosive device. A controller positions the disruptor with respect to the explosive device so that the disruptor is aimed at the disruption target. | 08-30-2012 |
20120221145 | MASTER INPUT DEVICE AND MASTER-SLAVE MANIPULATOR - A master input device operates a slave manipulator which includes joints corresponding to a plurality of degrees of freedom. The device includes an operating unit and detection units of two or more systems. The operating unit is capable of being changed in position and orientation by an operator's operation. The operating unit provides command values for the position and orientation of the slave manipulator as its own position and orientation change. The detection units individually detect different physical quantities related to the operating unit in order to detect the position and orientation of the operating unit. | 08-30-2012 |
20120226382 | ROBOT-POSITION DETECTING DEVICE AND ROBOT SYSTEM - A robot-position detecting device includes: a position-data acquiring unit that acquires position data indicating actual positions of a robot; a position-data input unit that receives the position data output from the position-data acquiring unit; and a position calculating unit that calculates a computational position of the robot through linear interpolation using first and second position data input to the position-data input unit at different times. | 09-06-2012 |
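The linear-interpolation step in the entry above lends itself to a short sketch: given two timestamped position samples, estimate the robot's computational position at an intermediate time. This is an illustrative reading under assumed names, not the patent's implementation.

```python
def interpolate_position(t1, p1, t2, p2, t):
    """Linearly interpolate position (a tuple of coordinates) at time t,
    from samples p1 at time t1 and p2 at time t2, with t1 <= t <= t2."""
    alpha = (t - t1) / (t2 - t1)
    return tuple(a + alpha * (b - a) for a, b in zip(p1, p2))

# Position sampled at t = 0.0 s and t = 0.1 s; estimate at t = 0.05 s.
pos = interpolate_position(0.0, (100.0, 50.0), 0.1, (110.0, 54.0), 0.05)
```

A real position-data input unit would timestamp the samples on arrival; the interpolation itself is the elementary step shown here.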
20120232697 | ROBOT CLEANER AND CONTROLLING METHOD THEREOF - Disclosed are a robot cleaner capable of performing a cleaning operation by selecting a cleaning algorithm suitable for the peripheral circumstances based on an analysis result of captured image information, and a controlling method thereof. The robot cleaner comprises an image sensor unit configured to capture image information when an operation instructing command is received, and a controller configured to analyze the image information captured by the image sensor unit, and configured to control a cleaning operation based on a first cleaning algorithm selected from a plurality of pre-stored cleaning algorithms based on a result of the analysis. | 09-13-2012 |
20120239196 | Natural Human to Robot Remote Control - The subject disclosure is directed towards controlling a robot based upon sensing a user's natural and intuitive movements and expressions. User movements and/or facial expressions are captured by an image and depth camera, resulting in skeletal data and/or image data that is used to control a robot's operation, e.g., in a real time, remote (e.g., over the Internet) telepresence session. Robot components that may be controlled include robot “expressions” (e.g., audiovisual data output by the robot), robot head movements, robot mobility drive operations (e.g., to propel and/or turn the robot), and robot manipulator operations, e.g., an arm-like mechanism and/or hand-like mechanism. | 09-20-2012 |
20120259465 | CLEANING SYSTEM - A cleaning system including a first virtual wall, a second virtual wall and a cleaning robot is disclosed. The first virtual wall includes a first specific pattern. When light strikes the first specific pattern, a first specific reflected light is generated. The second virtual wall includes a second specific pattern. When the light strikes the second specific pattern, a second specific reflected light is generated. The cleaning robot, based on the first and the second specific reflected lights, obtains and records positions of the first and the second virtual walls. The cleaning robot defines a first virtual line according to the recorded positions. A traveling path of the cleaning robot is limited by the first virtual line. | 10-11-2012 |
20120265343 | AUTONOMOUS COVERAGE ROBOT SENSING - An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal. | 10-18-2012 |
20120265344 | ROBOT SYSTEM AND METHOD FOR OPERATING ROBOT SYSTEM - This robot system includes a first imaging portion detachably mounted to a robot arm and a control portion controlling the operation of the robot arm and a grasping portion, and the control portion is so formed as to detach the first imaging portion from the robot arm before moving an object to be grasped that is being grasped by the grasping portion to a prescribed processing position. | 10-18-2012 |
20120265345 | ROBOT SYSTEM AND PROCESSED OBJECT MANUFACTURING METHOD - In this robot system, a control portion is configured to control a robot to grasp an object to be grasped by a grasping portion, and control a first imaging portion to examine the object to be grasped while driving a robot arm to change a posture of the object to be grasped multiple times. | 10-18-2012 |
20120265346 | AUTONOMOUS COVERAGE ROBOT SENSING - An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal. | 10-18-2012 |
20120277913 | Vision System for Robotic Attacher - In certain embodiments, a system includes a controller operable to access a first image generated by a first camera. The controller determines a reference point from at least one main feature of a dairy livestock included in the first image. The controller is further operable to access a second image generated by a second camera. The second image includes at least a portion of an udder of the dairy livestock. The controller determines a location of a teat of the dairy livestock based on the second image. | 11-01-2012 |
20120277914 | Autonomous and Semi-Autonomous Modes for Robotic Capture of Images and Videos - The subject disclosure is directed towards a set of autonomous and semi-autonomous modes for a robot by which the robot captures content (e.g., still images and video) from a location such as a house. The robot may produce a summarized presentation of the content (a “botcast”) that is appropriate for a specific scenario, such as an event, according to a specified style. Modes include an event mode where the robot may interact with and stimulate event participants to provide desired content for capture. A patrol mode operates the robot to move among locations (e.g., different rooms) to capture a panorama (e.g., 360 degrees) of images that can be remotely viewed. | 11-01-2012 |
20120290134 | ESTIMATION OF A POSITION AND ORIENTATION OF A FRAME USED IN CONTROLLING MOVEMENT OF A TOOL - A robotic system includes a camera having an image frame whose position and orientation relative to a fixed frame is determinable through one or more image frame transforms, a tool disposed within a field of view of the camera and having a tool frame whose position and orientation relative to the fixed frame is determinable through one or more tool frame transforms, and at least one processor programmed to identify pose indicating points of the tool from one or more camera captured images, determine an estimated transform for an unknown one of the image and tool frame transforms using the identified pose indicating points and known ones of the image and tool frame transforms, update a master-to-tool transform using the estimated and known ones of the image and tool frame transforms, and command movement of the tool in response to movement of a master using the updated master-to-tool transform. | 11-15-2012 |
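The frame-transform chaining in the entry above — determining a frame's pose relative to a fixed frame through one or more intermediate transforms — can be sketched as a product of 4x4 homogeneous transforms. The matrices here are illustrative assumptions, not the patent's actual frames.

```python
import numpy as np

def compose(*transforms):
    """Chain 4x4 homogeneous transforms left to right:
    compose(T_ab, T_bc) gives the pose of frame c in frame a."""
    T = np.eye(4)
    for t in transforms:
        T = T @ t
    return T

# Hypothetical frames: fixed->camera translates +1.0 m along x;
# camera->tool translates +0.2 m along z.
T_fc = np.eye(4); T_fc[0, 3] = 1.0
T_ct = np.eye(4); T_ct[2, 3] = 0.2
T_ft = compose(T_fc, T_ct)   # tool pose expressed in the fixed frame
```

The estimated transform in the patent would replace whichever link in this chain is unknown, with the rest supplied by kinematics or calibration.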
20120296473 | ROBOT ARM AND DETECTING DEVICE HAVING SAME - A robot arm includes a support arm, an adjusting rod and a detecting unit. The adjusting rod rotatably extends through the support arm. The detecting unit is attached to the adjusting rod. The detecting unit includes an image capture device and a probe device. The image capture device captures images of a workpiece. The probe device includes a driving device and a probe. The driving device may drive the probe between a first position, where the probe does not block images of the workpiece from being captured by the image capture device, and a second position, where the probe does block such images. | 11-22-2012 |
20120296474 | ROBOT SYSTEM - A robot system according to an embodiment includes a robot, a switching determination unit and a rearrangement instruction unit. The switching determination unit performs determination of switching between the operation of transferring the workpiece and the operation of rearranging the workpiece based on the state of transferring the workpiece by the robot. The rearrangement instruction unit instructs the robot to rearrange the workpiece. | 11-22-2012 |
20120303160 | COMPANION ROBOT FOR PERSONAL INTERACTION - A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident. | 11-29-2012 |
20120323365 | DOCKING PROCESS FOR RECHARGING AN AUTONOMOUS MOBILE DEVICE - Described herein are technologies pertaining to autonomously docking a mobile robot at a docking station for purposes of recharging batteries of the mobile robot. The mobile robot uses vision-based navigation and a known map of the environment to navigate toward the docking station. Once sufficiently proximate to the docking station, the mobile robot captures infrared images of the docking station, and granularly aligns itself with the docking station based upon the captured infrared images of the docking station. As the robot continues to drive towards the docking station, the robot monitors infrared sensors for infrared beams emitted from the docking station. If the infrared sensors receive the infrared beams, the robot continues to drive forward until the robot successfully docks with the docking station. | 12-20-2012 |
20120323366 | MANIPULATOR WITH CAMERA - Provided is a manipulator with at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided to a link adjacent to a link located at a manipulator tip end. At least one camera for recognizing a workpiece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling a rotation angle of the rotating portion is driven according to a rotation angle of the link located at the manipulator tip end, and thus the camera is arranged in a direction perpendicular to the plane where the end effector can move when the end effector performs a gripping operation. In an assembly operation, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane where the end effector can move. | 12-20-2012 |
20130006423 | TARGET OBJECT GRIPPING APPARATUS, METHOD FOR CONTROLLING THE SAME AND STORAGE MEDIUM - A target object gripping apparatus comprises: an estimation unit configured to estimate an orientation of a target object based on orientation estimation parameters; a gripping unit configured to grip the target object based on the orientation of the target object estimated by the estimation unit; a detection unit configured to detect a failure of gripping by the gripping unit; and a modifying unit configured to modify the orientation estimation parameters based on the orientation of the target object when the detection unit detects a gripping failure. | 01-03-2013 |
20130006424 | METHOD AND APPARATUS FOR TISSUE TRANSFER - A handheld tool is disclosed which may be used to transfer a plurality of plant tissue explants from a first container to a second container. The handheld tool may include a disposable tip member which couples the plurality of plant tissue explants through use of negative pressure. An automated system which transfers a plurality of plant tissue explants from a first container to a second container is also disclosed. The automated system may include a first presentment system which moves the first container to a region, a second presentment system which moves the second container to the region, and a robot system that transfers the plurality of plant tissue explants from the first container to the second container. | 01-03-2013 |
20130013112 | Constrained Resolved Acceleration Control - A system, method, and computer program product for controlling an articulated system are described. The system estimates kinematics of body segments of the articulated system and constructs a weighted pseudo-inverse matrix to enforce kinematic constraints as well as achieve dynamic consistency based on the estimated kinematics. The system converts task descriptors to joint commands using the weighted pseudo-inverse matrix and controls the articulated system at both the velocity level and the acceleration level and enforces kinematic constraints using the joint commands. | 01-10-2013 |
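The weighted pseudo-inverse named in the entry above has a standard closed form that can be sketched briefly. The weighting matrix W and the Jacobian J below are illustrative assumptions; the patent's constraint enforcement and dynamic-consistency weighting are more involved than this bare mapping from a task-space descriptor to a joint command.

```python
import numpy as np

def weighted_pinv(J, W):
    """Weighted pseudo-inverse: J#_W = W^-1 J^T (J W^-1 J^T)^-1,
    valid when J has full row rank and W is positive definite."""
    W_inv = np.linalg.inv(W)
    return W_inv @ J.T @ np.linalg.inv(J @ W_inv @ J.T)

# Hypothetical 2-dimensional task, 3 joints; a heavier weight on the
# second joint discourages its use relative to the others.
J = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 1.0]])
W = np.diag([1.0, 4.0, 1.0])
dx = np.array([0.1, -0.2])        # task-space descriptor (velocity)
dq = weighted_pinv(J, W) @ dx     # joint command
```

Any dq produced this way still reproduces the task exactly (J @ dq equals dx); the weights only shape how the redundancy among joints is resolved.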
20130024025 | Autonomous Robot and A Positioning Method Thereof - An autonomous robot and a positioning method thereof are disclosed. The autonomous robot includes an environment information detection device, a map construction module, a setting module, a path planning module, and a driving module. The environment information detection device is for detecting environment information about an environment where the autonomous robot is situated. The map construction module constructs an environment map based on the environment information detected by the environment information detection device. The setting module is used for setting a working boundary on the environment map. The path planning module is for planning a moving path in a working zone and is electrically connected to the setting module. The driving module for driving the autonomous robot to move along the moving path is electrically connected to the path planning module. | 01-24-2013 |
20130030570 | ROBOT DEVICE, METHOD OF CONTROLLING THE SAME, COMPUTER PROGRAM, AND ROBOT SYSTEM - Provided is a robot device including an image input unit for inputting an image of surroundings, a target object detection unit for detecting an object from the input image, an object position detection unit for detecting a position of the object, an environment information acquisition unit for acquiring surrounding environment information of the position of the object, an optimum posture acquisition unit for acquiring an optimum posture corresponding to the surrounding environment information for the object, an object posture detection unit for detecting a current posture of the object from the input image, an object posture comparison unit for comparing the current posture of the object to the optimum posture of the object, and an object posture correction unit for correcting the posture of the object when the object posture comparison unit determines that there is a predetermined difference or more between the current posture and the optimum posture. | 01-31-2013 |
20130030571 | ROBOTIZED SURGERY SYSTEM WITH IMPROVED CONTROL - A robotized surgery system. | 01-31-2013 |
20130041508 | SYSTEMS AND METHODS FOR OPERATING ROBOTS USING VISUAL SERVOING - A system and method for providing intuitive, visual based remote control is disclosed. The system can comprise one or more cameras disposed on a remote vehicle. A visual servoing algorithm can be used to interpret the images from the one or more cameras to enable the user to provide visual based inputs. The visual servoing algorithm can then translate that commanded motion into the desired motion at the vehicle level. The system can provide correct output regardless of the relative position between the user and the vehicle and does not require any previous knowledge of the target location or vehicle kinematics. | 02-14-2013 |
20130054028 | SYSTEM AND METHOD FOR CONTROLLING ROBOT - In a method for controlling a robot using a computing device, 3D images of an operator are captured in real-time. Different portions of the operator are determined in one of the 3D images according to moveable joints of the robot, and each of the determined portions is correlated with one of the moveable joints. Motion data of each of the determined portions is obtained from the 3D images. A control command is sent to the robot according to the motion data of each of the determined portions, to control each moveable joint of the robot to implement a motion of a determined portion that is correlated with the moveable joint. | 02-28-2013 |
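The correlation step in the entry above — pairing each determined portion of the operator with one movable robot joint and turning its motion data into a control command — can be sketched minimally. The joint names and the degrees-as-motion-data convention are assumptions for illustration only.

```python
# Hypothetical mapping from tracked body portions to robot joints.
JOINT_MAP = {
    "right_shoulder": "arm_joint_1",
    "right_elbow": "arm_joint_2",
    "head": "neck_joint",
}

def make_command(motion_data):
    """motion_data: {body portion: angle in degrees} -> {robot joint: angle}.
    Portions with no correlated joint are simply ignored."""
    return {JOINT_MAP[part]: angle
            for part, angle in motion_data.items() if part in JOINT_MAP}

cmd = make_command({"right_elbow": 45.0, "head": -10.0, "left_knee": 5.0})
```

In the described system such a command would be regenerated from each new 3D image, so the robot's joints continuously track the correlated body portions.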
20130054029 | AUTO-REACH METHOD FOR A REMOTE VEHICLE - The present teachings provide a method of controlling a remote vehicle having an end effector and an image sensing device. The method includes obtaining an image of an object with the image sensing device, determining a ray from a focal point of the image to the object based on the obtained image, positioning the end effector of the remote vehicle to align with the determined ray, and moving the end effector along the determined ray to approach the object. | 02-28-2013 |
20130054030 | OBJECT GRIPPING APPARATUS, OBJECT GRIPPING METHOD, AND OBJECT GRIPPING PROGRAM - In an object gripping apparatus according to the present invention, based on three-dimensional position and attitude of a gripping object and a gripping position that is preliminarily set for each gripping object, an operation of a grip part is controlled such that the grip part grips the gripping position on the gripping object. Thereby, an intended gripping position can be identified, and the object can be appropriately gripped. | 02-28-2013 |
20130066469 | MOBILE VIDEOCONFERENCING ROBOT SYSTEM WITH NETWORK ADAPTIVE DRIVING - A remote control station that controls a robot through a network. The remote control station transmits a robot control command that includes information to move the robot. The remote control station monitors at least one network parameter and scales the robot control command as a function of the network parameter. For example, the remote control station can monitor network latency and scale the robot control command to slow down the robot with an increase in the latency of the network. Such an approach can reduce the amount of overshoot or overcorrection by a user driving the robot. | 03-14-2013 |
20130073087 | SYSTEM FOR CONTROLLING ROBOTIC CHARACTERS TO ENHANCE PHOTOGRAPHIC RESULTS - A method for controlling a robotic apparatus to produce desirable photographic results. The method includes, with a motor controller, first operating a robotics assembly to animate the robotic apparatus and, then, detecting an upcoming image capture. The method further includes, with the motor controller in response to the detecting of the upcoming image capture, second operating the robotics assembly to pose the robotic apparatus for the upcoming image capture. In some embodiments, the detecting includes a sensor mounted on the robotic apparatus sensing a pre-flash of light from a red-eye effect reduction mechanism of a camera. In other cases, the detecting includes a sensor mounted on the robotics apparatus sensing a range finder signal from a range finder of a camera. The posing may include opening eyes, moving a mouth into a smile, or otherwise striking a pose that is held temporarily to facilitate image capture with a camera. | 03-21-2013 |
20130073088 | MOBILE ROBOT AND CONTROLLING METHOD OF THE SAME - In a mobile robot and a controlling method of the same, the mobile robot is able to recognize a precise position thereof by detecting a plurality of images through an image detection unit, extracting one or more feature points from the plurality of images, and comparing and matching information related to the feature points. The mobile robot is also able to easily detect a position of a charging station based on image information, and quickly move to the charging station upon the lack of residual battery capacity. The mobile robot is also able to detect a position of the charging station based on the image information and receive a guideline signal within a signal reception range, so as to easily dock with the charging station. | 03-21-2013 |
20130073089 | ROBOT SYSTEM AND IMAGING METHOD - A robot system includes: an imaging unit including an imaging device and a distance measuring part; and a robot to which the imaging unit is attached. The imaging device preliminarily images a workpiece. The robot preliminarily moves the imaging unit based on the result of the preliminary imaging. The distance measuring part measures the distance to the workpiece. The robot actually moves the imaging unit based on the result of the measurement. The imaging device actually images the workpiece. | 03-21-2013 |
20130073090 | ROBOT SYSTEM - A robot system includes: a projecting unit for projecting a slit light on a specified placement region and moving the slit light in a specified direction; an imaging unit for imaging the slit light moving on a workpiece on the placement region; an estimated projection region determining unit for determining an estimated projection region such that the length of the estimated projection region in a direction substantially parallel to the moving direction grows larger toward the center of the image in the intersection direction; and a projection position detecting unit for detecting a projection position of the slit light within the estimated projection region. The robot system further includes a robot for gripping the workpiece. | 03-21-2013 |
20130073091 | ROBOT CONTROL APPARATUS AND ROBOT SYSTEM - A robot control apparatus, which controls motions of an industrial robot based on processing results of an image processing apparatus which images the robot or objects around the robot, includes: a first communication unit which communicates with a computer for development as an external computer; a second communication unit which is connected to the image processing apparatus via a network; and a command processing unit which opens a communication port of the second communication unit and causes the second communication unit to start communication with the image processing apparatus via a server on the network in response to an open command received by the first communication unit. | 03-21-2013 |
20130085605 | ROBOT SYSTEM AND METHOD FOR PRODUCING A TO-BE-PROCESSED MATERIAL - A robot system includes a container, a disposed-state detector, and a robot arm. The container is configured to accommodate a plurality of to-be-held objects and includes a reticulated portion. The disposed-state detector is configured to detect disposed states of the plurality of respective to-be-held objects disposed in the container. The robot arm includes a holder configured to hold a to-be-held object among the plurality of to-be-held objects based on the disposed states of the plurality of respective to-be-held objects detected by the disposed-state detector. | 04-04-2013 |
20130116825 | ROBOT CLEANER AND CONTROLLING METHOD OF THE SAME - Disclosed are a robot cleaner and a method for controlling the same. A plurality of images are detected through an image detecting unit such as an upper camera, and two or more feature points are extracted from the plurality of images. Then, a feature point set consisting of the feature points is created, and the feature points included in the feature point set are matched with each other. This may allow the robot cleaner to precisely recognize a position thereof. Furthermore, this may allow the robot cleaner to perform a cleaning operation or a running operation by interworking a precisely recognized position with a map. | 05-09-2013 |
20130116826 | ROBOT CLEANER AND CONTROLLING METHOD OF THE SAME - Disclosed are a robot cleaner and a method for controlling the same. The robot cleaner is capable of recognizing a position thereof by extracting one or more feature points having 2D coordinates information with respect to each of a plurality of images, by matching the feature points with each other, and then by creating a matching point having 3D coordinates information. Matching points having 3D coordinates information are created to recognize a position of the robot cleaner, and the recognized position is verified based on a moving distance measured by using a sensor. This may allow a position of the robot cleaner to be precisely recognized, and allow the robot cleaner to perform a cleaning operation or a running operation by interworking the precisely recognized position with a map. | 05-09-2013 |
20130123985 | TRANSPARENT OBJECT DETECTION SYSTEM AND TRANSPARENT FLAT PLATE DETECTION SYSTEM - A disclosed transparent body detection system includes an image acquisition unit acquiring a vertical polarization image and a horizontal polarization image by acquiring an image of a first region, the image including a transparent body having characteristics in which a polarization direction of transmission light changes; a placing table on which the transparent body is to be placed; a polarization filter disposed opposite to the image acquisition unit across the placing table and at a position including a second region, an image of the second region including at least the transparent body in the first region and being acquired; and an image processing apparatus detecting the transparent body based on distribution of vertical/lateral polarization degree of a vertical/lateral polarization degree image based on the vertical polarization image and the horizontal polarization image. | 05-16-2013 |
20130123986 | METHOD AND APPARATUS FOR TISSUE TRANSFER - A handheld tool is disclosed which may be used to transfer a plurality of plant tissue explants from a first container to a second container. The handheld tool may include a disposable tip member which couples the plurality of plant tissue explants through use of negative pressure. An automated system which transfers a plurality of plant tissue explants from a first container to a second container is also disclosed. The automated system may include a first presentment system which moves the first container to a region, a second presentment system which moves the second container to the region, and a robot system that transfers the plurality of plant tissue explants from the first container to the second container. | 05-16-2013 |
20130123987 | ROBOTIC SYSTEM, ROBOT CONTROL METHOD AND ROBOT CONTROL PROGRAM - A robotic system includes: a detection unit that detects at least one of a voice, light and an image of a content outputted by a content output device; a decision unit that assesses information detected by the detection unit on the basis of reference data so as to assess the content outputted by the content output device; and a control unit that controls a behavior or a state of the robotic system on the basis of the assessment made by the decision unit. | 05-16-2013 |
20130131866 | Protocol for a Remotely Controlled Videoconferencing Robot - A robotic system that includes a robot and a remote station. The remote station can generate control commands that are transmitted to the robot through a broadband network. The control commands can be interpreted by the robot to induce action such as robot movement or focusing a robot camera. The robot can generate reporting commands that are transmitted to the remote station through the broadband network. The reporting commands can provide positional feedback or system reports on the robot. | 05-23-2013 |
20130144438 | OPTICAL POSITION DETECTING DEVICE, ROBOT HAND, AND ROBOT ARM - An optical position detecting device includes a plurality of light source sections which emits detection light, a light detection section which receives the detection light reflected by a target object located in an emitting space of the detection light, a light source driving section which turns on some light source sections among the plurality of light source sections in a first period and turns on, in a second period, light source sections different from the light source sections turned on in the first period, and a position detecting section which detects the position of the target object on the basis of a light-receiving result of the light detection section in the first period and the second period. Each of the light source sections includes a plurality of light-emitting elements arrayed in a direction intersecting the direction of the optical axis of the detection light. | 06-06-2013 |
20130158709 | ROBOT CONTROL DURING AN E-STOP EVENT - A system for a work cell having a carrier that moves a product along an assembly line includes an assembly robot, sensor, and controller. An arm of the robot moves on the platform adjacent to the carrier. The sensor measures a changing position of the carrier and encodes the changing position as a position signal. The controller receives the position signal and calculates a lag value of the robot with respect to the carrier using the position signal. The controller detects a requested e-stop of the carrier when the arm and product are in mutual contact, and selectively transmits a speed signal to the robot to cause a calibrated deceleration of the platform before executing the e-stop event. This occurs only when the calculated tracking position lag value is above a calibrated threshold. A method is also disclosed for using the above system in the work cell. | 06-20-2013 |
20130158710 | TAKING OUT DEVICE HAVING FUNCTION FOR CORRECTING POSTURE OF AN ARTICLE - A taking out device capable of correcting a posture of an article to be taken out and taking out the article, while considering interference between a robot hand and a container for containing the article. Since the article is inclined to the left side, the hand approaches and contacts the article from the left side. Then, the hand pushes to the right side while claws of the hand engage a hole portion of the article in order to correct the posture of the article such that the positional relationship between the article and the hand represents a reference position/posture. In this way, the hand is positioned at a second position/posture in which the posture of the article relative to the claws allows the article to be taken out. | 06-20-2013 |
20130158711 | ACOUSTIC PROXIMITY SENSING - An acoustic pretouch sensor or proximity sensor. | 06-20-2013 |
20130166070 | OBTAINING FORCE INFORMATION IN A MINIMALLY INVASIVE SURGICAL PROCEDURE - Methods of and a system for providing force information for a robotic surgical system. The method includes storing first kinematic position information and first actual position information for a first position of an end effector; moving the end effector via the robotic surgical system from the first position to a second position; storing second kinematic position information and second actual position information for the second position; and providing force information regarding force applied to the end effector at the second position utilizing the first actual position information, the second actual position information, the first kinematic position information, and the second kinematic position information. Visual force feedback is also provided via superimposing an estimated position of an end effector without force over an image of the actual position of the end effector. Similarly, tissue elasticity visual displays may be shown. | 06-27-2013 |
20130197696 | ROBOT APPARATUS, ASSEMBLING METHOD, AND RECORDING MEDIUM - A robot apparatus includes a gripping unit configured to grip a first component, a force sensor configured to detect, as detection values, a force and a moment acting on the gripping unit, a storing unit having stored therein contact states of the first component and a second component and transition information in association with each other, a selecting unit configured to discriminate, on the basis of the detection values, a contact state of the first component and the second component and select, on the basis of a result of the discrimination, the transition state stored in the storing unit, and a control unit configured to control the gripping unit on the basis of the transition information selected by the selecting unit. | 08-01-2013 |
20130204436 | APPARATUS FOR CONTROLLING ROBOT AND CONTROL METHOD THEREOF - An apparatus for controlling a robot capable of controlling the motion of the arm of the robot, and a control method thereof, the apparatus including an image obtaining unit configured to obtain a three-dimensional image of a user, a driving unit configured to drive an arm of the robot that is composed of a plurality of segments, and a control unit configured to generate a user model that corresponds to a motion of the joint of the user based on the three-dimensional image, to generate a target model having a length of the segment that varies based on the user model, and to allow the arm of the robot to be driven based on the target model. | 08-08-2013 |
20130204437 | AGRICULTURAL ROBOT SYSTEM AND METHOD - An agricultural robot system and method for harvesting, pruning, culling, weeding, measuring and managing agricultural crops. The system uses autonomous and semi-autonomous robots with machine vision: cameras identify and locate the fruit on each tree, the points on a vine to prune, etc., and may also be utilized in measuring agricultural parameters or aiding in managing agricultural resources. The cameras may be coupled with an arm or other implement to allow views from inside the plant when performing the desired agricultural function. A robot moves through a field first to “map” the plant locations, number and size of fruit and approximate positions of fruit, or to map the cordons and canes of grape vines. Once the map is complete, a robot or server can create an action plan that a robot may implement. An action plan may comprise operations and data specifying the agricultural function to perform. | 08-08-2013 |
20130211594 | Proxy Robots and Remote Environment Simulator for Their Human Handlers - A system for controlling a human-controlled proxy robot surrogate is presented. The system includes a plurality of motion capture sensors for monitoring and capturing movements of a human handler, including each change in joint angle, body posture or position; the motion capture sensors are similar in operation to sensors utilized in motion picture animation, suitably modified to track critical handler movements in near real time. A plurality of controls attached to the proxy robot surrogate is also presented that relays the monitored and captured movements of the human handler as “follow me” data to the proxy robot surrogate, the plurality of controls being configured such that the proxy robot surrogate emulates the movements of the human handler. | 08-15-2013 |
20130218341 | CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method of a cleaning robot with a non-omnidirectional light detector. The method includes the steps of: detecting a light beam via the non-omnidirectional light detector; stopping the cleaning robot and spinning the non-omnidirectional light detector when the non-omnidirectional light detector detects the light beam; stopping the spinning of the non-omnidirectional light detector and estimating a first spin angle when the non-omnidirectional light detector does not detect the light beam; and adjusting a moving direction of the cleaning robot according to the first spin angle. | 08-22-2013 |
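A minimal sketch of the angle-based correction idea in the entry above: the detector sees the beam at some heading, spins until the beam is lost, and the beam centre is then taken to lie roughly halfway through the swept arc. The half-angle rule, the function name, and the degree units are all illustrative assumptions, not details taken from the patent.

```python
def heading_adjustment(first_spin_angle_deg):
    """Estimate how far to turn the robot: assume the light beam's centre
    lies halfway through the arc swept while the beam remained visible."""
    return first_spin_angle_deg / 2.0

# If the detector swept 30 degrees before losing the beam, steer 15 degrees.
print(heading_adjustment(30.0))
```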
20130218342 | CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method of a cleaning robot. The method includes the steps of: forming a cleaning area according to at least three points which are selected from a light generating device, a charging station or an obstacle; moving the cleaning robot along an outer boundary of the cleaning area from a first position; recording a first cleaning route when the cleaning robot returns back to the first position; moving the cleaning robot to a second position and planning a second cleaning route according to the first cleaning route; and moving the cleaning robot along the second cleaning route. | 08-22-2013 |
20130218343 | CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method for a cleaning robot with a non-omnidirectional light detector and a directional light detector. The method includes: rotating the non-omnidirectional light detector when the non-omnidirectional light detector detects a light beam; stopping the spinning of the non-omnidirectional light detector and estimating a rotation angle when the non-omnidirectional light detector does not detect the light beam; determining a rotation direction according to the rotation angle; rotating the cleaning robot according to the rotation direction; and stopping the rotation of the cleaning robot when the directional light detector detects the light beam. | 08-22-2013 |
20130218344 | CONTROL METHOD FOR CLEANING ROBOTS - An embodiment of the invention provides a control method of a cleaning robot. The method includes steps of moving the cleaning robot according to a first direction; keeping moving the cleaning robot according to the first direction when a light detector of the cleaning robot detects a light beam; moving the cleaning robot for a predetermined distance and then stopping the cleaning robot when the light detector does not detect the light beam; and moving the cleaning robot in a second direction. | 08-22-2013 |
20130231779 | Mobile Inspection Robot - A mobile inspection robot that includes a robot body and a drive system supporting the robot body and configured to maneuver the robot over a work surface. A controller communicates with the drive system and a sensor system. The controller executes a control system that includes a control arbitration system and a behavior system in communication with each other. The behavior system executes an inspection behavior, the inspection behavior influencing execution of commands by the control arbitration system based on sensor signals received from the sensor system to identify and inspect electrical equipment. | 09-05-2013 |
20130238130 | PATH RECORDING AND NAVIGATION - The instant application discloses, among other things, path recording and automatic navigation that may be applicable to various applications, including, but not limited to, lawn mowing. | 09-12-2013 |
20130238131 | ROBOT APPARATUS, METHOD FOR CONTROLLING THE SAME, AND COMPUTER PROGRAM - A robot apparatus includes an output unit that displays an image including an object on a screen, an input unit that receives an operation performed by a user for specifying information relating to an approximate range including the object in the image, an object extraction unit that extracts information regarding a two-dimensional contour of the object on the basis of the specification received by the input unit, and a position and attitude estimation unit that estimates information regarding a three-dimensional position and attitude of the object on the basis of the information regarding the two-dimensional contour. | 09-12-2013 |
20130245827 | METHOD AND APPARATUS FOR REMOTE MONITORING - A monitoring control method in an electronic device is provided. The method includes displaying an area map for monitoring, and determining a virtual boundary on the area map. The area map represents a specific indoor or outdoor space and displays objects or articles that are within that space. The virtual boundary delimits a movement area within which a monitoring target is to remain. | 09-19-2013 |
20130245828 | MODEL GENERATION APPARATUS, INFORMATION PROCESSING APPARATUS, MODEL GENERATION METHOD, AND INFORMATION PROCESSING METHOD - A three-dimensional shape model of a target object is input. A position and orientation of at least one image sensing device used to capture an image of the target object is set so as to virtually set a relative position and orientation between the target object and the image sensing device for the three-dimensional shape model of the target object. At least one position and orientation of the image sensing device is selected. Geometric features are grouped based on a relationship between an image to be obtained at the selected position and orientation of the image sensing device and the geometric features of the three-dimensional shape model. | 09-19-2013 |
20130268118 | Operating A Mobile Robot - A robot system that includes an operator control unit, mission robot, and a repeater. The operator control unit has a display. The robot includes a robot body, a drive system supporting the robot body and configured to maneuver the robot over a work surface, and a controller in communication with the drive system and the operator control unit. The repeater receives a communication signal between the operator control unit and the robot and retransmits the signal. | 10-10-2013 |
20130274923 | Active Alignment Using Continuous Motion Sweeps and Temporal Interpolation - Methods and apparatus for actively aligning a first optical element, such as a lens, to a second optical element, such as an image sensor, use continuous scans, even absent a synchronization signal from one of the optical elements. During a scan, timed position information about the scanned optical element is collected, and then a relationship between position of the scanned optical element and time is estimated, such as by fitting a curve to a set of position-time pairs. This relationship can then be used to estimate locations of the scanned optical element at times when image data or other alignment quality-indicating data samples are acquired. From this alignment quality versus location data, an optimum alignment position can be determined, and the scanned optical element can then be positioned at the determined alignment position. | 10-17-2013 |
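The temporal-interpolation step described in the entry above — estimating where a continuously moving optical element was at the instants when alignment-quality samples were captured — can be sketched as an ordinary least-squares line fit through timed position samples. This is a generic sketch: the patented method may fit other curve families, and all names here are invented for illustration.

```python
def fit_linear(times, positions):
    """Least-squares line p(t) = a*t + b through (time, position) samples."""
    n = len(times)
    st, sp = sum(times), sum(positions)
    stt = sum(t * t for t in times)
    stp = sum(t * p for t, p in zip(times, positions))
    a = (n * stp - st * sp) / (n * stt - st * st)
    b = (sp - a * st) / n
    return a, b

# A scan moving at 2 units/s from position 0: interpolate its position at t = 1.5,
# the instant a quality-indicating image sample was (hypothetically) acquired.
a, b = fit_linear([0.0, 1.0, 2.0], [0.0, 2.0, 4.0])
print(a * 1.5 + b)
```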
20130274924 | METHOD, MEDIUM AND APPARATUS CLASSIFYING AND COLLECTING AREA FEATURE INFORMATION ACCORDING TO A ROBOT'S MOVING PATH, AND A ROBOT CONTROLLED BY THE AREA FEATURES - A method of classifying and collecting feature information of an area according to a robot's moving path, a robot controlled by area features, and a method and apparatus for composing a user interface using the area features are disclosed. The robot includes a plurality of sensor modules to collect feature information of a predetermined area along a moving path of the robot, and an analyzer to analyze the collected feature information of the predetermined area according to a predetermined reference range and to classify the collected feature information into a plurality of groups. | 10-17-2013 |
20130289769 | WIND TURBINE ASSEMBLY AND MANAGEMENT ROBOT AND WIND TURBINE SYSTEM COMPRISING THE SAME - There is provided a wind turbine assembly and management robot that performs wind turbine assembly and management by itself. A wind turbine assembly and management robot according to exemplary embodiments of the present invention comprises: a recognizing unit configured to recognize a 3D space in a wind turbine and acquire and transmit image information; a working unit configured to bolt tower section flange coupling portions; an operating unit configured to move bolts, nuts, and the working unit to the flange coupling portions and perform bolting; a moving unit configured to horizontally move along the flange coupling portions or configured to move by using a ladder or an elevator in the wind turbine; a control unit configured to control any one or more of the recognizing unit, the working unit, the operating unit, and the moving unit; and a communication unit configured to communicate with a remote control system. | 10-31-2013 |
20130317649 | Nodding Mechanism For A Single-Scan Sensor - A single-scanning system for a robot can include a base and a nodding mechanism that is pivotably attached to the base. A single-scan sensor such as a lidar or laser sensor can be fixed to the nodding mechanism, and a controller can be connected to the single-scan sensor and to a motor via a gear arrangement. The controller can manipulate nodding characteristics such as nodding range and nodding angular velocity dynamically, during operation of the robot, either in response to a command from a remote user or robot autonomy system, or according to instructions stored in the controller. An encoder disk can interconnect the nodding mechanism to the controller. With this configuration, the encoder disk can receive sensor data from the single-scan sensor for further transmission to the controller. A transceiver can route sensor data to the remote user, and can also receive commands from the remote user. | 11-28-2013 |
20130317650 | ROBOTIC DEVICE TESTER - A system, method, and device may include software and hardware which simplify and quicken configuration of the system for testing a device, enhance testing procedures which may be performed, and provide data via which to easily discern a cause and nature of an error which may result during testing. A camera may capture still images of a display screen of a tested device and another camera may capture video images of the tested device and a partner device. A wizard may be used to generate a configuration file based on one previously generated for a similar device. A mount for a tested device may be structured so that: it is suitable for mounting thereon a plurality of differently structured devices; and adjustments in a vertical direction and a horizontal direction in a plane and adjustments of an angle of the device relative to the plane may be easily made. | 11-28-2013 |
20130325181 | NON-CONTACT OPTICAL DISTANCE AND TACTILE SENSING SYSTEM AND METHOD - The systems and methods are directed to mechanical arms and manipulators, and more particularly, to optical distance sensors used for approach, grasping and manipulation. The system may include a manipulator having an arm and a multi-fingered end-effector coupled to the distal end of the arm. The end-effector may include an optical proximity sensor configured to detect the distance to an object prior to contact with the object. The end-effector may include an optical proximity sensor configured to detect a measurement of force applied to the object by the manipulator after contact with the object. The measurement of force may be a range of force measurements including a minimum, a maximum and a measurement between the minimum and the maximum. | 12-05-2013 |
20130331990 | OBSTACLE SENSING MODULE AND CLEANING ROBOT INCLUDING THE SAME - Disclosed herein are an obstacle sensing module and a cleaning robot including the same. The cleaning robot includes a body, a driver to drive the body, an obstacle sensing module to sense an obstacle present around the body, and a control unit to control the driver based on sensed results of the obstacle sensing module. The obstacle sensing module includes at least one light emitter including a light source and a wide-angle lens to refract or reflect light from the light source so as to diffuse the incident light in the form of planar light, and a light receiver including a reflection mirror to reflect the light returned by the obstacle, an optical lens spaced from the reflection mirror by a predetermined distance to allow the reflected light to pass through, an image sensor, and an image processing circuit. | 12-12-2013 |
20130331991 | SELF-PROPELLED ROBOTIC HAND - A self-propelled robotic hand includes: a base capable of self-propulsion; an arm attached to the base; a hand attached to the arm for grasping an object; and a base securing unit attached to the base and configured to secure the base in place by electrostatic adhesion to a surface of a structure external to the self-propelled robotic hand. | 12-12-2013 |
20130338831 | ROBOT CLEANER AND CONTROLLING METHOD OF THE SAME - A robot cleaner is provided. The robot cleaner may precisely detect a peripheral obstacle using a particular optical pattern. An asymmetric cross-shaped optical pattern may be irradiated, and a pattern image of the optical pattern-irradiated region may be analyzed to determine whether or not an obstacle is in the moving path, and the width or height of the obstacle. Further, the robot cleaner may perform operations such as a forward motion, a backward motion, a stopping motion and a detour motion, based on the obstacle detection result. | 12-19-2013 |
20130345872 | USER INTERFACES FOR ROBOT TRAINING - In accordance with various embodiments, a user interface embedded into a robot facilitates robot training via direct and intuitive physical interactions. | 12-26-2013 |
20130345873 | TRAINING AND OPERATING INDUSTRIAL ROBOTS - Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer. | 12-26-2013 |
20130345874 | TRAINING AND OPERATING INDUSTRIAL ROBOTS - Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer. | 12-26-2013 |
20130345875 | TRAINING AND OPERATING INDUSTRIAL ROBOTS - Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer. | 12-26-2013 |
20130345876 | SUSPENDED ROBOT SYSTEMS AND METHODS FOR USING SAME - Robotic systems and methods are provided for tending, manipulating, engaging, acting upon, observing and/or monitoring objects and conditions in a defined volume or space (“work space”) in or overlying a target area. The robotic system includes a mobile robot supported by or suspended from suspension cables secured to spaced apart anchor locations. | 12-26-2013 |
20140005832 | TERMINAL POSITIONING METHOD AND SYSTEM, AND MOBILE TERMINAL | 01-02-2014 |
20140031985 | APPARATUS AND METHOD OF TAKING OUT BULK STORED ARTICLES BY ROBOT - An article take-out apparatus comprising: a robot having a hand capable of holding an article; a 3D measuring device measuring surface positions of a plurality of articles stored in bulk in a 3D space to acquire a 3D point set composed of a plurality of 3D points; a local maximum point selecting unit selecting, from the 3D point set, a 3D point whose coordinate value with respect to a predetermined coordinate axis is maximum; a processing unit determining a hand position and posture, including a target position and target posture of the hand, enabling an article near the local maximum point to be taken out, based on the selected local maximum point; and a robot control unit controlling the robot so as to move the hand to the determined hand position and posture and take out the article from the hand position and posture. | 01-30-2014 |
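The "local maximum point" selection in the entry above reduces to picking, from the measured 3D point set, the point whose coordinate along a chosen axis is largest — for example, the topmost point of the bulk along the vertical axis. A minimal sketch, with the axis convention and all names assumed for illustration:

```python
def select_local_maximum_point(points, axis=2):
    """Return the 3D point with the largest coordinate on the given axis
    (axis=2 assumes z measures height of the bulk-stored articles)."""
    if not points:
        raise ValueError("empty 3D point set")
    return max(points, key=lambda p: p[axis])

# Three measured surface points; the second sits highest in the pile.
scan = [(0.10, 0.20, 0.30), (0.40, 0.10, 0.55), (0.30, 0.30, 0.42)]
print(select_local_maximum_point(scan))
```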
20140039679 | APPARATUS FOR TAKING OUT BULK STORED ARTICLES BY ROBOT - An article take-out apparatus including: acquiring a reference container image including an open end face of a container by an imaging operation of a camera; setting an image search region corresponding to a storage space of the container based on the reference container image; setting a reference plane including the open end face of the container; calculating a search region corresponding to the image search region based on calibration data of the camera stored in advance; converting the search region to a converted search region; taking out 3D points included in the converted search region by projecting a plurality of 3D points measured by a 3D measuring device on the reference plane; and recognizing positions of articles inside the container using the 3D points. | 02-06-2014 |
20140039680 | Companion Robot For Personal Interaction - A mobile robot that includes a robot body, a drive system having one or more wheels supporting the robot body to maneuver the robot across a floor surface, and a riser having a proximal end and a distal end. The proximal end of the riser is disposed on the robot body. The robot also includes a head disposed on the distal end of the riser. The head includes a display and a camera disposed adjacent the display. | 02-06-2014 |
20140046486 | ROBOT DEVICE - Provided is a small-sized robot device having high versatility without decreased work efficiency. The robot device includes: an arm; a hand or a tweezer tool that includes a stereo camera for measuring a three-dimensional position of a workpiece, the hand or the tweezer tool performing work on the workpiece whose three-dimensional position has been measured by the stereo camera; and a connection portion or a hand provided at a distal end of the arm, the connection portion or the hand releasably connecting the hand or the tweezer tool to the arm. | 02-13-2014 |
20140052296 | ROBOT SYSTEM AND METHOD FOR DRIVING THE SAME - Provided is a method of driving a system for a robot including obtaining scan data which includes information about at least one of a coordinate and a direction of the robot, estimating a plurality of location changes of the robot by matching a plurality of consecutive scan data pairs of the obtained scan data, generating a path of the robot by connecting the estimated location changes, estimating a position of a corrected instantaneous center of rotation (ICR), and correcting the plurality of consecutive scan data pairs based on the corrected ICR. | 02-20-2014 |
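Generating the robot's path by connecting estimated location changes, as in the entry above, amounts to composing per-step motion estimates into global poses. A hedged sketch under simplifying assumptions — planar poses, robot-frame deltas, and no ICR-based correction, none of which are specified by the patent:

```python
from math import cos, sin

def integrate_path(pose_deltas, start=(0.0, 0.0, 0.0)):
    """Chain per-step (dx, dy, dtheta) estimates, each expressed in the robot
    frame, into a global path of (x, y, theta) poses."""
    x, y, th = start
    path = [start]
    for dx, dy, dth in pose_deltas:
        x += dx * cos(th) - dy * sin(th)  # rotate the step into the world frame
        y += dx * sin(th) + dy * cos(th)
        th += dth
        path.append((x, y, th))
    return path

# Two unit steps: one forward, then one to the left, with no rotation.
print(integrate_path([(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)])[-1])
```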
20140052297 | Apparatus for Automated Removal of Workpieces Arranged in a Container - A device for automated removal of workpieces arranged in a container has a detector device for detecting the workpieces, and a picker, movable via a robot arm having at least six axes, for picking and removing the workpieces from the container. The device also has a controller for evaluating the data of the detector device, for path planning, and for controlling the robot arm and the picker. The robot arm has a picker arm element, with at least two further axes of movement, for moving the picker. | 02-20-2014 |
20140058564 | VISUAL FORCE FEEDBACK IN A MINIMALLY INVASIVE SURGICAL PROCEDURE - Methods of and a system for providing a visual representation of force information in a robotic surgical system. A real position of a surgical end effector is determined. A projected position of the surgical end effector if no force were applied against the end effector is also determined. Images representing the real and projected positions are output superimposed on a display. The offset between the two images provides a visual indication of a force applied to the end effector or to the kinematic chain that supports the end effector. In addition, tissue deformation information is determined and displayed. | 02-27-2014 |
20140067126 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND STORAGE MEDIUM - There is provided an information processing apparatus. An image including a target object is acquired. A coarse position and orientation of the target object is acquired. Information of a plurality of models which indicate a shape of the target object with different accuracy is held. A geometrical feature of the target object in the acquired image is associated with a geometrical feature indicated by at least one of the plurality of models placed at the coarse position and orientation. A position and orientation of the target object is estimated based on the result of association. | 03-06-2014 |
20140067127 | APPARATUS AND METHOD OF TAKING OUT BULK STORED ARTICLES BY ROBOT - An article take-out apparatus including: a 3D measuring device measuring surface positions of a plurality of articles stored in bulk in a 3D space so as to acquire position information of a plurality of 3D points; a connected set processing unit determining connected sets made by connecting 3D points which are close to each other, from the plurality of 3D points acquired by the 3D measuring device; an article identifying unit identifying positions and postures of the articles, based on position information of 3D points belonging to the connected sets; a hand position and posture processing unit determining positions and postures of the hand capable of taking out the identified articles; and a robot control unit controlling a robot to move the hand to the positions and postures determined by the hand position and posture processing unit and take out the articles. | 03-06-2014 |
20140074292 | ROBOT DEVICE, METHOD OF CONTROLLING ROBOT DEVICE, COMPUTER PROGRAM, AND PROGRAM STORAGE MEDIUM - Provided is an excellent robot device capable of reliably detecting the difference between dirt and a scratch on a lens of a camera and the difference between dirt and a scratch on a hand. | 03-13-2014 |
20140081458 | ROBOT SYSTEM AND ARTICLE MANUFACTURING METHOD - A robot system includes a controller that performs control such that corner portions of a glass substrate, which is different in size from a reference glass substrate, are detected by cameras in a state in which the glass substrate is held and shifted by an end effector of a robot arm. | 03-20-2014 |
20140081459 | DEPTH MAPPING VISION SYSTEM WITH 2D OPTICAL PATTERN FOR ROBOTIC APPLICATIONS - A depth mapping device equipped with a 2D optical pattern projection mounted on a tool attached to a robot may be used to measure distance between the tool and an object. Depth data generated by the depth mapping device can be used to generate an augmented-reality image to provide real-time information about the object position, orientation, or other measurements to an operator performing an industrial robotic process. Images also may be generated with a camera located on the robot. Real-time depth information may be used to prevent collision. Fast depth information acquisition may be used to modify robot position for better processing. Real-time data acquisition plus fast processing may provide augmented-reality images to operators for better robot programming. Location data of the industrial process on the object may be used to improve analysis of the industrial process data. | 03-20-2014 |
20140088765 | METHOD FOR INVALIDATING SENSOR MEASUREMENTS AFTER A PICKING ACTION IN A ROBOT SYSTEM - The invention relates to a method and system for invalidating sensor measurements after a sorting action on a target area of a robot sorting system. In the method there are obtained sensor measurements using sensors from a target area. A first image is captured of the target area using a sensor over the target area. A first sorting action is performed in the target area using a robot arm based on the sensor measurements and the first image. Thereupon, a second image of the target area is captured using a sensor over the target area. The first and the second images are compared to determine invalid areas in the target area. The invalid areas are avoided in future sorting actions based on the sensor measurements. | 03-27-2014 |
20140100696 | WORKING METHOD USING SENSOR AND WORKING SYSTEM FOR PERFORMING SAME - Disclosed is a working method using a sensor, which improves recognition of a component to increase mounting of the component and enhance productivity. The working method includes: extracting an object to be picked from a pile of objects using the sensor; picking the extracted object to move the picked object to a predetermined place; and estimating an angle of the moved object in the current position using the sensor. Accordingly, the working method can perform precise component recognition and posture estimation in two steps, a component picking step and a component recognition step, and can be effectively applied to a manufacturing line, thereby improving mounting of a component and enhancing productivity of a product. | 04-10-2014 |
20140107842 | HUMAN-TRACKING METHOD AND ROBOT APPARATUS FOR PERFORMING THE SAME - Provided are a human-tracking method and a robot apparatus. The human-tracking method includes receiving an image frame including a color image and a depth image, determining whether user tracking was successful in a previous image frame, and determining a location of a user and a goal position to which a robot apparatus is to move based on the color image and the depth image in the image frame, when user tracking was successful in the previous frame. Accordingly, a current location of the user can be predicted from the depth image, user tracking can be quickly performed, and the user can be re-detected and tracked using user information acquired in user tracking when detection of the user fails due to obstacles or the like. | 04-17-2014 |
20140114482 | ROOF INSPECTION SYSTEMS WITH AUTONOMOUS GUIDANCE - Devices, systems, and methods for inspecting and objectively analyzing the condition of a roof are presented. A vehicle adapted for traversing and inspecting an irregular terrain includes a chassis having a bottom surface that defines a higher ground clearance at an intermediate location, thereby keeping the center of mass low when crossing roof peaks. In another embodiment, the drive tracks include partially collapsible treads made of resilient foam. A system for inspecting a roof includes a lift system and a remote computer for analyzing data. Vehicles and systems may gather and analyze data, and generate revenue by providing data, analysis, and reports for a fee to interested parties. | 04-24-2014 |
20140121835 | SERPENTINE ROBOTIC CRAWLER - A robotic crawler having a non-dedicated smart control system is disclosed. Such a crawler can include a first drive subsystem, a second drive subsystem, a multi-degree of freedom linkage subsystem coupling the first and second drive subsystems, and a non-dedicated, smart control device removably supported about one of the first drive subsystem, the second drive subsystem, and the linkage subsystem. The smart control device is configured to initiate and control operational functionality within the robotic crawler upon being connected to the robotic crawler. The crawler can also include a communication subsystem functionally coupled between the smart control device and the serpentine robotic crawler, the communication subsystem facilitating control by the smart control device of at least one of the first drive subsystem, the second drive subsystem, and the linkage subsystem. | 05-01-2014 |
20140121836 | OBJECT PICKUP DEVICE AND METHOD FOR PICKING UP OBJECT - A pickup device for picking up a target object from a plurality of objects randomly piled up in a container, and for placing the target object in a predetermined posture to a target location is provided. The device includes an approximate position obtaining part for obtaining information on an approximate position of the target object, based on information on a height distribution of the objects in the container, which is obtained by a first visual sensor. The device also includes a placement operation controlling part for controlling a robot so as to bring the target object into a predetermined position and posture relative to the target location, based on information on a position and posture of the target object relative to a robot, which is obtained by a second visual sensor. | 05-01-2014 |
20140135989 | INDUSTRIAL ROBOT SYSTEM HAVING SENSOR ASSEMBLY - An industrial robot system includes an end effector connectable to a robot arm, a drive assembly, and a controller. The end effector includes a distal housing, a spindle assembly rotatable about a rotational axis, a drill bit rotatable about the rotational axis, and a sensor assembly. The sensor assembly includes a first light source, a second light source, and a photosensitive array. The first light source produces a first fan of light which is projected as a first line of light on the object surface. The second light source produces a second fan of light, which is projected as a second line of light on the object surface. The photosensitive array detects a first reflection line corresponding to the first line of light and a second reflection line corresponding to the second line of light. | 05-15-2014 |
20140135990 | REMOTE PRESENCE SYSTEM INCLUDING A CART THAT SUPPORTS A ROBOT FACE AND AN OVERHEAD CAMERA - A tele-presence system that includes a cart. The cart includes a robot face that has a robot monitor, a robot camera, a robot speaker, a robot microphone, and an overhead camera. The system also includes a remote station that is coupled to the robot face and the overhead camera. The remote station includes a station monitor, a station camera, a station speaker and a station microphone. The remote station can display video images captured by the robot camera and/or overhead camera. By way of example, the cart can be used in an operating room, wherein the overhead camera can be placed in a sterile field and the robot face can be used in a non-sterile field. The user at the remote station can conduct a teleconference through the robot face and also obtain a view of a medical procedure through the overhead camera. | 05-15-2014 |
20140148951 | MANIPULATOR DEVICE - A manipulator device has an arm portion and a hand portion. The hand portion includes one or more finger portions that manipulate a target object. Each finger portion includes a slip sensor and multiple contact sensors, with at least one contact sensor at a position proximate to the slip sensor and at least another contact sensor at a position remote from the slip sensor. When the contact sensors at the positions remote from the slip sensor detect contact of the target object and the contact sensors arranged at the positions proximate to the slip sensor do not detect contact, a position of the finger portion is moved by a distance corresponding to the distance between the contact sensors detecting contact of the target object and the contact sensors arranged at the positions proximate to the slip sensor such that a detecting position of the slip sensor is coincident with a position of the target object. | 05-29-2014 |
20140156078 | SERVER CONNECTIVITY CONTROL FOR TELE-PRESENCE ROBOT - A robot system with a robot that has a camera and a remote control station that can connect to the robot. The connection can include a plurality of privileges. The system further includes a server that controls which privileges are provided to the remote control station. The privileges may include the ability to control the robot, joining a multi-cast session, and the reception of audio/video from the robot. The privileges can be established and edited through a manager control station. The server may contain a database that defines groups of remote control stations that can be connected to groups of robots. The database can be edited to vary the stations and robots within a group. The system may also allow for connectivity with a remote control station during a user-programmable time window. | 06-05-2014 |
20140163736 | COLLISION AVOIDANCE DURING CONTROLLED MOVEMENT OF IMAGE CAPTURING DEVICE AND MANIPULATABLE DEVICE MOVABLE ARMS - A system and method for movement control includes a controller coupled to a computer-assisted surgical device having a first movable arm coupled to a manipulatable device having a working end and a second movable arm coupled to an image capturing device. The controller is configured to receive first configurations for the first movable arm; receive second configurations for the second movable arm; receive a plurality of images of the working end from the image capturing device; determine a position and an orientation of the working end; determine a first movable arm position and trajectory for the first movable arm; determine a second movable arm position and trajectory for the second movable arm; determine whether motion of the movable arms will result in an undesirable relationship between the movable arms; and send a movement command to the first or second movable arm to avoid the undesirable relationship. | 06-12-2014 |
20140163737 | ROBOT SYSTEM - A robot system includes: a robot including a camera unit shooting an object in a container, a hand gripping the object, and a contact detector detecting that the hand contacts the container; and a robot control device, which includes a control unit causing the hand to contact the container; a contact detection unit detecting by the contact detector that the hand contacts the container, and finding a contact position thereof; a first processing unit calculating a position of the container from a stereo image of the container acquired by the camera unit; a second processing unit calculating a difference between the position of the container calculated by the first processing unit and the contact position found by the contact detection unit as a correction amount; and a third processing unit correcting information on a position in a height direction of the object in the container based on the correction amount. | 06-12-2014 |
20140172166 | TREATMENT DEVICE FOR HEMIPLEGIA - The treatment device for hemiplegia comprises a robot which is put on the hemiplegic side of the body of a subject; a motion measurement unit for measuring the motion of the healthy side of the body of the subject; and a control unit which is connected with the robot and the motion measurement unit, wherein the control unit is configured to receive the healthy side's motion measured by the motion measurement unit and to control the robot, whereby the hemiplegic side having the robot put thereon moves in accordance with the motion of the healthy side of the body. | 06-19-2014 |
20140172167 | TEACHING DATA GENERATOR, ROBOT SYSTEM, AND METHOD FOR GENERATING TEACHING DATA - A teaching data generator includes a storage device. An arithmetic device includes a first window display section to cause a display device to display a first window displaying first images respectively corresponding to some pieces of work unit job data stored in the storage device and included in teaching data. The first images are arranged in an execution order of pieces of work respectively corresponding to the some pieces of the work unit job data. A first job editing section performs an editing operation including replacing the some pieces of the work unit job data with other pieces of the work unit job data stored in the storage device, and changing the execution order. A teaching data generation section generates the teaching data based on a display content of the first window changed in accordance with the editing operation. | 06-19-2014 |
20140180479 | Bagging With Robotic Arm - Systems and methods are disclosed for automatically or semi-automatically depositing retail items into a bag using a robotic arm. Embodiments of the present disclosure comprise a camera, an image processor, and a robotic arm control module to analyze and attempt to identify an item. Human intervention may be utilized to assist in item identification and/or robotic arm control. A human operator may visually identify the item and/or remotely control the robotic arm from a remote control station. | 06-26-2014 |
20140188278 | ROBOTIC SHOE - A robotic shoe includes a robot sole, a plurality of optical sensors, and projections. The robot sole has an underside capable of contacting the ground when in use. Mounting spaces are longitudinally spaced in the sole. The optical sensors are disposed in respective ones of the mounting spaces. The projections protrude from the underside of the sole, and are capable of contacting the ground at positions corresponding to the mounting spaces. | 07-03-2014 |
20140195053 | VISUALLY CONTROLLED END EFFECTOR - A visually controlled end effector is disclosed. The end effector comprises two or more operational members capable of picking up one or more randomly placed items. A crank is capable of actuation by a robot to orient a first one of said two or more operational members to pick up a first one of said one or more randomly placed items. The crank is further capable of actuation by the robot to orient a second one of said two or more operational members to pick up a second one of said one or more randomly placed items. The crank is further capable of orienting the first and second ones of the said two or more operational members for placement of the first and second ones of said randomly placed items into a desired oriented condition. A time savings of three robot transfers is thus made over prior art systems that transferred each individual product one by one. | 07-10-2014 |
20140207285 | SYSTEM AND METHOD FOR MONITORING ENTRY OF OBJECT INTO SURROUNDING AREA OF ROBOT - A monitoring system monitors entry of an object into a surrounding area of a robot. The monitoring system includes a visible light irradiation section, a sensor section, and a monitoring control unit. The visible light irradiation section irradiates visible light from a level higher than that of the robot toward at least an outer edge portion of at least one of an operating area, which is set up to enclose a movable range of the robot, and a predetermined area which is set up around the operating area. The sensor section monitors entry of a new object, which is not registered in advance, into the operating area. The monitoring control unit issues a request for stopping the robot to a robot control unit that controls the robot, when entry of a new object into the operating area is detected by the sensor section. | 07-24-2014 |
20140207286 | INTERFACING WITH A MOBILE TELEPRESENCE ROBOT - A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device. | 07-24-2014 |
20140214208 | ROBOT DEVICE, METHOD OF CONTROLLING THE SAME, COMPUTER PROGRAM, AND ROBOT SYSTEM - Provided is a robot device including an image input unit for inputting an image of surroundings, a target object detection unit for detecting an object from the input image, an object position detection unit for detecting a position of the object, an environment information acquisition unit for acquiring surrounding environment information of the position of the object, an optimum posture acquisition unit for acquiring an optimum posture corresponding to the surrounding environment information for the object, an object posture detection unit for detecting a current posture of the object from the input image, an object posture comparison unit for comparing the current posture of the object to the optimum posture of the object, and an object posture correction unit for correcting the posture of the object when the object posture comparison unit determines that there is a predetermined difference or more between the current posture and the optimum posture. | 07-31-2014 |
20140214209 | COMPLEX DEVICE AND ROBOT HAND DRIVE CONTROL APPARATUS - A complex device includes: a substrate having a thick portion, a cavity and a membrane for bridging the cavity; and multiple piezoelectric elements having a lower electrode, a piezoelectric film and an upper electrode. A part of the piezoelectric elements has a projecting portion arranged on the upper electrode. | 07-31-2014 |
20140222205 | ELECTRONIC DOCKING SYSTEM AND METHOD FOR ROBOTIC POSITIONING SYSTEM - An apparatus includes a robotic positioning device and a locating mat. The locating mat includes a location pattern and can be disposed on a floor at a desired position relative to a movable cradle of an imaging system. The robotic positioning device is configured to be disposed, at least partially, above the locating mat. The robotic positioning device includes a docking device that includes an optical device and a guide manipulator supported on the docking device. The guide manipulator can be positioned relative to the movable cradle based, at least partially, on image data associated with the optical device and the location pattern of the locating mat. The guide manipulator can position an instrument guide relative to a patient disposed on the movable cradle. | 08-07-2014 |
20140222206 | Polarized Enhanced Confidentiality in Mobile Camera Applications - A system for selectively controlling visibility of information and for reducing or eliminating light energy in a robotic telepresence environment is provided. The system includes a robotic device in a work environment that is occupied or operated by a pilot at a station in a workspace remote from the location of the robot in the work environment, a camera operably attached to the robotic device, a primary polarizing filter adjustably attached to the camera, a viewing area, at least one light source remote from the camera, and at least one secondary polarizing filter. The primary and at least one secondary polarizing filter can be adjusted to vary the lighting and artifacts seen by the pilot. | 08-07-2014 |
20140249676 | ADAPTING ROBOT BEHAVIOR BASED UPON HUMAN-ROBOT INTERACTION - Technologies pertaining to human-robot interaction are described herein. The robot includes a computer-readable memory that comprises a model that, with respect to successful completions of a task, is fit to observed data, where at least some of such observed data pertains to a condition that is controllable by the robot, such as position of the robot or distance between the robot and a human. A task that is desirably performed by the robot is to cause the human to engage with the robot. The model is updated while the robot is online, such that behavior of the robot adapts over time to increase the likelihood that the robot will successfully complete the task. | 09-04-2014 |
20140249677 | ROBOT - A robot includes a gripping section having a pair of finger sections, a main body section to which the pair of finger sections are attached, having one end sections of the pair of finger sections rotatably connected to each other around a first rotating shaft disposed at a position separate from the main body section, and adapted to open and close the pair of finger sections by swinging the other side of the pair of finger sections on a plane parallel to a mounting surface on which an object is mounted centered on the first rotating shaft to thereby grip the object, a moving device adapted to relatively move the object and the gripping section, and a control device adapted to control the moving device to move the gripping section relatively toward the object, and grip the object with the gripping section at at least three contact points. | 09-04-2014 |
20140257563 | ROBOT CLEANER - A robot cleaner includes a main body, a light transmitting unit, an image sensor, a base, a rotation drive unit, a tilting unit, and a tilting drive unit. The light transmitting unit emits light. The light emitted from the light transmitting unit and reflected or scattered is formed on the image sensor. The base supports the light transmitting unit and the image sensor and is rotatably disposed in the main body. The rotation drive unit rotates the base. The tilting unit tilts the light transmitting unit and the image sensor. | 09-11-2014 |
20140257564 | ROBOT CLEANER - A robot cleaner includes a main body, a light transmitting unit, an image sensor, a base, a rotation drive unit, and an elevation drive unit. The light transmitting unit emits light. The light reflected or scattered by an obstacle is sensed by the image sensor. The base supports the light transmitting unit and the image sensor and is rotatably and vertically movably disposed in the main body. The rotation drive unit rotates the base. The elevation drive unit allows the base to retract or protract from the main body. | 09-11-2014 |
20140257565 | ROBOT CLEANER - A robot cleaner is disclosed. The robot cleaner includes a cleaner body, a position sensor disposed in the cleaner body, the position sensor including a light transmission unit to emit light and a light reception unit to receive light reflected or scattered from an obstacle after being emitted from the light transmission unit, and a transparent member to transmit the light emitted from the light transmission unit and the light to be received by the light reception unit. | 09-11-2014 |
20140277731 | ROBOT PICKING SYSTEM, CONTROL DEVICE, AND METHOD OF MANUFACTURING A WORKPIECE - A robot picking system includes a robot that picks up a work in a first stocker accommodating a plurality of works, a control device that controls an operation of the robot, and an image acquiring device that acquires image data including information related to the plurality of works. The control device includes a candidate data generating unit that generates candidate data including information of candidate works that are candidates of a picking-up target using the image data, and a target work selecting unit that selects a target work that is a picking-up target from the candidate works using the candidate data. | 09-18-2014 |
20140277732 | ROBOT APPARATUS - A robot apparatus that performs the work of attaching a cable includes a gripper that grips the cable having first and second ends, the first end being fixed; an arm main body that guides the second end of the cable being gripped by the gripper to the interior of a guide area; a force sensor that is provided on the arm main body and detects that the gripper has come into contact with the second end of the cable; a camera that captures an image of the guide area when the force sensor detects that the gripper has come into contact with the second end of the cable; and an image processor that detects the posture of the second end of the cable based on the captured image. The robot apparatus performs the work of attaching the cable in accordance with the detected posture of the connector. | 09-18-2014 |
20140277733 | ROBOT SYSTEM AND METHOD FOR PRODUCING TO-BE-PROCESSED MATERIAL - A robot system includes robot, an image capture device, a plurality of illumination devices, and a control device. The robot is configured to perform a predetermined work on a to-be-processed material. The image capture device is configured to capture an image of the to-be-processed material and has a dynamic range. The plurality of illumination devices are configured to illuminate the to-be-processed material. The control device is configured to control at least one illumination device among the plurality of illumination devices to keep an amount of light received by the image capture device within the dynamic range of the image capture device. | 09-18-2014 |
20140277734 | ROBOT SYSTEM AND A METHOD FOR PRODUCING A TO-BE-PROCESSED MATERIAL - A robot system includes a first robot, a second robot, a stocker, and a controller. The first robot includes a first sensor. The second robot includes a second sensor. The stocker is configured to accommodate a plurality of workpieces that are to be held by at least one of the first robot and the second robot. The controller is configured to control the first robot and the second robot. When the first robot holds a first workpiece among the plurality of workpieces, the controller is configured to control the first sensor to recognize shapes of the plurality of workpieces in the stocker and control the second sensor to detect a holding state of the first workpiece held by the first robot. | 09-18-2014 |
20140277735 | APPARATUS AND METHODS FOR PROVIDING A PERSISTENT COMPANION DEVICE - A method includes providing a telecommunications enabled robotic device adapted to persist in an environment of a user, receiving an instruction to photograph one or more persons in the environment according to a time parameter and photographing the one or more persons in accordance with the time parameter resulting in one or more photographs. | 09-18-2014 |
20140277736 | GEOMETRICALLY APPROPRIATE TOOL SELECTION ASSISTANCE FOR DETERMINED WORK SITE DIMENSIONS - A robotic system includes a processor that is programmed to determine and cause work site measurements for user specified points in the work site to be graphically displayed in order to provide geometrically appropriate tool selection assistance to the user. The processor is also programmed to determine an optimal one of a plurality of tools of varying geometries for use at the work site and to cause graphical representations of at least the optimal tool to be displayed along with the work site measurements. | 09-18-2014 |
20140277737 | ROBOT DEVICE AND METHOD FOR MANUFACTURING PROCESSING OBJECT - A robot device according to one embodiment of the present disclosure includes: a robot controller configured to operate a robot based on a motion program specifying a motion of the robot; a robot imaging unit configured to acquire image data of an image including the robot; and a data processor. The data processor includes: a virtual-space-data holder configured to hold virtual space data including information on a virtual object in a virtual space, the virtual space simulating a real working space of the robot, the virtual object simulating an object present in the real working space; and an augmented-reality-space-data generator configured to generate augmented-reality-space data by use of the image data and the virtual space data. | 09-18-2014 |
20140277738 | CONTROL OF MEDICAL ROBOTIC SYSTEM MANIPULATOR ABOUT KINEMATIC SINGULARITIES - A medical robotic system includes an entry guide with articulatable instruments extending out of its distal end, an entry guide manipulator providing controllable four degrees-of-freedom movement of the entry guide relative to a remote center, and a controller configured to manage operation of the entry guide manipulator in response to operator manipulation of one or more input devices. As the entry guide manipulator approaches a yaw/roll singularity, the controller modifies its operation to allow continued movement of the entry guide manipulator without commanding excessive joint velocities while maintaining proper orientation of the entry guide. | 09-18-2014 |
20140288709 | ROBOT CLEANER AND METHOD OF OPERATING THE SAME - A robot cleaner includes a main body, a traveling unit, a cleaning unit, a sensor unit, and a controller. The traveling unit allows the main body to travel. The cleaning unit suctions foreign substances around the main body during the traveling. The sensor unit is rotatable and senses an obstacle using light reflected or scattered by the obstacle. The controller controls the traveling unit so as to travel along a traveling path and controls the cleaning unit so as to perform cleaning. Here, the sensor unit includes a first mode and a second mode that are set to differ from each other in sensitivity with respect to the reflected or scattered light, and the controller changes the sensitivity of the sensor unit according to a traveling mode. | 09-25-2014 |
20140288710 | ROBOT SYSTEM AND CALIBRATION METHOD - A robot system includes: a robot arm; a camera; a calibration jig with a marker that allows image recognition; and a calibration apparatus configured to derive a correlation between camera coordinates being coordinates in a photographed image and robot coordinates using the robot arm as a reference. The robot arm is configured to have a posture corresponding to a relative position of the camera with respect to the marker. The calibration apparatus sets a plurality of photographing positions by changing the relative position, acquires the camera coordinates of the marker in the plurality of photographing positions and information of the posture of the robot arm, and derives the correlation. | 09-25-2014 |
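The calibration entry above amounts, in the planar case, to fitting a linear map from camera pixel coordinates to robot coordinates using marker observations taken at several photographing positions. A minimal least-squares sketch — the planar-affine assumption and all names are mine, not from the filing:

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system.
    n = 3
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_affine(camera_pts, robot_pts):
    """Least-squares planar affine map (u, v) -> (x, y) from point pairs,
    via the normal equations with design rows [u, v, 1]."""
    AtA = [[0.0] * 3 for _ in range(3)]
    Atx, Aty = [0.0] * 3, [0.0] * 3
    for (u, v), (x, y) in zip(camera_pts, robot_pts):
        row = (u, v, 1.0)
        for i in range(3):
            for j in range(3):
                AtA[i][j] += row[i] * row[j]
            Atx[i] += row[i] * x
            Aty[i] += row[i] * y
    px = solve3(AtA, Atx)  # x = px[0]*u + px[1]*v + px[2]
    py = solve3(AtA, Aty)  # y = py[0]*u + py[1]*v + py[2]
    def cam_to_robot(u, v):
        return (px[0]*u + px[1]*v + px[2], py[0]*u + py[1]*v + py[2])
    return cam_to_robot
```

A real system would use the full hand-eye calibration machinery; this only illustrates the "derive the correlation from multiple photographing positions" step.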
20140288711 | ROBOT SYSTEM AND METHOD FOR MANUFACTURING TO-BE-PROCESSED-MATERIAL - A robot system includes a robot, an image capture device, and a setting device. The image capture device is fixed at a position external to the robot, and configured to capture an image of a range including the robot and a vicinity of the robot. The setting device is configured to generate area information on an area defining an operation of the robot based on the image captured by the image capture device. | 09-25-2014 |
20140324220 | REMOTE TESTING METHOD AND SYSTEM - In one example, a method is provided for a computing device to facilitate testing of an electronic apparatus. The method includes receiving a request from an apparatus coupled to a network to which the computing device is also coupled. The method further includes retrieving a command for a robot controlled by the computing device from the request, and configuring the robot to physically interface with the electronic apparatus to perform one or more tests according to the command. | 10-30-2014 |
20140336819 | ROBOTIC DEVICE TESTER - A system, method, and device may include software and hardware which simplify and quicken configuration of the system for testing a device, enhance testing procedures which may be performed, and provide data via which to easily discern a cause and nature of an error which may result during testing. A camera may capture still images of a display screen of a tested device and another camera may capture video images of the tested device and a partner device. A wizard may be used to generate a configuration file based on one previously generated for a similar device. A mount for a tested device may be structured so that: it is suitable for mounting thereon a plurality of differently structured devices; and adjustments in a vertical direction and a horizontal direction in a plane and adjustments of an angle of the device relative to the plane may be easily made. | 11-13-2014 |
20140343728 | MULTI-JOINT UNDERWATER ROBOT HAVING COMPLEX MOVEMENT FUNCTIONS OF WALKING AND SWIMMING AND UNDERWATER EXPLORATION SYSTEM USING SAME - Disclosed is an underwater exploration system using a multi-joint underwater robot having a novel complex movement concept, in which the multi-joint underwater robot moves by walking or swimming with multi-joint legs close to the seafloor, unlike a conventional underwater robot that obtains thrust through a propeller scheme. The underwater exploration system includes the multi-joint underwater robot having the complex movement function, a depressor, and a mother ship to store data on the underwater state transmitted from the multi-joint underwater robot and to monitor and control a movement direction of the multi-joint underwater robot. The depressor is connected to the mother ship through a primary cable, the multi-joint underwater robot is connected to the depressor through a secondary cable, and the resistance force of the primary cable is applied to the depressor without being transmitted to the multi-joint underwater robot. | 11-20-2014 |
20140350727 | Methods and Systems for Providing Functionality of an Interface to Control Orientations of a Camera on a Device - Methods and systems for providing functionality of an interface to control orientations of a camera on a device are provided. In one example, a method includes receiving an input on an interface indicating a command for an orientation of a camera on a robotic device, and the interface may be provided on a device remote from the robotic device. An indicator may be provided on the interface representing a location of the input, and the indicator may be representative of the command for the orientation of the camera on the robotic device. The method may also include determining that the location of the input on the interface is within a distance threshold to a pre-set location on the interface, and repositioning the indicator on the interface to be at the pre-set location. | 11-27-2014 |
20140365010 | WORKPIECE DETECTOR, ROBOT SYSTEM, METHOD FOR PRODUCING TO-BE-PROCESSED MATERIAL, METHOD FOR DETECTING WORKPIECE - A workpiece detector includes a camera to acquire a two-dimensional image of a search range within which workpieces are disposed. A three-dimensional sensor detects a three-dimensional shape of a three-dimensional detection area. A workpiece extraction section processes the two-dimensional image to extract candidate workpieces. An area setting section sets three-dimensional detection areas respectively corresponding to the candidate workpieces. A prioritizing section sets an order of priority to the three-dimensional detection areas to give higher priority to one three-dimensional detection area containing more of the candidate workpieces. A sensor control section controls the three-dimensional sensor to detect the three-dimensional shape of each three-dimensional detection area in the order of priority. Every time the three-dimensional shape is detected, a workpiece detection section searches the workpieces based on the detected three-dimensional shape to detect a pickable workpiece. | 12-11-2014 |
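The prioritization step in the entry above — ranking the three-dimensional detection areas so that areas containing more candidate workpieces are scanned first — can be sketched as follows. The axis-aligned 2-D areas and all names are illustrative assumptions, not the application's representation:

```python
def prioritize_areas(areas, candidates):
    """Order detection areas so that areas containing more candidate
    workpieces come first. Areas are hypothetical axis-aligned boxes
    (x0, y0, x1, y1); candidates are (x, y) points."""
    def contains(box, pt):
        x0, y0, x1, y1 = box
        x, y = pt
        return x0 <= x <= x1 and y0 <= y <= y1
    counted = [(sum(contains(a, c) for c in candidates), i)
               for i, a in enumerate(areas)]
    # Higher candidate count first; ties keep the original order.
    counted.sort(key=lambda t: (-t[0], t[1]))
    return [areas[i] for _, i in counted]
```

The sensor would then sweep the returned list in order, stopping as soon as a pickable workpiece is found, which is what makes the ordering worthwhile.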
20140365011 | Robot and Adaptive Placement System and Method - A method including moving a substrate, located on a first end effector of a robot, from a first location towards a second location by the robot; determining location of a fiducial on the substrate while the substrate is being moved from the first location towards the second location; comparing the determined location of the fiducial with a reference fiducial location while the robot is moving the substrate from the first location towards the second location. | 12-11-2014 |
20140371909 | CLEANING ROBOT AND METHOD FOR CONTROLLING THE SAME - A cleaning robot includes a main body, a moving assembly to move the main body, a cleaning tool provided at a bottom part of the main body to collect foreign substances on a floor, an imager to collect images around the main body and a controller to recognize motion of a hand by performing image processing of the collected images, identify a control command corresponding to the motion of the hand, plan a moving direction and a moving distance of the main body as movement information based on the control command, and control operations of the moving assembly and the cleaning tool based on the planned movement information. Since the user directly controls movement of the cleaning robot, it is possible to improve interactivity between the user and the cleaning robot, reduce the user's labor, and increase convenience. | 12-18-2014 |
20140371910 | ROBOT SYSTEM AND ROBOT CONTROL METHOD - A robot system includes a robot body; a camera mounted on the robot body and capable of photographing a work piece; and a control device for driving and controlling the robot body based on a trajectory to an instruction point, which is set in advance, and, when the camera arrives at an area in which the camera is capable of photographing the work piece during this driving and controlling, driving and controlling the robot body so that the camera moves linearly toward the work piece, taking an image of the work piece with the camera while the camera is moving linearly, and measuring a position of the work piece from the taken image. | 12-18-2014 |
20140371911 | Pre-Screening for Robotic Work - A solution for pre-screening an object for further processing is provided. A pre-screening component can acquire image data of the object and process the image data to identify reference target(s) corresponding to the object, which are visible in the image data. Additionally, the pre-screening component can identify, using the reference target(s), the location of one or more components of the object. The pre-screening component can provide pre-screening data for use in further processing the object, which includes data corresponding to the set of reference targets and the location of the at least one component. A reference target can be, for example, an easily identifiable feature of the object and the component can be relevant for performing an operation on the object. | 12-18-2014 |
20140379130 | MOVABLE MEDICAL APPARATUS AND METHOD FOR CONTROLLING MOVEMENT OF THE SAME - A movable medical apparatus may include a sensing unit to sense force externally applied to the movable medical apparatus, a control unit to generate one or more control signals to move, rotate, or stop the movable medical apparatus in accordance with the sensed force, and an apparatus moving unit to move, rotate, or stop the movable medical apparatus in accordance with the one or more control signals. A method for controlling movement of the movable medical apparatus may be implemented by the movable medical apparatus by sensing a force externally applied to the movable medical apparatus, generating a control signal based on the sensed force, and controlling movement of the movable medical apparatus based on the generated control signal. | 12-25-2014 |
20150012135 | POSITION ADJUSTING SYSTEM AND METHOD - A position adjusting device includes a movable mechanical arm, a suction mechanism, an adjusting mechanism, a bottom plate, a first camera, a second camera, and a processor. The processor controls the movable mechanical arm and the suction mechanism to move to a top of the first camera in response to a user operation, switches the first camera from a disabled state to an enabled state, acquires the first image captured by the first camera, and calculates a first angle between the first image and a first predetermined image. The processor adjusts the position of a second element based on the first angle and acquires a second angle between a second image of the second element and a second predetermined image. When the first angle is not the same as the second angle, the processor adjusts the position of the second element until the two angles match. | 01-08-2015 |
20150012136 | MOBILE TELE-PRESENCE SYSTEM WITH A MICROPHONE SYSTEM - A remote controlled robot system that includes a robot and a remote control station. The robot includes a binaural microphone system that is coupled to a speaker system of the remote control station. The binaural microphone system may include a pair of microphones located at opposite sides of a robot head. The location of the microphones roughly coincides with the location of ears on a human body. Such microphone location creates a mobile robot that more effectively simulates the tele-presence of an operator of the system. The robot may include two different microphone systems and the ability to switch between systems. For example, the robot may also include a zoom camera system and a directional microphone. The directional microphone may be utilized to capture sound from a direction that corresponds to an object zoomed upon by the camera system. | 01-08-2015 |
20150019014 | DEVICE AND METHOD FOR QUALITY INSPECTION OF AUTOMOTIVE PART - A device for quality inspection of an automotive part includes i) a jig unit for securing and supporting an inspection object, ii) a rotary vision imager mounted to a fore end of a robot arm and rotatable in a turret fashion, for vision photographing a plurality of processed portions of the inspection object, and iii) a controller for obtaining vision data from the rotary vision imager, and analyzing and processing the vision data to extract a defect of the processed portion. | 01-15-2015 |
20150019015 | UNIT AND METHOD FOR THE AUTOMATIC HOOKING OF PARTS ONTO COMPLEX SUPPORTS - A device for automatic hooking of parts onto complex supports including | 01-15-2015 |
20150032262 | APPARATUS FOR IDENTIFYING LAYER NUMBER OF AN OBJECT IN A CONTAINER AND SYSTEM FOR AUTOMATICALLY TAKING OUT AN OBJECT - An apparatus | 01-29-2015 |
20150057801 | Real Time Approximation for Robotic Space Exploration - A system and method for guidance of a moving robotic device through an approximated real time (ART) virtual video stream is presented. The system and method includes at least one camera for collecting images of a terrain in a remote location, at least one terrain data collecting device for collecting data from a remote location, a memory for storing images from the plurality of cameras, a communication device for transmitting the images and data over a path, and a computer configured to calculate a delay between the cameras and the receiver. The calculated delay causes the computer to retrieve images and data from the receiver and memory and consequently generate an approximate real-time video and data stream for displaying the terrain just ahead of a moving robotic device at a distance proportional to the calculated delay, and the ART video and data stream is used to guide the moving robotic device. | 02-26-2015 |
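One way to read the "distance proportional to the calculated delay" idea above is dead reckoning: predict the robot's travel forward by the measured link latency and select the buffered frame nearest that predicted distance. A toy sketch under a constant-velocity assumption — the representation and names are mine, not the application's:

```python
def predicted_pose(position, velocity, delay_s):
    """Dead-reckon a pose forward by the measured link delay,
    assuming constant velocity (a simplification of the ART idea)."""
    return tuple(p + v * delay_s for p, v in zip(position, velocity))

def frame_for_delay(frames, speed, delay_s):
    """Pick the stored frame whose capture distance best matches the
    distance covered during the delay. `frames` maps
    distance-along-path -> frame id (a hypothetical representation)."""
    target = speed * delay_s
    return min(frames.items(), key=lambda kv: abs(kv[0] - target))[1]
```

With a 2.6 s round-trip delay and a 2 m/s rover, the operator would be shown imagery captured roughly 5.2 m ahead of the last confirmed position.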
20150057802 | ROBOTIC ACTIVITY SYSTEM USING COLOR PATTERNS - A robotic activity system, which includes a board and an autonomous robotic device, is described herein. The board may display a line and one or more color patterns. The robotic device may traverse the line using one or more integrated sensors. For example, sensor data may include light intensity data for visible light reflected or emitted by the board. The sensor data may be analyzed to 1) ensure the robotic device follows the line and/or 2) detect color sequences associated with color patterns shown on the board. Upon detection of a color sequence, the robotic device may attempt to match the color sequence with a known color pattern definition. The color pattern definition may be associated with a function to be performed by the robotic device. Using multiple sets of color patterns and associated functions allows the robotic device to move in a variable and potentially unpredictable fashion. | 02-26-2015 |
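The color-sequence detection described above can be illustrated with a small sketch: collapse raw sensor readings into a run-length color sequence, then look it up in a table of pattern definitions mapped to robot functions. The table contents and all names are hypothetical, not from the filing:

```python
def collapse_runs(readings):
    """Collapse consecutive identical color readings into a sequence,
    e.g. ['r','r','g','g','b'] -> ['r','g','b']."""
    seq = []
    for color in readings:
        if not seq or seq[-1] != color:
            seq.append(color)
    return seq

def match_pattern(sequence, definitions):
    """Match a detected color sequence against known pattern definitions
    and return the associated robot function name, or None."""
    return definitions.get(tuple(sequence))

# Illustrative pattern table: color sequence -> robot behavior.
DEFINITIONS = {
    ("r", "g", "b"): "spin",
    ("b", "b", "r"): "u_turn",
}
```

Driving over a red-green-blue patch would then trigger the `spin` behavior, while unrecognized sequences are ignored and the robot keeps following the line.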
20150066208 | ROBOTIC OR MECHANICAL HAND WITH ALARM FUNCTION - A robotic or mechanical hand includes a number of infrared distance sensors, an alarm unit, a driving unit, and a processing unit. When the presence of a person or object within a certain adjustable proximity is detected, the hand raises an alarm for a preset time period and directs the driving unit to stop the motion of the mechanical hand. | 03-05-2015 |
20150073595 | CONTROL APPARATUS AND CONTROL METHOD FOR MASTER SLAVE ROBOT, ROBOT, CONTROL PROGRAM FOR MASTER SLAVE ROBOT, AND INTEGRATED ELECTRONIC CIRCUIT FOR CONTROL OF MASTER SLAVE ROBOT - A control apparatus for a master slave robot causes a force information correcting unit to correct force information in accordance with a feature of a target object on a screen, based on target object information calculated by a target object information calculation unit. An operator can thus apply appropriate force while watching a picture projected on a display to perform a task. | 03-12-2015 |
20150073596 | CONTROL APPARATUS AND CONTROL METHOD FOR MASTER SLAVE ROBOT, ROBOT, CONTROL PROGRAM FOR MASTER SLAVE ROBOT, AND INTEGRATED ELECTRONIC CIRCUIT FOR CONTROL OF MASTER SLAVE ROBOT - A master slave robot receives force presentation according to a picture watched by an operator operating the master slave robot. The control apparatus for the master slave robot causes a force information correcting unit to correct force information in accordance with magnification percentage information acquired by a displayed information acquiring unit, such that the force information is increased as the magnification percentage information becomes larger. An operator can thus apply appropriate force while watching the picture projected on a display to perform a task. | 03-12-2015 |
20150094855 | IMITATION LEARNING METHOD FOR A MULTI-AXIS MANIPULATOR - The present invention concerns an imitation learning method for a multi-axis manipulator | 04-02-2015 |
20150094856 | UNCALIBRATED VISUAL SERVOING USING REAL-TIME VELOCITY OPTIMIZATION - A robotic control method for a camera | 04-02-2015 |
20150100162 | SYSTEM AND METHOD FOR CONTROLLING A VISION GUIDED ROBOT ASSEMBLY - A method includes the following steps: actuating a robotic arm to perform an action at a start position; moving the robotic arm from the start position toward a first position; determining from a vision process method if a first part from the first position will be ready to be subjected to a first action by the robotic arm once the robotic arm reaches the first position; commencing the execution of the vision process method for determining the position deviation of the second part from the second position and the readiness of the second part to be subjected to a second action by the robotic arm once the robotic arm reaches the second position; and performing a first action on the first part using the robotic arm with the position deviation of the first part from the first position predetermined by the vision process method. | 04-09-2015 |
20150105907 | ROBOT CONTROLLER, ROBOT SYSTEM, ROBOT, ROBOT CONTROL METHOD, AND PROGRAM - A robot includes a control unit that controls a movable unit of the robot to move an end point of the movable unit closer to a target position, and an image acquisition unit that acquires a target image as an image containing the end point when the end point is in the target position, and a current image as an image containing the end point when the end point is in a current position. The control unit controls movement of the movable unit based on the current image and the target image and output from a force detection unit that detects a force acting on the movable unit. | 04-16-2015 |
20150105908 | ROBOTIC PLACEMENT AND MANIPULATION WITH ENHANCED ACCURACY - Systems and methods for providing precise robotic operations without the need for special or task-specific components. In one implementation, a spatial adjustment system, physically separate from the robotic manipulator, supports the target workpiece and works in concert with the robotic manipulator to perform tasks with high spatial precision. | 04-16-2015 |
20150120054 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - An orientation designated in advance as an orientation that a grip unit is to take to grip an object having a shape of rotational symmetry with respect to at least one axis is acquired as a reference orientation. The relative position and orientation of the object and grip unit when the grip unit grips the object is acquired as a taught position and orientation. The position and orientation of the object is recognized from an image, and an initial position and initial orientation in which the grip unit grips the object are derived based on the recognized and taught positions and orientations. A grip orientation to grip the object is decided based on the reference orientation and the initial orientation, and a grip position and orientation in which the grip unit grips the object is decided based on the grip orientation and the initial position. | 04-30-2015 |
20150120055 | ROBOT CONTROL DEVICE, ROBOT SYSTEM, AND ROBOT - An image acquisition unit acquires an image including an object, and a controller starts a visual servo using the acquired image, on the basis of at least one of an error in calibration, an error in installation of a robot, an error resulting from the rigidity of the robot, an error of a position where the robot has gripped the object, an error regarding imaging, and an error regarding a work environment. Additionally, the controller starts the visual servo when the distance between one point of a working unit of the robot and the object is equal to or greater than 2 mm. | 04-30-2015 |
20150120056 | MOBILE ROBOT - Disclosed is a mobile robot including a main body and a pattern irradiation unit emitting a cross-shaped optical pattern including a horizontal line optical pattern and a vertical line optical pattern intersecting the horizontal line optical pattern. The pattern irradiation unit includes a light source and a lens converting light emitted from the light source into the cross-shaped optical pattern, the lens includes convex cells on an incidence surface upon which the emitted light is incident, the incidence surface is divided into a first area converting the light emitted from the light source into the horizontal line optical pattern and a second area converting the light emitted from the light source into the vertical line optical pattern, vertical convex cells extended in parallel in the vertical direction are formed in the first area, and horizontal convex cells extended in parallel in the horizontal direction are formed in the second area. | 04-30-2015 |
20150120057 | Mobile Robot - A mobile robot including a robot body, a drive system supporting the robot body, and a controller in communication with the drive system. The robot also includes an actuator moving a portion of the robot body through a volume of space adjacent the mobile robot and a sensor pod in communication with the controller. The sensor pod includes a collar rotatably supported and having a curved wall formed at least partially as a surface of revolution about a vertical axis. The sensor pod also includes a volumetric point cloud sensor housed by the collar and observing the volume of space adjacent the robot from within the collar along an observation axis extending through the curved wall. A collar actuator rotates the collar and the volumetric point cloud sensor together about the collar axis. | 04-30-2015 |
20150127160 | ROBOT, ROBOT SYSTEM, AND ROBOT CONTROL APPARATUS - A robot includes a holding unit configured to hold an object, an image pickup unit, and a predetermined first portion of the robot. The image pickup unit picks up images of the holding unit and the object using the first portion as a background. | 05-07-2015 |
20150127161 | APPARATUS AND METHOD FOR PICKING UP ARTICLE RANDOMLY PILED USING ROBOT - An article pickup apparatus configured so as to measure surface positions of articles by a three-dimensional measurement instrument to acquire position information of three-dimensional points, calculate a density distribution indicating a degree of a distribution of the three-dimensional points in a three-dimensional space based on the measured position information, calculate a density local maximum position where a density is locally maximized based on the density distribution, calculate a hand position posture which is a position and a posture of the hand capable of picking up an article at the density local maximum position based on the density local maximum position calculated, and control the robot so as to move the hand to the hand position posture to pick up the article. | 05-07-2015 |
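The density local-maximum search in the entry above can be approximated by voxel binning: count the measured 3-D points per grid cell and take the centre of the fullest cell as the pickup target. A crude sketch — the cell size and all names are assumptions, not from the filing:

```python
from collections import Counter

def densest_cell(points, cell=0.05):
    """Bin 3-D points into a voxel grid and return the centre of the
    densest voxel -- a crude stand-in for the abstract's density
    local-maximum search. `cell` is the voxel edge length in metres."""
    counts = Counter(
        (int(x // cell), int(y // cell), int(z // cell))
        for x, y, z in points
    )
    (ix, iy, iz), _ = counts.most_common(1)[0]
    # Voxel centre as the candidate hand position.
    return ((ix + 0.5) * cell, (iy + 0.5) * cell, (iz + 0.5) * cell)
```

A production system would also derive a hand posture at that position and check reachability, as the abstract describes; this only shows the density step.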
20150127162 | APPARATUS AND METHOD FOR PICKING UP ARTICLE RANDOMLY PILED USING ROBOT - An article pickup apparatus configured so as to set a grip unit model including a substantial region of the grip unit in an opened state and a grip region inside the substantial region, set position posture candidates of the grip unit, calculate a grip success possibility of any of the articles by the grip unit in each of the position posture candidates based on the position information acquired by a three-dimensional measurement instrument and the grip unit model, select a position posture candidate from the position posture candidates based on the grip success possibility and set it as a grip unit position posture, and control the robot so as to move the grip unit to the grip unit position posture to pick up any of the articles. | 05-07-2015 |
20150134115 | Commanding A Mobile Robot Using Glyphs - A method of operating a robot includes receiving image data from an image capture device of the robot. The image data is representative of a glyph viewed by the image capture device on the display of a computing device within a field of view of the image capture device. The method further includes determining, at a controller, a command message based on the glyph represented in the image data and issuing a command to at least one resource or component of the robot based on the command message. | 05-14-2015 |
20150148960 | Robots Comprising Projectors For Projecting Images On Identified Projection Surfaces - Robots including projectors for projecting images on identified projection surfaces are disclosed. A robot includes a housing, an electronic control unit coupled to the housing, a projector coupled to the housing, a human recognition module coupled to the housing, and a projection surface identification module coupled to the housing. The projector, the human recognition module, and the surface identification module are communicatively coupled with the electronic control unit. The electronic control unit includes a non-transitory memory that stores a set of machine readable instructions and a processor for executing the machine readable instructions. When executed by the processor, the machine readable instructions cause the robot to recognize a human using the human recognition module, identify a projection surface using the projection surface recognition module, and project an image on the identified projection surface with the projector. | 05-28-2015 |
20150290812 | ROBOTIZED ISLAND AND RELATIVE METHOD FOR THE AUTOMATIC LABELLING OF BEAMS OF ROUND STEEL, OF BILLETS EXITING CONTINUOUS CASTING AND OF WIRE RODS EXITING THE EVACUATION LINE - Robotized island | 10-15-2015 |
20150298317 | Interfacing With A Mobile Telepresence Robot - A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device. | 10-22-2015 |
20150314449 | INTERFACING WITH A MOBILE TELEPRESENCE ROBOT - A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device. | 11-05-2015 |
20150314452 | INFORMATION PROCESSING APPARATUS, METHOD THEREFOR, MEASUREMENT APPARATUS, AND WORKING APPARATUS - A shape model of an object to be measured is held. An image of the object obtained by a first device is inputted. Range information obtained by a second device is inputted, and the range information indicates surface positions of the object. A first degradation degree of the image caused by relative motion between the object and the first device, and a second degradation degree of the range information caused by relative motion between the object and the second device are estimated. A position and/or orientation of the object is estimated based on the image and/or range information, the shape model, and the first and second degradation degrees. | 11-05-2015 |
20150314454 | APPARATUS AND METHODS FOR PROVIDING A PERSISTENT COMPANION DEVICE - A development platform for developing a skill for a persistent companion device (PCD) includes an asset development library having an application programming interface (API) configured to enable a developer to at least one of find, create, edit and access one or more content assets utilizable for creating a skill, an expression tool suite having one or more APIs via which are received one or more expressions associated with the skill as specified by the developer wherein the skill is executable by the PCD in response to at least one defined input, a behavior editor for specifying one or more behavioral sequences of the PCD for the skill and a skill deployment facility having an API for deploying the skill to an execution engine of the PCD. | 11-05-2015 |
20150321353 | SYSTEM FOR IMAGING AND ORIENTING SEEDS AND METHOD OF USE - A system and method for the automated or semi-automated imaging and orienting of seeds to prepare the seeds for transformation and transgenic engineering. | 11-12-2015 |
20150321354 | PICKING APPARATUS AND PICKING METHOD - A picking apparatus includes: a three-dimensional imaging device configured to three-dimensionally-image workpieces contained in bulk in a container; a robot arm having a hand capable of gripping a workpiece; and a control device configured to control an operation of the robot arm; the control device is configured: to recognize a position and a posture of a target workpiece, based on an imaging result of the three-dimensional imaging device; to obtain a gripping position of the workpiece and an approach vector thereof, based on the recognized position and posture information; to calculate an intersection point between a straight line extending along the approach vector through the gripping position and a plane including an opening of the container; and to judge whether the workpiece can be picked or not, based on a positional relationship between the intersection point and the opening. | 11-12-2015 |
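The feasibility test in the entry above — intersecting the approach line with the plane of the container opening and checking the hit against the opening — can be sketched for an axis-aligned opening. The geometry and all names are my simplification of the abstract:

```python
def pick_feasible(grip_pos, approach, opening_z, opening_rect):
    """Intersect the approach line through the gripping position with
    the plane of the container opening (z = opening_z) and check that
    the intersection lies inside the opening rectangle (x0, y0, x1, y1)."""
    gx, gy, gz = grip_pos
    ax, ay, az = approach
    if az == 0:                      # approach parallel to the opening plane
        return False
    t = (opening_z - gz) / az        # line: p = grip_pos + t * approach
    ix, iy = gx + t * ax, gy + t * ay
    x0, y0, x1, y1 = opening_rect
    return x0 <= ix <= x1 and y0 <= iy <= y1
```

A steeply tilted approach vector can place the intersection outside the opening even when the workpiece itself is well inside the container, which is exactly the collision case the judgment is meant to reject.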
20150336274 | Information Technology Asset Type Identification Using a Mobile Vision-Enabled Robot - Mechanisms are provided for classifying an obstacle as an asset type. The mechanisms receive a digital image of an obstacle from an image capture device of an automated robot. The mechanisms perform a classification operation on the digital image of the obstacle to identify a proposed asset type classification for the obstacle. The mechanisms determine a final asset type for the obstacle based on the proposed asset type classification for the obstacle. The mechanisms update a map data structure for a physical premises in which the obstacle is present based on the final asset type. | 11-26-2015 |
20150343641 | ROBOT, CONTROL METHOD OF ROBOT, AND CONTROL DEVICE OF ROBOT - A robot includes a grasping unit and performs an action based on: first imaging information of the grasping unit which does not grasp an object to be grasped at a first point; second imaging information of the grasping unit which does not grasp the object to be grasped at a second point which is different from the first point; and third imaging information of the object to be grasped which is grasped by the grasping unit at the first point. | 12-03-2015 |
20150343642 | ROBOT, ROBOT SYSTEM, AND CONTROL METHOD - A robot includes: hands; and a control unit which controls the hands, in which the control unit grasps a target to be grasped by using the hands and detects a position of the target to be grasped, based on a captured image including the target to be grasped. | 12-03-2015 |
20150343643 | ROBOT, ROBOT SYSTEM, AND CONTROL METHOD - A robot includes an operation execution unit; and a control unit which controls the operation execution unit, in which the control unit assembles an operation member at an assembly position by the operation execution unit and determines a state of fastening based on a captured image including the assembly position. | 12-03-2015 |
20150352722 | SERVER CONNECTIVITY CONTROL FOR A TELE-PRESENCE ROBOT - A robot system with a robot that has a camera and a remote control station that can connect to the robot. The connection can include a plurality of privileges. The system further includes a server that controls which privileges are provided to the remote control station. The privileges may include the ability to control the robot, join in a multi-cast session and the reception of audio/video from the robot. The privileges can be established and edited through a manager control station. The server may contain a database that defines groups of remote control stations that can be connected to groups of robots. The database can be edited to vary the stations and robots within a group. The system may also allow for connectivity between a remote control station and a robot during a user programmable time window. | 12-10-2015 |
20150367516 | ROBOT ALIGNMENT SYSTEMS AND METHODS OF ALIGNING A ROBOT - Presently disclosed robotic alignment systems and methods may allow for alignment of a platform of a robot with respect to an access port of a part, such as a wing of an aircraft. A robot positioned under the wing may include a base and an upper platform coupled together by a plurality of legs. The upper platform may be moveable with respect to the base in six degrees of freedom in order to be aligned with the access port of the wing so that the robot may insert a tool through the access port without damaging the wing. Disclosed robotic alignment systems may include a calibration plate that is inserted into the access port. A number of positioning devices on the upper platform of the robot may interact with the calibration plate in order to align the upper platform with respect to the calibration plate, and thereby with respect to the access port. | 12-24-2015 |
20150371099 | ROBOTIC DEVICE TESTER - A system, method, and device may include software and hardware which simplify and quicken configuration of the system for testing a device, enhance testing procedures which may be performed, and provide data via which to easily discern a cause and nature of an error which may result during testing. A camera may capture still images of a display screen of a tested device and another camera may capture video images of the tested device and a partner device. A wizard may be used to generate a configuration file based on one previously generated for a similar device. A mount for a tested device may be structured so that: it is suitable for mounting thereon a plurality of differently structured devices; and adjustments in a vertical direction and a horizontal direction in a plane and adjustments of an angle of the device relative to the plane may be easily made. | 12-24-2015 |
20160008984 | ROBOT CONTROL SYSTEM | 01-14-2016 |
20160023352 | SURROGATE: A Body-Dexterous Mobile Manipulation Robot with a Tracked Base - Robotics platforms in accordance with various embodiments of the invention can be utilized to implement highly dexterous robots capable of whole body motion. Robotics platforms in accordance with one embodiment of the invention include: a processor; a memory containing a whole body motion application; a spine, where the spine has seven degrees of freedom and comprises a spine actuator and three spine elbow joints that each include two spine joint actuators; at least one limb, where the at least one limb comprises a limb actuator and three limb elbow joints that each include two limb joint actuators; a tracked base; a connecting structure that connects the at least one limb to the spine; a second connecting structure that connects the spine to the tracked base; wherein the processor is configured by the whole body motion application to move the at least one limb and the spine to perform whole body motion. | 01-28-2016 |
20160031085 | ROBOT SYSTEM THAT OPERATES THROUGH A NETWORK FIREWALL - A remote controlled robot system that includes a robot and a remote control station that communicate through a communication network. Communication with the robot is limited by a firewall coupled to the communication network. A communication server establishes communication between the robot and the remote control station so that the station can send commands to the robot through the firewall. | 02-04-2016 |
20160075031 | ARTICLE PICKUP APPARATUS FOR PICKING UP RANDOMLY PILED ARTICLES - An article pickup apparatus according to the present invention is configured to control a robot or a hand, in accordance with profile control, when the hand holds an article, so that an external force acting on the hand, detected by a force sensor installed between an arm and the hand, is brought closer to a target value of the external force set by a force target value setting unit. | 03-17-2016 |
20160078583 | IMAGE PROCESSING APPARATUS AND ROBOT SYSTEM - An image processing apparatus includes a first connecting unit connected to an image pickup apparatus, a plurality of second connecting units connected to one control apparatus or one other image processing apparatus, the control apparatus being configured to control a robot; and a processing unit configured to process picked-up images picked up by the image pickup apparatus. | 03-17-2016 |
20160082594 | AUTO REVISING SYSTEM FOR AROUND VIEW MONITORING AND METHOD THEREOF - An auto revising system for around view monitoring (AVM) includes: one or more target members provided in a space in which a vehicle is assembled, as a reference for a position of the vehicle; a control unit configured to control an AVM system which is installed in the vehicle; and a revising robot unit configured to interface with the control unit to automatically revise a screen of the AVM system based on the one or more target members. | 03-24-2016 |
20160082595 | Self-Mobile Robot Laser-Guided Travel Operating System and Control Method Therefor - A laser-guided walking operation system for a self-moving robot comprising a self-moving robot ( | 03-24-2016 |
20160096273 | Apparatus and Method for Universal, Flexible Pillow Bag Pattern Creation - An apparatus and method for measuring a dimension of a non-rigid object and using the dimension to pick and place the object. A first input device conveys a non-rigid object into contact with a feed forward unit, which contact causes a displacement of the feed forward unit. The displacement measures a dimension of the object. The measured dimension is transmitted via at least one line of communication. | 04-07-2016 |
20160096274 | Apparatus and Method for Universal, Flexible Pillow Bag Pattern Creation - An apparatus and method for measuring a dimension of a non-rigid object and using the dimension to pick and place the object. A first input device conveys the object into contact with a feed forward unit, which contact conditions the object. A distance sensor is positioned over a gap in the feed forward unit to measure a distance to a surface of the conditioned object. The distance measures a dimension of the conditioned object. The measured dimension is transmitted via at least one line of communication. | 04-07-2016 |
20160103451 | Mobile Robot Area Cleaning - A cleaning robot includes a chassis, a drive system connected to the chassis and configured to drive the robot, a signal generator and sensor carried by the chassis, and a controller in communication with the drive system and the sensor. The signal generator directs a signal toward the floor surface. The sensor is responsive to reflected signals from the floor surface. The controller controls the drive system to alter direction of the robot responsive to a reflected signal indicating an edge of the floor surface. | 04-14-2016 |
20160107316 | TACTILE SENSOR - A visuo-haptic sensor is presented which uses a deformable, passive material that is mounted in view of a camera. When objects interact with the sensor, the deformable material is compressed, causing a change in the shape thereof. The change in shape is detected and evaluated by an image processor that is operatively connected to the camera. The camera may also observe the vicinity of the manipulator to measure ego-motion and motion of close-by objects. The visuo-haptic sensor may be attached to a mobile platform, a robotic manipulator or to any other machine which needs to acquire haptic information about the environment. | 04-21-2016 |
20160114488 | CUSTOMER SERVICE ROBOT AND RELATED SYSTEMS AND METHODS - A robot for providing customer service within a facility includes a locomotion platform, an upper sensor for detecting objects within an upper field of view of the robot, a lower sensor for detecting objects within a lower field of view of the robot, a display and a robot computer in communication with the locomotion platform, the upper sensor and the lower sensor. The robot computer is configured to detect the presence of a customer within the facility based on information received from at least one of the upper sensor and lower sensor, and the robot computer is further configured to access one or more databases storing information associated with products available to customers within the facility and to provide customer service to the customer based on the accessed information. | 04-28-2016 |
20160129597 | DOCKING SYSTEM FOR A TELE-PRESENCE ROBOT - A remote controlled robot system that includes a mobile robot with a robot camera and a battery plug module, and a remote control station that transmits commands to control the mobile robot. The system also includes a battery charging module that mates with the mobile robot battery plug module, and an alignment system that aligns the battery plug module with the battery charging module. The battery modules may also be aligned with the aid of video images of the battery charging module provided to the remote station by a camera located within the battery plug module. | 05-12-2016 |
20160136809 | VISUALLY CONTROLLED END EFFECTOR - A visually controlled end effector is disclosed. The end effector comprises two or more operational members capable of picking up one or more randomly placed items. A crank is capable of actuation by a robot to orient a first one of said two or more operational members to pick up a first one of said one or more randomly placed items. The crank is further capable of actuation by the robot to orient a second one of said two or more operational members to pick up a second one of said one or more randomly placed items. The crank is further capable of orienting the first and second ones of said two or more operational members for placement of the first and second ones of said randomly placed items into a desired oriented condition. A time savings of three robot transfers is thus made over prior art systems that transferred each individual product one by one. | 05-19-2016 |
20160158937 | ROBOT SYSTEM HAVING AUGMENTED REALITY-COMPATIBLE DISPLAY - A robot system using an augmented reality-compatible display, capable of providing information on the status and/or an operation guide of a robot added to an actual image or actual environment, to a user of the robot, so as to improve the efficiency of operations carried out by the user. The robot system includes an actual robot, a controller which controls the actual robot, and an image capturing-displaying device connected to the controller by a wire or by radio. The image capturing-displaying device has a function for capturing an image of a scene including the actual robot and a function for displaying the captured image in real-time. The user can obtain a scene including the actual robot in real-time by directing a camera arranged on the image capturing-displaying device toward the actual robot, and can monitor an augmented reality image. | 06-09-2016 |
20160158938 | METHOD AND DEVICE FOR DEFINING A WORKING RANGE OF A ROBOT - The invention relates to a method for defining a working range (A, A′) in which a robot ( | 06-09-2016 |
20160167227 | ROBOTIC GRASPING OF ITEMS IN INVENTORY SYSTEM | 06-16-2016 |
20160167232 | PLACEMENT DETERMINING METHOD, PLACING METHOD, PLACEMENT DETERMINATION SYSTEM, AND ROBOT | 06-16-2016 |
20160169663 | METHOD AND SENSOR FOR POSITIONING OF A FLEXIBLE ELEMENT | 06-16-2016 |
20160184995 | Image Processing Apparatus, Image Processing System, Image Processing Method, And Computer Program - A movement command to move an end effector to a plurality of predetermined positions is transmitted to a robot controller so as to change a relative position of a target, which becomes an imaging target, with respect to an imaging device. First coordinate values are acquired, each being the position coordinates of the end effector after it has moved in accordance with the movement command, and an image of the target is captured at each movement destination to which the end effector has moved. Second coordinate values, being the position coordinates of the target, are detected based on the image of the target captured at each movement destination, and a conversion rule between both of the coordinates is calculated based on the first coordinate values and the second coordinate values. | 06-30-2016 |
20160184997 | Image Processing Apparatus, Image Processing System, Image Processing Method, And Computer Program - There is provided an image processing apparatus which is capable of controlling a motion of a robot with high accuracy without coding a complex robot motion control program point by point. First coordinate values, each being the position coordinates of a movement destination of an end effector of a robot, are acquired. Second coordinate values, being the position coordinates of a target, are detected based on an image of the target captured at each of the movement destinations. Selections of a plurality of operations which a robot controller is made to execute are accepted out of a plurality of operations including at least an operation of moving the end effector to the first coordinate values or an operation of moving the end effector to the second coordinate values, to accept a setting of an execution sequence of the plurality of operations the selections of which have been accepted. | 06-30-2016 |
20160203799 | DISPLAY CONTROL APPARATUS AND METHOD FOR SAME | 07-14-2016 |
20160250754 | ROBOTIC ARM AND DISPLAY DEVICE USING THE SAME | 09-01-2016 |
20160375585 | ROBOT DEVICE, METHOD OF CONTROLLING THE SAME, COMPUTER PROGRAM, AND ROBOT SYSTEM - Provided is a robot device including an image input unit for inputting an image of surroundings, a target object detection unit for detecting an object from the input image, an object position detection unit for detecting a position of the object, an environment information acquisition unit for acquiring surrounding environment information of the position of the object, an optimum posture acquisition unit for acquiring an optimum posture corresponding to the surrounding environment information for the object, an object posture detection unit for detecting a current posture of the object from the input image, an object posture comparison unit for comparing the current posture of the object to the optimum posture of the object, and an object posture correction unit for correcting the posture of the object when the object posture comparison unit determines that there is a predetermined difference or more between the current posture and the optimum posture. | 12-29-2016 |
20180021956 | ROBOTIC CAMERA CONTROL VIA MOTION CAPTURE | 01-25-2018 |
20190143523 | ROBOTIC SYSTEM ARCHITECTURE AND CONTROL PROCESSES | 05-16-2019 |
20190146504 | CLEANING ROBOT AND CONTROL METHOD THEREFOR | 05-16-2019 |