Patent application title: ULTRAVIOLET END EFFECTOR
Inventors:
Anthony Sean Jules (Hillsborough, CA, US)
Jimmy Sastra (Pacifica, CA, US)
Assignees:
Robust AI, Inc.
IPC8 Class: AG05D100FI
Publication date: 2021-11-11
Patent application number: 20210349462
Abstract:
A cleaning robot may include a chassis, a camera, a control unit, a
mobility apparatus, and a movable lighting element. The control unit may
execute computer programming instructions in a manner depending on
information received from the camera. The mobility apparatus may be
capable of causing the cleaning robot to move through a physical space.
The movable lighting element may include one or more light sources
selectively emitting ultraviolet light.
Claims:
1. A cleaning robot comprising: a chassis; a camera coupled with the
chassis; a control unit coupled with the chassis, the control unit
including a processor and a memory module, the processor configured to
execute computer programming instructions stored on the memory module,
the execution of the computer programming instructions depending on
information received from the camera; a mobility apparatus physically
coupled with the chassis, the mobility apparatus capable of causing the
cleaning robot to move through a physical space based on a first
instruction received from the control unit; and a movable lighting
element coupled with the chassis, the lighting element including one or
more light sources selectively emitting ultraviolet light based on a
second instruction received from the control unit.
2. The cleaning robot recited in claim 1, wherein the ultraviolet light is associated with a field of exposure, and wherein moving the lighting element comprises changing a direction of the field of exposure.
3. The cleaning robot recited in claim 1, wherein the one or more light sources includes a retractable array of light-emitting diodes.
4. The cleaning robot recited in claim 3, wherein the retractable array of light-emitting diodes includes a first portion arranged in a substantially horizontal orientation.
5. The cleaning robot recited in claim 4, wherein the retractable array of light-emitting diodes includes a second portion arranged in a substantially vertical orientation.
6. The cleaning robot recited in claim 5, wherein the cleaning robot further comprises a second roller, the second roller serving as a transition point between the first portion and the second portion.
7. The cleaning robot recited in claim 4, wherein the retractable array of light-emitting diodes includes a spool of tape, the light-emitting diodes being arranged on the spool of tape, the spool of tape being wrapped around a first roller when in a retracted position.
8. The cleaning robot recited in claim 1, wherein the control unit is operable to cause the cleaning robot to autonomously navigate and clean an area of physical space by exposing one or more surfaces within the physical space to the ultraviolet light.
9. The cleaning robot recited in claim 8, wherein the lighting element is coupled with a deformable skirt, and wherein exposing the one or more surfaces to the ultraviolet light involves bringing the deformable skirt in contact with the surface.
10. The cleaning robot recited in claim 9, wherein the deformable skirt is composed of material that absorbs ultraviolet light.
11. The cleaning robot recited in claim 8, wherein the movable lighting element includes a deformable surface, and wherein the one or more light sources include a plurality of light sources mounted on the deformable surface, and wherein exposing the one or more surfaces to the ultraviolet light involves deforming the deformable surface.
12. The cleaning robot recited in claim 11, wherein the deformable surface is deformed by a mechanism selected from the group consisting of: a magnet, a mechanical actuator, and a suction device.
13. The cleaning robot recited in claim 1, wherein the lighting element includes a Fresnel lens that reflects ultraviolet light.
14. The cleaning robot recited in claim 13, wherein the Fresnel lens is deformable, deformation of the Fresnel lens affecting an intensity pattern of the ultraviolet light emitted by the lighting element.
15. The cleaning robot recited in claim 13, wherein the Fresnel lens is asymmetric, one or more of the light sources being located off of a central axis of the Fresnel lens.
16. The cleaning robot recited in claim 1, wherein the movable lighting element includes a first surface and a second surface, each of the first and second surfaces including a respective one or more of the light sources, the first and second surfaces being connected by a hinge.
17. A method of calibrating a lighting element, the method comprising: positioning a lighting element and a rear projection screen in a plurality of configurations, the lighting element including one or more light sources configured to emit ultraviolet light, each of the configurations placing the lighting element at a respective location in three-dimensional space with respect to the rear projection screen; activating the lighting element at each of the positions; detecting a plurality of patterns of ultraviolet light projected onto the rear projection screen, each of the patterns being detected while the lighting element is positioned at a respective one of the locations; and determining a three-dimensional texture that includes a plurality of intensity values for the lighting element, each of the intensity values indicating a respective intensity level of the ultraviolet light emitted by the lighting element, each of the intensity levels corresponding with a respective position in three-dimensional space with respect to the lighting element.
18. The method recited in claim 17, wherein positioning the lighting element involves moving a robotic arm, the lighting element being mounted on the robotic arm.
19. The method recited in claim 17, wherein positioning the lighting element involves moving the rear projection screen.
20. The method recited in claim 17, wherein positioning the lighting element involves reconfiguring the lighting element, the lighting element capable of being arranged in a plurality of configurations.
Description:
CROSS-REFERENCE TO RELATED APPLICATION
[0001] The present application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Application No. 63/022,348 (Attorney Docket No. RBAIP001P) by Brooks et al., titled "A CLEANING ROBOT", filed May 8, 2020, and to U.S. Provisional Application No. 63/022,349 (Attorney Docket No. RBAIP002P) by Brooks et al., titled "ROBOTIC SOCIAL INTERACTION", filed May 8, 2020, both of which are hereby incorporated by reference in their entirety and for all purposes.
COPYRIGHT NOTICE
[0002] A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the United States Patent and Trademark Office patent file or records but otherwise reserves all copyright rights whatsoever.
TECHNICAL FIELD
[0003] The present disclosure relates generally to robotics, and more specifically to robotic cleaning solutions.
DESCRIPTION OF RELATED ART
[0004] Conventional approaches to cleaning often involve manual activity by humans, but such approaches have numerous drawbacks. For example, humans can clean surfaces using chemical means. However, manual cleaning with chemicals can expose humans to potentially dangerous chemicals and pathogens. Further, manual cleaning with chemicals often results in incomplete cleaning that falls far short of standards for sterilization or even disinfection. As another example, humans can clean surfaces by activating a handheld UV light source. However, such an approach risks exposing humans to excessive UV energy, risks incomplete cleaning, and requires significant manual activity.
[0005] Other conventional approaches to cleaning involve automated, powerful UV light sources. However, such an approach often involves moving a powerful light source into an area and then evacuating the area while the cleaning is completed. Such approaches are typically restricted to optically-sealed rooms or shelters from which all people and animals must leave while cleaning occurs. The risk to humans is significant since a safe dosage would be exceeded at 1 meter distance in a matter of seconds. In addition, such approaches require massive energy consumption. Such powerful UV light sources also can degrade plastics, equipment, and other materials present in the room. Some powerful UV lamps emit ozone, a greenhouse gas. Alternatively, or additionally, some powerful UV lamps contain mercury (Hg), which both creates sanitary risks and imposes an environmental cost.
OVERVIEW
[0006] According to various embodiments, techniques and mechanisms described herein provide for systems, devices, methods, and machine readable media for robotic cleaning solutions. A cleaning robot may include a chassis, a camera coupled with the chassis, a control unit coupled with the chassis, a mobility apparatus physically coupled with the chassis, and a movable lighting element coupled with the chassis. The control unit may include a processor and a memory module. The processor may be configured to execute computer programming instructions stored on the memory module. The execution of the computer programming instructions may depend on information received from the camera. The mobility apparatus may be capable of causing the cleaning robot to move through a physical space based on a first instruction received from the control unit. The lighting element may include one or more light sources selectively emitting ultraviolet light based on a second instruction received from the control unit.
[0007] According to various embodiments, the ultraviolet light may be associated with a field of exposure, and moving the lighting element may involve changing a direction of the field of exposure.
[0008] In some implementations, the one or more light sources may include a retractable array of light-emitting diodes. The retractable array of light-emitting diodes may include a first portion arranged in a substantially horizontal orientation. The retractable array of light-emitting diodes may include a second portion arranged in a substantially vertical orientation.
[0009] The cleaning robot may further include a second roller serving as a transition point between the first portion and the second portion. The retractable array of light-emitting diodes may include a spool of tape, with the light-emitting diodes arranged on the spool of tape. The spool of tape may be wrapped around a first roller when in a retracted position.
[0010] In some embodiments, the control unit may be operable to cause the cleaning robot to autonomously navigate and clean an area of physical space by exposing one or more surfaces within the physical space to the ultraviolet light.
[0011] According to various embodiments, the lighting element may be coupled with a deformable skirt, and exposing the one or more surfaces to the ultraviolet light may involve bringing the deformable skirt in contact with the surface. The deformable skirt may be composed of material that absorbs ultraviolet light.
[0012] According to various embodiments, the movable lighting element may include a deformable surface. The one or more light sources may include a plurality of light sources mounted on the deformable surface. Exposing the one or more surfaces to the ultraviolet light may involve deforming the deformable surface. The deformable surface may be deformed by a mechanism such as a magnet, a mechanical actuator, or a suction device.
[0013] In some embodiments, the lighting element may include a Fresnel lens that reflects ultraviolet light. The Fresnel lens may be deformable, where deformation of the Fresnel lens affects an intensity pattern of the ultraviolet light emitted by the lighting element. The Fresnel lens may be asymmetric, with one or more of the light sources located off of a central axis of the Fresnel lens.
[0014] In some embodiments, the movable lighting element may include a first surface and a second surface that each may include a respective one or more of the light sources. The first and second surfaces may be connected by a hinge.
[0015] In some embodiments, a method may involve positioning a lighting element and a rear projection screen in a plurality of configurations. The lighting element may include one or more light sources configured to emit ultraviolet light. Each of the configurations may place the lighting element at a respective location in three-dimensional space with respect to the rear projection screen. The lighting element may be activated at each of the positions. A plurality of patterns of ultraviolet light projected onto the rear projection screen may be detected. Each of the patterns may be detected while the lighting element is positioned at a respective one of the locations. A three-dimensional texture that includes a plurality of intensity values for the lighting element may be determined. Each of the intensity values may indicate a respective intensity level of the ultraviolet light emitted by the lighting element. Each of the intensity levels may correspond with a respective position in three-dimensional space with respect to the lighting element.
[0016] In some embodiments, positioning the lighting element may involve moving a robotic arm, where the lighting element is mounted on the robotic arm. Alternatively, or additionally, positioning the lighting element may involve moving the rear projection screen. Positioning the lighting element may also involve reconfiguring the lighting element, which may be capable of being arranged in a plurality of configurations.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The included drawings are for illustrative purposes and serve only to provide examples of possible structures and operations for the disclosed inventive systems, apparatus, methods and computer program products for robotic cleaning solutions. These drawings in no way limit any changes in form and detail that may be made by one skilled in the art without departing from the spirit and scope of the disclosed implementations.
[0018] FIG. 1 illustrates a diagram of a cleaning robot, configured in accordance with one or more embodiments.
[0019] FIG. 2A and FIG. 2B illustrate diagrams of a UV end effector, configured in accordance with one or more embodiments.
[0020] FIG. 3A, FIG. 3B, and FIG. 3C illustrate diagrams of a UV end effector, configured in accordance with one or more embodiments.
[0021] FIG. 4A, FIG. 4B, FIG. 4C, and FIG. 4D illustrate diagrams of UV end effectors, configured in accordance with one or more embodiments.
[0022] FIG. 5A and FIG. 5B illustrate diagrams of UV end effectors, configured in accordance with one or more embodiments.
[0023] FIG. 6 illustrates a UV end effector, configured in accordance with one or more embodiments.
[0024] FIG. 7 illustrates a UV end effector calibration system, configured in accordance with one or more embodiments.
[0025] FIG. 8A, FIG. 8B, and FIG. 8C illustrate examples of two-dimensional UV light patterns as detected by a UV end effector calibration system, generated in accordance with one or more embodiments.
[0026] FIG. 9 illustrates a light source calibration method, performed in accordance with one or more embodiments.
[0027] FIG. 10 illustrates an architecture diagram for a cleaning robot, configured in accordance with one or more embodiments.
[0028] FIG. 11 illustrates a method for cleaning an area, performed in accordance with one or more embodiments.
[0029] FIG. 12 illustrates one example of a computing device, configured in accordance with one or more embodiments.
DETAILED DESCRIPTION
[0030] Techniques and mechanisms described herein are directed to ultraviolet (UV) end effectors that may be used in conjunction with a robotic cleaning solution. A robot may navigate an area for the purpose of cleaning some or all of the area. The area may be otherwise occupied by people who are present for purposes associated with the location. Accordingly, the robot may engage in social accommodation, in which it attempts to accomplish its cleaning task while taking into account the presence, goals, and trajectories of the people.
[0031] According to various embodiments, the robot may be equipped with one or more cleaning tools, such as one or more ultraviolet (UV) power sources. The term "UV light" generally refers to electromagnetic radiation in the 10 nm-400 nm wavelength range. Cleaning applications of UV radiation, sometimes referred to as Ultra-Violet Germicidal Irradiation (UVGI), can apply illumination in the UVC range of wavelengths, between approximately 100 nm-280 nm, corresponding to the range of maximum response by most targeted pathogens.
[0032] In some implementations, one or more UV light sources may be configured with one or more additional components to form a UV end effector. These additional components may serve to direct, focus, reflect, deform, refract, or otherwise manipulate one or more UV light source and/or the UV light emitted by one or more UV light sources.
[0033] Depending on the configuration, a cleaning tool such as a UV end effector may be oriented in one, two, three, four, five, six, or any suitable number of dimensions, independent of the movement of the robot itself. For example, a robot may position a UV end effector close to a cleaning target and then activate the UV light source. The robot may then move the UV end effector through a trajectory, either by moving the end effector, moving the robot, or some combination thereof.
[0034] According to various embodiments, the robot may be equipped to accommodate the presence of people within the area of cleaning. For example, the robot may monitor people nearby and estimate levels of UV radiation the people may be receiving. If the robot determines that radiation may bring the total dose to any human above a safety threshold, then the robot may avoid activating the UV power source or may deactivate the UV power source if it has already been activated. As another example, the robot may provide social cues to people as to the robot's actions. For instance, the robot may indicate what it is doing and/or how long the cleaning action will continue. As yet another example, the robot may interrupt a cleaning process and/or move to a different location when it determines that it should defer to the activity of humans, animals, or other robots.
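By way of illustration, such a safeguard may be sketched as per-person dose accounting. The following Python sketch assumes an idealized inverse-square point source; the DoseMonitor class, its inputs, and the threshold constant are hypothetical rather than part of any specific embodiment.

```python
import math

# Placeholder threshold: applicable exposure limits depend on wavelength,
# geometry, and jurisdiction, so this constant is illustrative only.
SAFETY_THRESHOLD_J_PER_M2 = 60.0

class DoseMonitor:
    """Tracks an estimated cumulative UV dose for each nearby person."""

    def __init__(self, source_power_w):
        self.source_power_w = source_power_w
        self.doses = {}  # person_id -> accumulated dose (J/m^2)

    def intensity_at(self, distance_m):
        # Idealized point source: power spread over a sphere (inverse square).
        return self.source_power_w / (4.0 * math.pi * distance_m ** 2)

    def update(self, person_distances, dt_s):
        """Accumulate dose over one time step; return False if the UV
        source should be deactivated (or kept off) for safety."""
        for person_id, distance_m in person_distances.items():
            dose = self.doses.get(person_id, 0.0)
            dose += self.intensity_at(distance_m) * dt_s
            self.doses[person_id] = dose
        return all(d < SAFETY_THRESHOLD_J_PER_M2 for d in self.doses.values())

# Example: a 1 W source, two tracked people, 0.1 s control loop.
monitor = DoseMonitor(source_power_w=1.0)
safe = monitor.update({"person_a": 2.0, "person_b": 3.5}, dt_s=0.1)
```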
[0035] In some implementations, the robot may be guided in its cleaning activity based on communication with a remote computing device such as a control computer having access to a database system. Alternatively, or additionally, the robot may report its actions to such a system.
[0036] In some implementations, the robot may coordinate with other robots. The other robots may be configured to perform complementary cleaning activities or may be focused on other tasks. Each robot may be directed by a central command and control apparatus. Alternatively, or additionally, the robots may communicate with each other directly.
[0037] In some implementations, the robot may communicate with nearby people. For example, the robot may receive cleaning instructions from a nearby person. As another example, the robot may receive instructions about social accommodation from a nearby person. The robot may be configured to verify the authority of the person to issue such instructions. For instance, the robot may be configured to ascertain the person's identity and/or role through any of various authentication mechanisms.
[0038] In some embodiments, the term "cleaning" may encompass any or all of a variety of concepts. At the lowest level, "sanitation" refers to reducing the density of pathogenic microbes on a surface. The term "disinfection" refers to a strong sanitization in which specific pathogens are almost totally removed. "Decontamination" refers to the removal of specific dangerous pathogens. At the highest level, "sterilization" refers to the removal of all pathogens from a surface. The term "power" refers to energy over time and may be measured in Watts (i.e., Joules/second). The term "intensity" refers to the amount of power distributed on a surface and may be measured in Watts per area of surface. The term "dose" refers to intensity received over time and may be measured in Joules per area of surface, or equivalently Watts multiplied by time and divided by surface area.
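To make these units concrete, consider a short illustrative calculation (the numbers are arbitrary):

```python
# Illustrative numbers only: a 10 W UV source spreading its power
# uniformly over a 0.5 m^2 surface for 30 seconds.
power_w = 10.0     # power: energy per unit time (W = J/s)
area_m2 = 0.5      # irradiated surface area
time_s = 30.0      # exposure time

intensity_w_per_m2 = power_w / area_m2       # 20.0 W/m^2
dose_j_per_m2 = intensity_w_per_m2 * time_s  # 600.0 J/m^2
```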
[0039] In contrast to conventional approaches, techniques and mechanisms described herein provide for safe, effective, comprehensive, and automated cleaning. For example, techniques and mechanisms described herein may facilitate the automated cleaning of a hospital, laboratory, restaurant, retail establishment, industrial facility, school, or other environment.
[0040] In some implementations, one or more cleaning robots may navigate a physical environment to clean surfaces and objects within the environment. For example, each cleaning robot may be equipped with one or more relatively low-power, directional UV light sources that the robot brings into close proximity with a surface to be cleaned. Because the UV light source is relatively low-powered and/or directional, human exposure can be limited or prevented. For instance, the robot can monitor the environment to avoid both inconveniencing humans and exposing humans to excessive UV energy. The robot can then automatically navigate to a charging or docking station when its tasks have been completed.
[0041] In particular embodiments, techniques and mechanisms described herein may facilitate the cleaning of an environment such as a hospital. The complexity, number of pathogens, and high occupancy of hospital environments make the cleaning of such environments both extremely important and difficult via conventional means. However, using techniques and mechanisms described herein, one or more robots may continuously or periodically traverse a hospital environment to clean objects and surfaces. In particular, a robot using a targeted, low-powered UV light source may unobtrusively and safely navigate the environment to opportunistically clean surfaces and objects when such cleaning can be performed without inconveniencing humans or exposing humans to excessive UV radiation.
[0042] In particular embodiments, a cleaning robot can also perform tasks in a socially aware way, for instance by recognizing individuals based on their roles as doctors, nurses, patients, administrators, maintenance workers, and/or members of the public, and then treating individuals differently based on those roles. For example, the cleaning robot may place a very high priority on avoiding doctors and nurses, who may be in a hurry to provide medical services. As another example, the cleaning robot may be configured to respond to instructions from maintenance workers and administrators. However, the cleaning robot may be less accommodating of other individuals, such as members of the general public.
[0043] FIG. 1 illustrates a diagram of a cleaning robot 100, configured in accordance with one or more embodiments. The cleaning robot 100 includes a chassis 102, a mobility apparatus 104, an extension rod 106, a camera 108, an arm 110, and a UV light tape 122. The UV light tape 122 includes one or more UV light elements such as the UV light element 116. The UV light tape 122 may be retracted onto a reel 112 and is connected with the arm 110 via the pulley 114. Each of the UV light sources is configured to selectively emit UV light in a pattern such as the pattern 118.
[0044] According to various embodiments, the arm 110 may be configured in any of a variety of possible arrangements. For example, the arm 110 may be arranged in a fixed position. As another example, the arm 110 may be movable in one or more dimensions. For instance, the arm 110 may be retractable.
[0045] In particular embodiments, the UV light tape 122 may be connected with the arm 110, for instance at the connection point 120. In such a configuration, extending the arm 110 may cause the UV light tape 122 to extend, while retracting the arm 110 may cause the UV light tape 122 to retract onto the reel 112.
[0046] According to various embodiments, the camera 108 may be configured to capture visible light data. Alternatively, or additionally, the cleaning robot 100 may be equipped with one or more of a variety of sensors. Such sensors may include, but are not limited to: visual light cameras, infrared cameras, microphones, Lidar devices, Radar devices, chemical detection devices, near field communication devices, and accelerometers.
[0047] In FIG. 1, the arm 110 and the camera 108 are connected with the chassis 102 via the extension rod 106. However, a variety of configurations are possible. For instance, in some implementations the cleaning robot 100 may include a housing into which the arm 110 may retract.
[0048] In particular embodiments, the arm 110 may be moved at least in part by moving its base 124. For example, the base 124 may be rotated about the extension rod 106. As another example, the base 124 may be moved up and down the extension rod 106.
[0049] According to various embodiments, a cleaning robot 100 may include one or more elements instead of, or in addition to, the elements shown in FIG. 1. Additional detail regarding elements that may be included in a cleaning robot are discussed with respect to FIG. 10.
[0050] FIG. 2A and FIG. 2B illustrate diagrams of a UV end effector 202, configured in accordance with one or more embodiments. The UV end effector 202 is configured as two UV-emitting panels 212 and 214 connected together by a hinge 208. Each UV-emitting panel may be configured as an array of UV point light sources. The UV end effector 202 is shown in a fully open position in FIG. 2A and in a partially closed position in FIG. 2B.
[0051] Combining the UV-emitting panels 212 and 214 produces a superposition of intensity at a given position in space. In FIG. 2A and FIG. 2B, below the UV end effector 202 is a plot of UV intensity 204 along a dimension 206 perpendicular to the hinge 208. In FIG. 2A, the intensity is higher toward the middle of the hinge since that area is closest to the largest portion of the UV end effector 202 and therefore receives more UV exposure. In FIG. 2B, the intensity is lower toward the middle of the hinge since that portion of the surface is further away from the surface of the UV end effector than portions of the surface closer to the ends of the panels 214 and 212.
[0052] According to various embodiments, the configuration of the UV-emitting surfaces 212 and 214 may be adjusted in real time, for instance by placing actuators between the panels. Although FIGS. 2A and 2B show only two UV-emitting surfaces, a variety of types, numbers, and configurations of UV light sources are possible.
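The superposition described above can be approximated numerically. The following sketch models each panel as a row of unit-power point sources in a two-dimensional cross-section and sums inverse-square contributions along the dimension 206; the function names and geometry parameters are illustrative assumptions.

```python
import math

def panel_sources(n, length, angle_rad, sign):
    """Unit point sources along one panel of the hinged pair.

    The hinge sits at the origin; sign selects the panel direction and
    angle_rad folds the panel tips down toward the target surface
    (0 = fully open, i.e. both panels coplanar and parallel to it).
    """
    sources = []
    for i in range(n):
        s = (i + 0.5) * length / n           # distance along the panel
        x = sign * s * math.cos(angle_rad)   # offset from the hinge
        drop = s * math.sin(angle_rad)       # tip dips toward the surface
        sources.append((x, drop))
    return sources

def intensity_profile(angle_rad, hinge_height=0.2, panel_len=0.15, n=20):
    """Superposed inverse-square intensity on a line under the hinge,
    along the dimension perpendicular to the hinge (cf. plot 204)."""
    sources = panel_sources(n, panel_len, angle_rad, +1) + \
              panel_sources(n, panel_len, angle_rad, -1)
    xs = [i / 100.0 - 0.3 for i in range(61)]   # -0.3 m .. +0.3 m
    profile = []
    for x in xs:
        total = 0.0
        for sx, drop in sources:
            r2 = (x - sx) ** 2 + (hinge_height - drop) ** 2
            total += 1.0 / r2                   # unit-power sources
        profile.append(total)
    return xs, profile

xs, fully_open = intensity_profile(0.0)             # cf. FIG. 2A
xs, partly_closed = intensity_profile(math.pi / 6)  # cf. FIG. 2B
```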
[0053] FIG. 3A, FIG. 3B, and FIG. 3C illustrate diagrams of a UV end effector 302, configured in accordance with one or more embodiments. The UV end effector 302 is configured as a UV light source arranged on a deformable surface. The intensity of UV light on a surface exposed to the UV end effector 302 may be affected by deforming the surface.
[0054] According to various embodiments, the UV end effector 302 may be created by arranging a series of UV light sources on a deformable mesh or netting. In some implementations, the deformable surface UV end effector 302 may be deformed by a magnet such as the magnet 304. The magnet 304 may be configured as a permanent magnet or an induced magnet. For instance, the magnet 304 may be placed at a particular location on a coordinate plane above the deformable surface, which may cause the deformable surface to deform in a particular way. The magnet is shown at a first location in FIG. 3B and at a second location in FIG. 3C, with the surface deformed in different ways depending on the location of the magnet. Under such forces, the plane shown in FIG. 3A may be deformed into differently skewed pseudo-parabolic shapes from which the array of light sources projects.
[0055] In some embodiments, the deformable surface UV end effector 302 may be deformed by a mechanism other than a magnet. For example, the surface may be deformed by one or more mechanical actuators that press or pull the surface. As another example, the surface may be deformed by tension applied to the edges and/or corners of the surface. As yet another example, the surface may be deformed by suction applied to the back of the surface.
[0056] In particular embodiments, a UV end effector with a deformable or otherwise adjustable surface may be adjusted in order to clean different types of surfaces. For example, when cleaning a flat surface, the adjustable surface may itself be kept flat. As another example, when cleaning an object such as a phone that is smaller than the adjustable surface, or a round surface such as a doorknob, the adjustable surface may be pulled into a concave shape. As yet another example, when cleaning the interior of an object such as a lamp shade, the adjustable surface may be pushed into a convex shape to emit UV light throughout the interior.
[0057] FIG. 4A, FIG. 4B, FIG. 4C, and FIG. 4D illustrate diagrams of UV end effectors 400, 420, 440, and 460, configured in accordance with one or more embodiments. Each of the UV end effectors 400, 420, 440, and 460 is equipped with a Fresnel lens. According to various embodiments, the Fresnel lens may be used to reflect and focus UV light emitted by a light source toward a target.
[0058] The UV end effector 400 includes an asymmetric Fresnel lens 402, a UV light source 404, and a housing 406. The housing 406 provides an attachment point for the UV light source 404, and may further aid in directing the UV light source toward a target. According to various embodiments, the Fresnel lens 402 may be asymmetric in order to allow the light source 404 to be positioned off of the central axis of the Fresnel lens 402.
[0059] According to various embodiments, one or more of a variety of techniques may be used to construct the Fresnel lens. For example, the Fresnel lens may be constructed of polycarbonate and then coated in aluminum or any other UV-reflective coating. As another example, the Fresnel lens may be machined from aluminum or any other UV-reflective material. As yet another example, the Fresnel lens may be constructed of rubber or another flexible material and then coated with aluminum or any other UV-reflective coating.
[0060] The UV end effector 420 includes a symmetric Fresnel lens 408, a UV light source 410, an extension support 412, and a housing 428. The housing 428 provides an attachment point for the extension support 412, which may be used to support the UV light source 410 over or near the central axis of the Fresnel lens 408, and may further aid in directing the UV light toward a target.
[0061] In particular embodiments, an extension support may be adjustable. For instance, an adjustable extension support may be combined with a deformable Fresnel lens to produce light in a variety of intensity patterns. Such intensity patterns may be determined by empirical analysis, such as via the calibration method 900 discussed with respect to FIG. 9.
[0062] The UV end effector 440 includes a symmetric deformable Fresnel lens 414, a UV light source 416, an extension support 430, and a housing 432. According to various embodiments, the deformable Fresnel lens 414 may be deformed primarily in the axial direction. The housing 432 provides an attachment point for the extension support 430, which may be used to support the UV light source 416 over or near the central axis of the Fresnel lens 414, and may further aid in directing the UV light toward a target.
[0063] The UV end effector 440 also includes a deformation mechanism 418. According to various embodiments, one or more of a variety of different deformation mechanisms may be used. For example, a mechanical actuator may push or pull the Fresnel lens into different shapes. Other mechanisms that may be used may include, but are not limited to: one or more magnets, suction devices, or tension devices.
[0064] In some implementations, the Fresnel lens may be deformed in order to affect its field of view, focal length, and/or intensity pattern. For example, deforming the Fresnel lens in a direction along its axis may affect its focal length. As another example, deforming the Fresnel lens in a different direction may affect its intensity pattern.
[0065] The UV end effector 460 includes an asymmetric deformable Fresnel lens 424, a UV light source 422, and a housing 434. The housing 434 provides an attachment point for the UV light source 422, and may further aid in directing the UV light toward a target. According to various embodiments, the Fresnel lens 424 may be asymmetric in order to allow the light source 422 to be positioned off of the central axis of the Fresnel lens 424.
[0066] The UV end effector 460 also includes a deformation mechanism 426. According to various embodiments, one or more of a variety of different deformation mechanisms may be used. For example, a mechanical actuator may push or pull the Fresnel lens into different shapes. Other mechanisms that may be used may include, but are not limited to: one or more magnets, suction devices, or tension devices.
[0067] As discussed throughout this application, one or more of a variety of different UV light sources may be employed for the light sources 404, 410, 416, and 422. For instance, such light sources may be configured as a point, a rod, or an array of individual light sources.
[0068] According to various embodiments, the intensity pattern of a UV end effector having a fixed or deformable Fresnel lens, or indeed the intensity pattern of any other UV end effector, may be empirically determined via calibration. For instance, the method 900 shown in FIG. 9 may be used to calibrate a UV end effector.
[0069] FIG. 5A and FIG. 5B illustrate diagrams of UV end effectors 502 and 506, configured in accordance with one or more embodiments. The UV end effector 502 includes bristles or flaps 504 and a light source. According to various embodiments, the bristles or flaps 504 are composed of a material that absorbs UV light. The UV end effector 502 can then be brought close to a surface to clean it, while the bristles or flaps 504 block the surrounding area from UV exposure.
[0070] The UV end effector 506 includes a deformable skirt 508, bristles or flaps 510, and a light source. As with the UV end effector 502, the UV end effector 506 may be brought close to a surface to clean it.
[0071] In particular embodiments, a surface may be cleaned by bringing the UV end effector 502 or 506 in contact with the surface. The bristles and/or deformable skirt may then form a type of seal between the UV end effector and the surface. Alternatively, the UV end effector may be brought close to a surface without touching it.
[0072] According to various embodiments, the UV end effector 502 or 506 may be used to clean surfaces in sensitive areas, such as areas close to humans, animals, or objects that should not be exposed to UV radiation. In addition, by limiting the amount of UV light escaping, the intensity of light directed at the target may be effectively increased.
[0073] According to various embodiments, FIGS. 5A and 5B are only examples of many possible configurations. For example, bristles or flaps may partially or entirely absorb UV light. As another example, bristles or flaps may include a reflective layer close to the light source to reflect light back at the target, surrounded by an absorbent layer to absorb any light that passes the reflective layer. As yet another example, the shape, size, and material of the skirt may be strategically determined, for instance based on the environment and application. For instance, a very flexible skirt with many bristles may allow the skirt to conform to round surfaces as the UV light source is moved over the target.
[0074] In particular embodiments, the bristles or flaps may be composed of one or more of a variety of different elements. For example, bristles or flaps may be composed of any flexible material that absorbs UV light or that is coated with a material that absorbs UV light. For instance, bristles or flaps may be composed of fiberoptic cables, stretched polyethylene terephthalate, or other such materials. For instance, fiberoptic cables may be configured to emit light in order to indicate an area being cleaned.
[0075] FIG. 6 illustrates a UV end effector 602, configured in accordance with one or more embodiments. The UV end effector 602 includes a UV light source 604 and a galvanometer 606. The galvanometer 606 may direct the UV light emitted by the UV end effector 602.
[0076] According to various embodiments, the galvanometer 606 may be configured to direct the light in any of various ways. For example, the galvanometer 606 may direct the light onto a specific portion of a surface. As another example, the galvanometer 606 may be configured to rasterize the UV light over a surface. As yet another example, the galvanometer 606 may be configured to direct the light so as to produce a particular pattern of intensity onto a surface.
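As one illustration of the rasterizing behavior, the sketch below generates a back-and-forth sequence of mirror angles covering a rectangular target; the angle convention and function name are assumptions, not a specification of the galvanometer 606.

```python
import math

def raster_angles(width_m, height_m, standoff_m, step_m):
    """Yield (pan, tilt) mirror angles that rasterize a rectangle.

    The rectangle is centered on the optical axis at distance standoff_m;
    angles follow a back-and-forth (boustrophedon) scan pattern.
    """
    nx = int(width_m / step_m) + 1
    ny = int(height_m / step_m) + 1
    xs = [-width_m / 2 + i * step_m for i in range(nx)]
    for j in range(ny):
        y = -height_m / 2 + j * step_m
        row = xs if j % 2 == 0 else list(reversed(xs))
        for x in row:
            yield math.atan2(x, standoff_m), math.atan2(y, standoff_m)

# Rasterize a 20 cm x 10 cm area from 30 cm away in 1 cm steps.
for pan, tilt in raster_angles(0.20, 0.10, 0.30, 0.01):
    pass  # here the galvanometer would be commanded to (pan, tilt)
```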
[0077] In particular embodiments, a galvanometer-based approach may be used in conjunction with one or more other configurations described herein. In this way, the system may be configured to steer light either coarsely or very precisely, for example for small or hard-to-reach areas that cannot be targeted precisely by a light source with a wider field of view.
[0078] FIG. 7 illustrates a UV end effector calibration system 700, configured in accordance with one or more embodiments. The UV end effector calibration system 700 includes a robotic arm 702 upon which a UV end effector 704 can be positioned. The robotic arm 702 may be moved through space to position the UV end effector 704 at a designated position.
[0079] The UV end effector 704 may then be activated to project UV light onto a screen 706. A sensor 708 may then detect the UV light pattern projected onto the screen 706. The pattern may be recorded to build a comprehensive model of the UV light emitted by the UV end effector 704.
[0080] According to various embodiments, the robotic arm 702 may be configured to position the UV end effector 704 with up to six degrees of freedom. For example, the robotic arm 702 may be configured to position the UV end effector at any of a variety of locations within a three-dimensional space. As another example, the robotic arm 702 may be configured to orient the UV end effector 704 by altering its roll, pitch, and/or yaw.
[0081] In some implementations, the UV end effector calibration system 700 may be configured such that the robotic arm 702 is located at a known position relative to the screen 706. The robotic arm may then position the UV end effector 704 at a known position in three-dimensional space. Accordingly, the pattern of UV light emitted by the UV end effector may then indicate the intensity of the UV light emitted by the UV end effector 704 at a known distance and orientation in three-dimensional space.
[0082] Although the UV end effector calibration system 700 is shown as having the UV end effector 704 arranged in a mobile configuration and the screen 706 arranged in a fixed configuration, other arrangements are possible. For example, the UV end effector 704 may be fixed, while the screen 706 may be moved to known points in three-dimensional space around the UV end effector 704. As discussed above, such movement may be performed in up to 6 degrees of freedom, since the screen 706 may be positioned and oriented in each of up to three dimensions.
[0083] In particular embodiments, the UV end effector calibration system 700 may be used to determine a model of UV light emitted by any of a variety of suitable UV light sources. For example, the UV light source may be any UV light source discussed herein, including one or more point, rod, or surface light sources. As another example, the UV light source may be a more complex configuration, such as one that involves a Fresnel lens, hinge, or other such component.
[0084] FIG. 8A, FIG. 8B, and FIG. 8C illustrate examples of two-dimensional UV light patterns as detected by a UV end effector calibration system such as the system 700 shown in FIG. 7, generated in accordance with one or more embodiments.
[0085] FIG. 9 illustrates a method 900 for calibrating a light source, performed in accordance with one or more embodiments. The method 900 may be performed in order to determine a three-dimensional model of intensity for light emitted by a UV light source or any other light source. For instance, the method 900 may be used to determine a three-dimensional model of intensity for light emitted by one or more of the end effectors discussed herein.
[0086] In some implementations, the method 900 may be performed in order to employ a light source in a cleaning solution. For instance, the three-dimensional model of intensity may be used to determine a trajectory through space for moving a UV light source in order to clean a surface.
[0087] A request to calibrate a light source is received at 902. In some implementations, the request may be received at a light source calibration system such as the system 700 shown in FIG. 7. The light source may be arranged within a UV end effector.
[0088] The light source is positioned with respect to a projection screen at 904. According to various embodiments, the light source may be positioned by moving a robotic arm on which the light source is fixed. Alternatively, the projection screen may be positioned relative to a fixed light source. In still another configuration, both the light source and the projection screen may be moved in space.
[0089] The light source is activated at 906. When activated, the light source emits UV light onto the screen.
[0090] A pattern of light projected through or onto the screen is captured at 908. According to various embodiments, the pattern may be captured by any suitable sensor device. For instance, a sensor device may be configured to detect UV radiation.
[0091] A determination is made at 910 as to whether to reposition the light source with respect to the projection screen. According to various embodiments, light sources may be moved through a number of predetermined positions in order to construct a sufficiently comprehensive three-dimensional model of intensity.
[0092] In particular embodiments, the determination made at 910 may be made at least in part based on the pattern or patterns captured at 908. For instance, a simple light source may tend to generate relatively simple patterns of projected light. In such a situation, a three-dimensional model may be extrapolated from a relatively small number of positions and patterns. However, a complex end effector may include, for instance, one or more individual light sources, surfaces, lenses, or other such components. Together, these components may result in an intensity field for the end effector that is complex and non-linear in three or more dimensions, since the intensity at any particular point in space may be determined by a superposition of the emissions of the end effector. Accordingly, as the complexity of the light source increases, so too may the number of positions and patterns needed to accurately determine a three-dimensional model of intensity. Such a situation may be detected based on, for instance, complexity and non-linearity in the detected patterns. For example, the intensity pattern illustrated in FIG. 8A may indicate a relatively simple light source, while the intensity pattern illustrated in FIG. 8C may indicate a more complex light source.
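One possible realization of this determination is to keep capturing patterns until a model fitted to the captures so far already predicts a fresh capture. This is only a sketch of such a stopping rule; the capture and fit_model callables stand in for the screen sensor and model-fitting steps and are hypothetical.

```python
def rms_error(predicted, observed):
    """Root-mean-square difference between two 2D intensity patterns."""
    n = sum(len(row) for row in observed)
    total = sum((p - o) ** 2
                for prow, orow in zip(predicted, observed)
                for p, o in zip(prow, orow))
    return (total / n) ** 0.5

def calibrate(poses, capture, fit_model, tolerance=0.05):
    """Capture patterns pose by pose, stopping early once a model fitted
    to the captures so far already predicts a fresh capture.

    capture(pose) returns a 2D pattern from the screen sensor, and
    fit_model(samples) returns an object with a predict(pose) method;
    both are supplied by the surrounding calibration system.
    """
    samples = []
    for pose in poses:
        pattern = capture(pose)
        if samples:
            model = fit_model(samples)
            if rms_error(model.predict(pose), pattern) < tolerance:
                samples.append((pose, pattern))
                break  # simple source: few positions suffice
        samples.append((pose, pattern))
    return fit_model(samples)
```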
[0093] In particular embodiments, positioning the light source may involve placing a UV end effector in a particular configuration. For instance, a UV end effector that includes a hinge, deformable surface, or deformable Fresnel lens may be rotated, adjusted, or deformed to various configurations. In this way, the intensity pattern of a complex light source may be empirically determined by observing the intensity of the light source in various positions, orientations, and configurations.
[0094] A three-dimensional texture of light intensity is determined for the light source at 912. In some implementations, the three-dimensional texture of light intensity may serve as a look-up table for the intensity of the source incident on a surface. The three-dimensional texture may be determined by combining information from the patterns captured at 908. For instance, each pattern may represent a cross-section of the light intensity of a plane at the screen's position through the emission of the light source.
[0095] According to various embodiments, the three-dimensional texture may be stored in any of a variety of formats. For instance, the three-dimensional texture may be stored as a matrix or array of values, as one or more numerical formulas, as one or more gradients, or as some combination thereof.
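As a sketch of the look-up-table usage, the captured patterns may be stacked into a regular three-dimensional grid, with intensity at an arbitrary point recovered by trilinear interpolation. The grid layout below is an assumption; bounds checking is omitted for brevity.

```python
def trilinear_lookup(texture, origin, spacing, point):
    """Intensity at a 3D point relative to the light source.

    texture: nested list texture[i][j][k] of sampled intensities, where
             each captured screen pattern contributes one k-slice
    origin:  (x0, y0, z0) position of texture[0][0][0], in meters
    spacing: regular grid spacing, in meters
    point:   (x, y, z) query position (assumed inside the grid)
    """
    fx, fy, fz = [(p - o) / spacing for p, o in zip(point, origin)]
    i, j, k = int(fx), int(fy), int(fz)
    tx, ty, tz = fx - i, fy - j, fz - k

    # Interpolate along x, then y, then z.
    c00 = texture[i][j][k] * (1 - tx) + texture[i + 1][j][k] * tx
    c10 = texture[i][j + 1][k] * (1 - tx) + texture[i + 1][j + 1][k] * tx
    c01 = texture[i][j][k + 1] * (1 - tx) + texture[i + 1][j][k + 1] * tx
    c11 = texture[i][j + 1][k + 1] * (1 - tx) + texture[i + 1][j + 1][k + 1] * tx
    c0 = c00 * (1 - ty) + c10 * ty
    c1 = c01 * (1 - ty) + c11 * ty
    return c0 * (1 - tz) + c1 * tz
```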
[0096] FIG. 10 illustrates an architecture diagram for a cleaning robot 1000, configured in accordance with one or more embodiments. According to various embodiments, the cleaning robot 1000 may be configured in a variety of form factors so long as it includes the ability to relocate and clean a surface. The cleaning robot 1000 includes a processor 1002, a memory module 1004, a communication interface 1006, a storage device 1008, a sensor module 1010, a UV end effector 1012, and a mobility apparatus 1014.
[0097] According to various embodiments, the cleaning robot 1000 may include one or more processors 1002 configured to perform operations described herein. The memory module 1004 may include one or more transitory memory elements, such as random access memory (RAM) modules. The storage device 1008 may be configured to store information such as computer programming language instructions and/or configuration data.
[0098] In some implementations, the cleaning robot 1000 may include one or more communication interfaces 1006 configured to perform wired and/or wireless communication. For example, the communication interface 1006 may include a WiFi communication module. As another example, the communication interface 1006 may include a wired port such as a universal serial bus (USB) port, which may be connected when the cleaning robot couples with a docking or charging port or device.
[0099] According to various embodiments, the sensor module 1010 may include one or more of various types of sensors. Such sensors may include, but are not limited to: visual light cameras, infrared cameras, microphones, Lidar devices, Radar devices, chemical detection devices, near field communication devices, and accelerometers.
[0100] In particular embodiments, the sensor module 1010 may communicate with one or more remote sensors. For example, an environment may be equipped with one or more of various types of sensors, data from which may be relayed to cleaning robots within the vicinity.
[0101] According to various embodiments, the UV end effector 1012 may include one or more of a variety of suitable cleaning devices. Various types of UV light sources may be used. For example, a "point" LED may emit UV light, potentially with an angular range of emission. As another example, a cylindrical UV light may emit UV radiation in a relatively constant pattern along the axis of the cylinder and may potentially cover an angular range. Such sources may be combined in a fixed or variable pattern, for instance to provide a more powerful light source and/or to shape the UV emission pattern.
[0102] In particular embodiments, shielding may help to stop UV light from emitting in certain directions. Alternatively, or additionally, one or more reflectors or lenses may be used to help guide or focus UV light toward a target.
[0103] According to various embodiments, a cleaning device may be attached to the cleaning robot 1000 in any of various ways. For example, the cleaning device may be attached in a fixed orientation relative to a robot drive mechanism. As another example, the cleaning device may be attached to the cleaning robot via a robotic arm having any of a variety of possible geometries.
[0104] In particular embodiments, a UV light may be fixed in a downward-pointing orientation on a robotic arm having three degrees of freedom of spatial mobility, allowing the arm to be positioned at different points in space along three axes. In such a configuration, the cleaning device could be positioned to clean a table top by positioning the UV light at a designated distance from the table top through a combination of movement of the robotic arm and the robot itself. The cleaning device could be positioned in a fixed location over the table top, or could clean the table top from different locations. For example, larger table tops may require cleaning from more locations.
[0105] In particular embodiments, a UV light may be configured with four, five, six, or more degrees of freedom. For example, a robotic arm may have three degrees of freedom. Then, a UV light positioned on the arm may itself be configured for movement in three dimensions. In this case, the movement of the robotic arm and the UV light may be combined to trace a potentially complex trajectory through space in order to irradiate a target from multiple directions. For instance, a cleaning device may be configured with 5 degrees of freedom in order to irradiate a spherical door handle from the left, from the right, from above, and from below without requiring the robot itself to move.
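A minimal sketch of such a multi-directional trajectory, assuming a spherical handle and a simple (x, y, z, yaw, pitch) pose convention, might generate waypoints on a ring around the handle plus poses above and below it:

```python
import math

def handle_poses(center, radius, n=8):
    """Poses that irradiate a spherical handle from all sides.

    Returns (x, y, z, yaw, pitch) tuples: n positions on a horizontal
    ring around the handle, plus one pose above and one below, each
    oriented back toward the handle center.
    """
    cx, cy, cz = center
    poses = []
    for i in range(n):  # left, right, front, back, ...
        a = 2 * math.pi * i / n
        x = cx + radius * math.cos(a)
        y = cy + radius * math.sin(a)
        yaw = math.atan2(cy - y, cx - x)  # face the center
        poses.append((x, y, cz, yaw, 0.0))
    poses.append((cx, cy, cz + radius, 0.0, -math.pi / 2))  # from above
    poses.append((cx, cy, cz - radius, 0.0, math.pi / 2))   # from below
    return poses

# Waypoints around a door knob at (0.4, 0.0, 1.0) m, 10 cm standoff.
waypoints = handle_poses((0.4, 0.0, 1.0), 0.10)
```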
[0106] According to various embodiments, the mobility apparatus may include one or more of any suitable mobility devices. Such devices may include, but are not limited to, one or more motorized wheels, balls, treads, or legs. In some configurations, the mobility apparatus may include one or more rotational and/or gyroscopic elements configured to aid in mobility and/or stability.
[0107] In particular embodiments, the cleaning robot 1000 may be configured to communicate directly or indirectly with other robots in order to accomplish its tasks. For example, robots may share information to build up an accurate model of an environment, identify the location and/or trajectory of humans, animals, or objects, and perform social accommodation. As another example, robots may coordinate to execute a cleaning plan. For instance, one cleaning robot may be interrupted in a task due to social accommodation. The cleaning robot may then move on to another cleaning task, while a different cleaning robot may later perform the interrupted cleaning task. As yet another example, robots may coordinate to perform a single task. For example, one or two robots may position themselves so as to provide social cues such as warning lights or sounds, while another robot engages in high-intensity UV cleaning in a designated area.
[0108] FIG. 11 illustrates a method 1100 for cleaning an area, performed in accordance with one or more embodiments. The method 1100 may be performed by a cleaning robot such as the robot 100 shown in FIG. 1.
[0109] A request to clean a designated area is received at 1102. According to various embodiments, instructions for where and when to clean can be received from any of multiple sources. For example, a robot may receive instructions from a remote location such as a command center or cleaning service executed on a computing device. As another example, people in an environment may interact with a robot and ask it to perform a cleaning task once or on a regular basis. Such interaction may take place via a voice control mechanism, a mechanical user input device, an instruction sent from a computing device such as a mobile phone running a control application, or any other suitable mechanism.
[0110] According to various embodiments, instructions for where and when to clean can be general or specific. For example, an instruction may specify a particular location and a time at which to clean it. As another example, an instruction may specify a timing constraint associated with a location, such as cleaning every office door handle in a hallway twice per day. In this case, the robot may develop its own plan and then perform its own execution monitoring.
[0111] In some implementations, instructions for where and when to clean can be determined automatically. For example, the cleaning robot may monitor humans as part of its cleaning activity. As part of this monitoring, the cleaning robot may annotate the areas, surfaces, and objects that humans touched, were near to, sneezed on, or otherwise interacted with. Those areas may then be prioritized for cleaning. Human activity may also be determined based on data received from external sensors. For example, a room's motion sensor may indicate that no one has been there, so it may not need to be re-cleaned. As another example, a door sensor may identify the number of people who have visited a room such as a restroom, which may be targeted for re-cleaning after a threshold number of people have visited the room.
[0112] A cleaning plan for the designated area is determined at 1104. According to various embodiments, a cleaning plan for a designated area may include, for example, a list of surfaces and/or regions within an area to clean, as well as a path for navigating to each surface and/or region. The cleaning plan may be determined based on any of various considerations, such as the current and predicted location of people within the area, the time required to conduct the cleaning operations, and the distance traveled along the path. For instance, the robot may attempt to first minimize disruption to human activity and then minimize the cleaning time and distance traveled.
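A greatly simplified planner consistent with these priorities might defer occupied surfaces and otherwise visit the nearest surface first; the two-phase greedy sketch below is illustrative only and ignores timing, maps, and obstacle costs.

```python
def plan_route(start, surfaces, occupied):
    """Order surfaces for cleaning: defer occupied ones, then nearest-first.

    surfaces: dict mapping surface name -> (x, y) location
    occupied: set of surface names currently in use by people
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    remaining = dict(surfaces)
    pos, route = start, []
    # First pass visits unoccupied surfaces; second pass returns to the rest.
    for phase in (set(remaining) - occupied, set(remaining) & occupied):
        while phase:
            nxt = min(phase, key=lambda s: dist(pos, remaining[s]))
            route.append(nxt)
            pos = remaining.pop(nxt)
            phase.remove(nxt)
    return route

route = plan_route(
    start=(0.0, 0.0),
    surfaces={"door_a": (2, 0), "table_b": (4, 1), "rail_c": (1, 3)},
    occupied={"table_b"},
)
# route == ["door_a", "rail_c", "table_b"]; the occupied table is deferred.
```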
[0113] A surface to clean is selected at 1106. According to various embodiments, the cleaning robot may select a surface to clean based on the cleaning plan determined at 1104. For example, the cleaning robot may attempt to clean each doorknob in a hall or each table in a cafeteria in succession. However, the cleaning robot may adapt its plan in real time to accommodate changes to the environment, such as the actions of people. For example, the cleaning robot may skip a door that is open or a table that is occupied and return to the door or table at a later point when it detects that the surface is not in use by humans.
[0114] A cleaning routine for the selected surface is identified at 1108. According to various embodiments, cleaning different types of objects and surfaces may involve different types of cleaning routines, which may change depending on the type of cleaning conducted. Accordingly, the specific pattern employed may depend on characteristics such as the strength of the UV light, characteristics of the environment, and the level of cleaning desired.
[0115] In some implementations, a small planar surface may be cleaned by holding a UV light fixture at a single point above it. A computation may be performed indicating the location of the point and how long the UV fixture needs to remain in place to meet the cleaning goal.
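For instance, the dwell time at such a hold point follows directly from the dose definition above (the numbers here are illustrative):

```python
# Illustrative dwell-time computation for a single hold point.
target_dose_j_per_m2 = 100.0   # assumed dose needed for the cleaning goal
intensity_w_per_m2 = 12.5      # intensity at the chosen standoff, e.g. read
                               # from the calibrated three-dimensional texture

dwell_s = target_dose_j_per_m2 / intensity_w_per_m2  # 8.0 seconds
```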
[0116] In some implementations, a large planar surface may be cleaned by moving a UV light fixture along a path in an X/Y plane parallel to the surface separated by a fixed distance Z. A computation may be performed to determine the distance Z, the path over the surface, and the speed at which the path is traversed to meet the cleaning goal.
[0117] In some implementations, a planar surface may be cleaned by moving a UV light fixture along a path in an X/Y plane parallel to the surface separated by a variable distance Z. For example, near the middle of the surface a higher intensity light may be applied at a larger distance Z, while near the edge of the surface a lower intensity light may be applied at a smaller distance Z to reduce spillover to areas behind the surface. A computation may be performed to determine the variable distance Z, the path over the surface, and the speed at which the path is traversed to meet the cleaning goal.
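Under a simple model in which each point receives light only while it lies within the beam footprint, the traversal speed needed to meet a dose goal can be estimated as follows (names and numbers are illustrative):

```python
def traverse_speed(target_dose, intensity, footprint_len):
    """Speed (m/s) at which each point under the beam reaches the target dose.

    A point stays inside the beam footprint for footprint_len / speed
    seconds, so dose = intensity * footprint_len / speed.
    """
    return intensity * footprint_len / target_dose

# A 20 W/m^2 beam with a 5 cm footprint and a 100 J/m^2 goal -> 0.01 m/s.
speed = traverse_speed(target_dose=100.0, intensity=20.0, footprint_len=0.05)
```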
[0118] In some implementations, a surface such as one or more elevator call buttons, one or more internal elevator buttons, and/or small areas around buttons may be cleaned by emitting UV light on a line perpendicular to the plane of the button surface.
[0119] In some implementations, non-planar or elongated shape such as a faucet or handle may be cleaned via a vertical scan with a horizontal UV light emission coupled with an arc around the axis of the handle. The trajectory may change depending on the shape of the handle. For example, the robot may have a handle cleaning routine that it can adapt to a particular handle via rotation of a cleaning arm. As another example, spherical knobs may be cleaned via a rotational path around the knob.
[0120] According to various embodiments, the cleaning routine for the selected surface may be identified via one or more of a variety of techniques. In some implementations, cleaning routines for a fixed environment may be preprogrammed. For example, a robot may be manually configured to clean different surfaces or objects in an area in particular ways. As another example, a robot may be pre-configured to clean a standardized area such as a chain restaurant building, prefabricated housing area, hotel room, office hallway, retail location, or other such place.
[0121] In some implementations, different categories of objects and surfaces may each be associated with a specific cleaning routine. One or more of the cleaning routines may be parameterized. For instance, a procedure for cleaning a planar surface may be parameterized based on the size of the surface. Each object or surface may be pre-categorized by a human in advance. Alternately, or additionally, a trained neural network may be applied to categorize objects based on sensor data.
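One way to organize category-specific, parameterized routines is a registry keyed by category, as in the following illustration. The registry, the decorator, and the example routines are hypothetical; in practice the category might come from manual pre-categorization or a trained classifier as described above.

    # A minimal sketch of per-category parameterized cleaning routines.
    ROUTINES = {}

    def routine(category):
        def register(fn):
            ROUTINES[category] = fn
            return fn
        return register

    @routine("planar")
    def clean_planar(width, height):
        return f"raster over a {width} x {height} m plane"

    @routine("knob")
    def clean_knob(radius):
        return f"rotational path at radius {radius} m"

    def clean(category, **params):
        return ROUTINES[category](**params)

    print(clean("planar", width=0.6, height=0.4))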
[0122] In some implementations, a cleaning robot may automatically determine a cleaning routine based on sensor data. For example, visual data and/or 3D scanning data may be used to estimate a three dimensional shape of the object or surface to be cleaned. A 3D planner may then be used to plan the trajectory and timing of the cleaning.
[0123] In some implementations, a cleaning robot may receive external information such as user input from a person, a two-dimensional or three-dimensional model or drawing of a region, surface, or object, a pre-trained neural network, or other such guidance.
[0124] According to various embodiments, a cleaning robot may use any technique in isolation to determine a cleaning plan. Alternately, or additionally, techniques may be used in combination. For example, a cleaning robot may be pre-configured to clean a variety of fixed environments. The cleaning robot may then be configured with specific cleaning routines for specific categories of objects. The cleaning robot may also be capable of automatically determining a cleaning routine based on sensor data, for instance when an object does not fall into an identified category. Finally, the cleaning robot may be configured to clean an object or surface based on user input, for instance when other approaches are insufficient for completing a cleaning task.
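The layered combination described above might be sketched as an ordered fallback chain, as in the following illustration. Each strategy function is a hypothetical placeholder that returns a routine or None.

    # A minimal sketch of trying techniques in order and falling back.
    def select_routine(surface, strategies):
        for strategy in strategies:
            routine = strategy(surface)
            if routine is not None:
                return routine
        raise RuntimeError("no strategy produced a routine; request user input")

    # e.g., strategies = [preprogrammed_lookup, category_routine,
    #                     plan_from_sensors, ask_operator]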
[0125] The cleaning robot is moved to the selected surface at 1110. In some implementations, moving the cleaning robot to the selected surface may involve engaging a mobility mechanism such as one or more wheels or treads. Additionally, the robot may need to navigate around obstacles such as people, animals, objects, or other robots. The robot may conduct that navigation in a socially accommodating manner. For example, the robot may move out of the way of humans, animals, or other robots, even when such accommodation requires moving along a longer path or waiting until a path is clear. As another example, the robot may predict the movement of humans, animals, or other robots in order to plan a path that avoids collisions.
[0126] A social cue for initiating cleaning is provided at 1112. According to various embodiments, any of a variety of social cues may be employed. Examples of such cues may include, but are not limited to: lights, sounds, vibration, and movement. For example, a robot may activate one or more lights and/or emit one or more sounds when cleaning is initiated. As another example, the robot may activate a spinning mechanical component to provide a visual indicator associated with cleaning.
[0127] In particular embodiments, a robot may emit a visual social cue indicating how long a task will take. For example, a robot may be equipped with a visible screen that is configured to display one or more countdown clocks. A countdown clock may indicate a time remaining for cleaning a specific surface or object. Alternately, or additionally, a countdown clock may indicate a time remaining for cleaning an entire area. As another example, a cleaning robot may be equipped with one or more colored lights to indicate the degree of completion of a task. For instance, presenting a visual cue may involve changing the color of an LED strip. The visual social cue may be perceivable from a distance so that a human can decide whether to interrupt the robot.
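A progress cue of this kind might be computed as a simple color interpolation, as sketched below. The set_strip_color call mentioned in the comment is a hypothetical hardware interface, not a real driver API.

    # A minimal sketch of a task-progress color: red at the start of a
    # task, green at completion.
    def progress_color(fraction_done):
        fraction_done = max(0.0, min(1.0, fraction_done))
        red = int(255 * (1.0 - fraction_done))
        green = int(255 * fraction_done)
        return (red, green, 0)

    # set_strip_color(progress_color(elapsed / total))  # hypothetical call
    print(progress_color(0.25))  # mostly red early in the task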
[0128] In some embodiments, presenting a social cue may involve emitting audio. For example, one or more sound effects may be emitted when people transition across virtual boundaries. As another example, audio communication may be emitted in the form of music. As yet another example, audio communication may be emitted in the form of spoken natural language, for instance via text-to-speech or voice recording. Natural language communication may be presented on a display screen, through speech, or via a combination thereof. As still another example, the cleaning robot may emit a tune or whistle to indicate its progression in a cleaning cycle. As still another example, the cleaning robot may be configured to emit a verbal countdown or other natural language descriptions of progress along a task. For instance, the cleaning robot may state a task and verbally identify the initiation and/or completion of that task.
[0129] In some embodiments, presenting a visual cue may involve an information screen configured to display information such as text or icons. For instance, a caution icon may be displayed.
[0130] In some embodiments, presenting a visual cue may involve a projector to display information similarly to screen displays. Alternatively, or additionally, a projector may present a visual cue through illumination based on color and/or brightness similarly to LED strips. A projector may be used to show a graphic and/or text on the ground, for instance to indicate a safe boundary for humans to stay away, or onto a surface being disinfected, for instance to display AR information.
[0131] In some embodiments, a display screen on the robot may display an emotionally expressive face that is used for indicating system states. For example, when people are detected, the robot may present a happy face. As another example, when people are engaged in interaction for communicating with the robot, the robot may present a face that reflects the situation or statement (e.g., happy, apologetic, or thankful). As yet another example, when the robot predicts that people may soon be in an unsafe location, the robot may display a face indicating shock or panic.
[0132] In some embodiments, presenting a visual cue may involve motion. For example, the robot may use its arm for communicative gestures such as pointing to objects or surfaces for confirmation or socially communicating with people, for instance by waving. As another example, the robot may have the ability to move a "head" area (e.g., with 1-3 degrees of freedom) on which a display screen is mounted to control head gaze for communicating with people and directing sensors. Head gaze direction may be used to communicate task state (e.g., navigational goals, object/surface targets for disinfection) or interaction state (e.g., people being interacted with). Neck motions may also be used as communicative gestures, such as shaking the head no. As yet another example, the robot may use a mobile base trajectory for communication, for instance by driving to encircle a region to refer to it for task confirmation. As still another example, any of the robot's movable components may be used for emphasis within a communicative message, for instance for beat gestures.
[0133] The selected surface is cleaned using the identified cleaning routine at 1114. In some implementations, cleaning the selected surface may involve operations such as adjusting the position of the cleaning robot, adjusting the position of one or more cleaning devices, and/or adjusting one or more settings associated with a cleaning device, such as the intensity of UV light emitted. For instance, a UV light may be activated and strengthened or weakened in intensity as it is moved through space in an arc around a circular object or in a plane over a flat object.
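Intensity modulation along such a trajectory might be computed by solving the dose relation for source power at each standoff distance, as in the sketch below; as with the earlier dwell-time illustration, inverse-square falloff from an idealized point source is assumed.

    # A minimal sketch: choose source power so that each waypoint receives
    # the same target irradiance despite a varying standoff distance.
    import math

    def power_for(target_irradiance_mw_cm2, distance_cm):
        # Solve target = P / (4 * pi * d^2) for the source power P (mW).
        return target_irradiance_mw_cm2 * 4 * math.pi * distance_cm ** 2

    for d in (5.0, 10.0, 15.0):
        print(d, round(power_for(1.0, d)))  # required power rises with distance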
[0134] A social cue for completing cleaning is provided at 1116. In some implementations, the social cue for completing cleaning may simply involve no longer emitting the social cue for initiating cleaning provided at 1112. Alternately, or additionally, a different social cue may be emitted to demonstrate that cleaning has stopped. For instance, a robot may emit a green light when not cleaning and a red light while cleaning.
[0135] In particular embodiments, a cleaning robot may strategically determine a social cue based on factors such as environmental characteristics. For example, the cleaning robot may emit a brighter light or louder sound in a brighter or louder environment with more people, while emitting a dimmer light or quieter sound in a darker or quieter environment.
[0136] Activity is reported at 1118. According to various embodiments, the activity may be reported via the communication interface 1006 shown in FIG. 10. The cleaning robot may report on any or all activity. For example, the cleaning robot may report on cleaning activity such as where it cleaned, when it cleaned, and/or one or more parameters of the cleaning process. As another example, the cleaning robot may report on social accommodation actions such as when, where, why, and how it took action to accommodate activity by humans, animals, or other robots. As yet another example, the cleaning robot may report on characteristics of the environment observed via its sensors, such as the presence or absence of trash, graffiti, or damage. As still another example, the cleaning robot may report on successes, failures, and/or reasons for failure while executing its cleaning plan.
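Reported activity might take the form of structured records such as the following illustration; the field names are assumptions, not a defined reporting schema.

    # A minimal sketch of an activity record as it might be reported over
    # the communication interface.
    import json
    import time

    def report(event, **fields):
        record = {"ts": time.time(), "event": event, **fields}
        return json.dumps(record)

    print(report("surface_cleaned", surface="doorknob A",
                 dose_mj_cm2=10, duration_s=12.6))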
[0137] In some implementations, activity may be reported in real time. For example, a robot may communicate with a remote database via WiFi. Alternatively, or additionally, activity may be reported via bulk communication. For example, a robot may transmit a comprehensive log of activity, for instance via a wired connection, when the robot returns to a charging or docking station.
[0138] A determination is made at 1120 as to whether to select an additional surface to clean. In some implementations, the determination may be made based at least in part on whether the cleaning robot has cleaned each surface according to the cleaning plan determined at 1104. For instance, the cleaning robot may revisit a surface that it was initially unable to clean due to accommodating people within the environment.
[0139] In particular embodiments, one or more of the operations shown in FIG. 11 may be omitted. For example, the cleaning robot may not provide a social cue in some configurations, such as in settings where limited light and sound disruption is desired.
[0140] FIG. 12 illustrates one example of a computing device. According to various embodiments, a system 1200 suitable for implementing embodiments described herein includes a processor 1201, a memory module 1203, a storage device 1205, an interface 1211, and a bus 1215 (e.g., a PCI bus or other interconnection fabric). System 1200 may operate as a variety of devices, such as a cleaning robot, a remote server, or any other device or service described herein. Although a particular configuration is described, a variety of alternative configurations are possible. The processor 1201 may perform operations such as those described herein. Instructions for performing such operations may be embodied in the memory 1203, on one or more non-transitory computer readable media, or on some other storage device. Various specially configured devices can also be used in place of or in addition to the processor 1201. The interface 1211 may be configured to send and receive data packets over a network. Examples of supported interfaces include, but are not limited to: Ethernet, fast Ethernet, Gigabit Ethernet, frame relay, cable, digital subscriber line (DSL), token ring, Asynchronous Transfer Mode (ATM), High-Speed Serial Interface (HSSI), and Fiber Distributed Data Interface (FDDI). These interfaces may include ports appropriate for communication with the appropriate media. They may also include an independent processor and/or volatile RAM. A computer system or computing device may include or communicate with a monitor, printer, or other suitable display for providing any of the results mentioned herein to a user.
[0141] In the foregoing specification, various techniques and mechanisms may have been described in singular form for clarity. However, it should be noted that some embodiments include multiple iterations of a technique or multiple instantiations of a mechanism unless otherwise noted. For example, a system uses a processor in a variety of contexts but can use multiple processors while remaining within the scope of the present disclosure unless otherwise noted. Similarly, various techniques and mechanisms may have been described as including a connection between two entities. However, a connection does not necessarily mean a direct, unimpeded connection, as a variety of other entities (e.g., bridges, controllers, gateways, etc.) may reside between the two entities.
[0142] In the foregoing specification, reference was made in detail to specific embodiments including one or more of the best modes contemplated by the inventors. While various implementations have been described herein, it should be understood that they have been presented by way of example only, and not limitation. For example, some techniques and mechanisms are described herein in the context of cleaning via UV light. However, the techniques of the present invention apply to a wide variety of cleaning techniques. Particular embodiments may be implemented without some or all of the specific details described herein. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention. Accordingly, the breadth and scope of the present application should not be limited by any of the implementations described herein, but should be defined only in accordance with the claims and their equivalents.