Patent application number | Description | Published |
20080297487 | DISPLAY INTEGRATED PHOTODIODE MATRIX - The use of one or more proximity sensors alone or in combination with one or more touch sensors in a multi-touch panel to detect the presence of a finger, body part or other object and control or trigger one or more functions in accordance with an “image” of touch provided by the sensor outputs is disclosed. In some embodiments, one or more infrared (IR) proximity sensors can be driven with a specific stimulation frequency and emit IR light from one or more areas, which can in some embodiments correspond to “pixel” locations. The reflected IR signal, if any, can be demodulated using synchronous demodulation. | 12-04-2008 |
20090309851 | Capacitive Sensor Panel Having Dynamically Reconfigurable Sensor Size and Shape - This relates to a capacitive sensor panel that is able to dynamically reconfigure its sensor size and shape for proximity and/or distance to enable hover and gesture detection. Thus, the size and/or shape of the sensors in the panel can differ according to present needs. The sensor panel may dynamically reconfigure its sensor size and shape based on an object's proximity to the panel. The sensor panel may dynamically reconfigure its sensor size and shape based on a gesture detected by the panel. The sensor panel may dynamically reconfigure its sensor size and shape based on an application executing on a device in communication with the panel. | 12-17-2009 |
20100026656 | CAPACITIVE SENSOR BEHIND BLACK MASK - Devices having one or more sensors located outside a viewing area of a touch screen display are disclosed. The one or more sensors can be located behind an opaque mask area of the device, the opaque mask area extending between the sides of a housing of the device and the viewing area of the touch screen display. In addition, the sensors located behind the mask can be separate from a touch sensor panel used to detect objects on or near the touch screen display, and can be used to enhance or provide additional functionality to the device. For example, a device having a sensor located outside the viewing area can be used to detect objects in proximity to a functional component incorporated in the device, such as an earpiece (i.e., a speaker for outputting sound). The sensor can also output a signal indicating a level of detection, which may be interpreted by a controller of the device as a level of proximity of an object to the functional component. In addition, the controller can initiate a variety of actions related to the functional component based on the output signal, such as adjusting the volume of the earpiece. | 02-04-2010 |
20100026667 | ACOUSTIC MULTI-TOUCH SENSOR PANEL - Sensing of multiple touches on a surface of a material is provided. A beamed acoustic wave traveling in a substantially linear path along the surface of the material is formed by a plurality of transducers, e.g., a phased array, coupled to the surface. One or more echoes of the acoustic wave caused by a corresponding one or more touches on the path are detected with a detector. The detector may include, for example, one or more of the transducers in the plurality of transducers. The surface can be scanned with a plurality of beamed acoustic waves using a variety of configurations, such as parallel beams, radially emanating beams, etc. | 02-04-2010 |
20100060592 | Data Transmission and Reception Using Optical In-LCD Sensing - Data transmission and reception through optical in-LCD sensing panels is provided. The transmission/reception can be a communication in which data is transmitted by displaying, on display pixels of the panel, a communication image encoding the data, and data is received by capturing, with EM sensors embedded in the panel, a communication image encoding the data that is displayed in proximity to the panel. The transmission/reception can be a scan in which a motion of a handheld device is determined by scanning a surface with the EM sensors at different times and comparing the corresponding scan images to obtain the motion of the device. A control signal based on the motion of the handheld device can be transmitted to an external device, for example, to control a mouse cursor. In another example, scan images can be combined based on the motion to generate a combined scan image of a surface. | 03-11-2010 |
20100073328 | ANISOTROPIC OPTICAL COVER FOR TOUCH PANEL DISPLAY - A touch panel display configured to improve touch panel detection for sensor panels embedded in display modules. Touch panel detection can be improved by arranging an optically anisotropic cover over a display module within which an optical sensor panel is embedded. Since the optically anisotropic cover comprises light-guiding channels through which light is guided by total internal reflection, the cover can effectively shift the sensor plane from an outer surface of the cover, near the location of an object to be detected, to an inner surface of the cover, near the location of the sensor panel. In addition, the optically anisotropic cover can effectively shift the image plane from the inner surface of the cover, near the display module, to the outer surface of the cover, near the viewing surface. | 03-25-2010 |
20100079405 | Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor - In some embodiments, an electronic device with a touch screen display: detects a single finger contact on the touch screen display; creates a touch area that corresponds to the single finger contact; determines a representative point within the touch area; determines if the touch area overlaps an object displayed on the touch screen display, which includes determining if one or more portions of the touch area other than the representative point overlap the object; connects the object with the touch area if the touch area overlaps the object, where connecting maintains the overlap of the object and the touch area; after connecting the object with the touch area, detects movement of the single finger contact; determines movement of the touch area that corresponds to movement of the single finger contact; and moves the object connected with the touch area in accordance with the determined movement of the touch area. | 04-01-2010 |
20100171712 | Device, Method, and Graphical User Interface for Manipulating a User Interface Object - In some embodiments, an electronic device with a display and a touch-sensitive surface displays a user interface object. The device detects a first contact and a second contact concurrently on the touch-sensitive surface. The device determines which contact of the first contact and the second contact is a topmost contact, a bottommost contact, a leftmost contact, and a rightmost contact on the touch-sensitive surface. While continuing to detect the first contact and the second contact, the device detects movement of the first contact across the touch-sensitive surface, and concurrently moves two edges of the user interface object that correspond to the first contact in accordance with the detected movement of the first contact, including horizontally moving one of the two edges and vertically moving the other of the two edges. | 07-08-2010 |
20110007021 | TOUCH AND HOVER SENSING - Improved capacitive touch and hover sensing with a sensor array is provided. An AC ground shield positioned behind the sensor array and stimulated with signals of the same waveform as the signals driving the sensor array may concentrate the electric field extending from the sensor array and enhance hover sensing capability. The hover position and/or height of an object that is nearby, but not directly above, a touch surface of the sensor array, e.g., in the border area at the end of a touch screen, may be determined using capacitive measurements of sensors near the end of the sensor array by fitting the measurements to a model. Other improvements relate to the joint operation of touch and hover sensing, such as determining when and how to perform touch sensing, hover sensing, both touch and hover sensing, or neither. | 01-13-2011 |
20110078622 | Device, Method, and Graphical User Interface for Moving a Calendar Entry in a Calendar Application - In some embodiments, a multifunction device with a display and a touch-sensitive surface displays a multi-week view in a calendar application on the display and detects a first input by a user. In response to detecting the first input by the user, the device selects a first calendar entry in the multi-week view in the calendar application. While continuing to detect selection of the first calendar entry by the user, the device detects a first multifinger gesture on the touch-sensitive surface, and in response to detecting the first multifinger gesture on the touch-sensitive surface, the device expands display of a single week in the multi-week view; and maintains display of the first calendar entry on the display. In some embodiments, the device moves the first calendar entry to a date and time in the calendar application in accordance with a second input by the user. | 03-31-2011 |
20110078624 | Device, Method, and Graphical User Interface for Manipulating Workspace Views - In some embodiments, a multifunction device with a display and a touch-sensitive surface creates a plurality of workspace views. A respective workspace view is configured to contain content assigned by a user to the respective workspace view. The content includes application windows. The device displays a first workspace view in the plurality of workspace views on the display without displaying other workspace views in the plurality of workspace views and detects a first multifinger gesture on the touch-sensitive surface. In response to detecting the first multifinger gesture on the touch-sensitive surface, the device replaces display of the first workspace view with concurrent display of the plurality of workspace views. | 03-31-2011 |
20110141052 | TOUCH PAD WITH FORCE SENSORS AND ACTUATOR FEEDBACK - Electronic devices may use touch pads that have touch sensor arrays, force sensors, and actuators for providing tactile feedback. A touch pad may be mounted in a computer housing. The touch pad may have a rectangular planar touch pad member that has a glass layer covered with ink and contains a capacitive touch sensor array. Force sensors may be mounted under each of the four corners of the rectangular planar touch pad member. The force sensors may be used to measure how much force is applied to the surface of the planar touch pad member by a user. Processed force sensor signals may indicate the presence of button activity such as press and release events. In response to detected button activity or other activity in the device, actuator drive signals may be generated for controlling the actuator. The user may supply settings to adjust signal processing and tactile feedback parameters. | 06-16-2011 |
20120084689 | Managing Items in a User Interface - User interface changes related to moving items in a user interface are disclosed. An operation (e.g., a drag operation) can be initiated on selected items by moving a cursor or pointing device in the user interface, and an animation can be presented illustrating representations of the selected items moving from their respective original locations toward a current location of the cursor or pointing device and forming a cluster in proximity to the current location of the cursor or pointing device. As the cluster of items is moved over a container object in the user interface, the representations of the items can adopt the appearance style defined by that container object. The representations of the items can also be shown to depart from the cluster and move toward anticipated locations of the items in the container object as a preview of a drop operation into the container object. | 04-05-2012 |
20120307096 | Metadata-Assisted Image Filters - This disclosure pertains to devices, methods, systems, and computer readable media for generating and/or interpreting image metadata to determine input parameters for various image processing routines, e.g., filters that distort or enhance an image, in a way that provides an intuitive experience for both the user and the software developer. Such techniques may attach the metadata to image frames and then send the image frames down an image processing pipeline to one or more image processing routines. Image metadata may include face location information, and the image processing routine may include an image filter that processes the image metadata in order to keep the central focus (or foci) of the image filter substantially coincident with one or more of the faces represented in the face location information. The generated and/or interpreted metadata may also be saved to a metadata track for later application to unfiltered image data. | 12-06-2012 |
20130027303 | Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor - An electronic device with a touch screen display: detects a single finger contact on the touch screen display; creates a touch area that corresponds to the single finger contact; determines a representative point within the touch area; determines if the touch area overlaps an object displayed on the touch screen display, which includes determining if one or more portions of the touch area other than the representative point overlap the object; connects the object with the touch area if the touch area overlaps the object, where connecting maintains the overlap of the object and the touch area; after connecting the object with the touch area, detects movement of the single finger contact; determines movement of the touch area that corresponds to movement of the single finger contact; and moves the object connected with the touch area in accordance with the determined movement of the touch area. | 01-31-2013 |
20130050263 | Device, Method, and Graphical User Interface for Managing and Interacting with Concurrently Open Software Applications - While in a first mode, a first electronic device displays on a touch-sensitive display a first application view that corresponds to a first application. In response to detecting a first input, the electronic device enters a second mode, and concurrently displays in a first predefined area an initial group of application icons with at least a portion of the first application view adjacent to the first predefined area. While in the second mode, in response to detecting a first touch gesture on an application icon that corresponds to a second application, the electronic device displays a popup view corresponding to a full-screen-width view of the second application on a second electronic device. In response to detecting one or more second touch gestures within the popup view, the electronic device performs an action in the second application that updates a state of the second application. | 02-28-2013 |
20130205194 | NAVIGATING AMONG CONTENT ITEMS IN A BROWSER USING AN ARRAY MODE - In any context where a user can view multiple different content items, switching among content items is provided using an array mode. In a full-frame mode, one content item is visible and active, but other content items may also be open. In response to user input the display can be switched to an array mode, in which all of the content items are visible in a scrollable array. Selecting a content item in array mode can result in the display returning to the full-frame mode, with the selected content item becoming visible and active. Smoothly animated transitions between the full-frame and array modes and a gesture-based interface for controlling the transitions can also be provided. | 08-08-2013 |
20130205244 | GESTURE-BASED NAVIGATION AMONG CONTENT ITEMS - In any context where a user can view multiple different content items, switching among content items is provided using an array mode. In a full-frame mode, one content item is visible and active, but other content items may also be open. In response to user input the display can be switched to an array mode, in which all of the content items are visible in a scrollable array. Selecting a content item in array mode can result in the display returning to the full-frame mode, with the selected content item becoming visible and active. Smoothly animated transitions between the full-frame and array modes and a gesture-based interface for controlling the transitions can also be provided. | 08-08-2013 |
20130328917 | SMART COVER PEEK - A tablet device includes a display configured to present visual content, a sensor array configured to detect a status of a foldable flap in relation to the display, and a processor configured to operate the tablet device in accordance with the determined status of the foldable flap in relation to the display. In one embodiment, the processor receives a setting value and uses the setting value to execute an application in accordance with the determined relationship of the flap and the display. | 12-12-2013 |
20140009441 | TOUCH PAD WITH FORCE SENSORS AND ACTUATOR FEEDBACK - Electronic devices may use touch pads that have touch sensor arrays, force sensors, and actuators for providing tactile feedback. A touch pad may be mounted in a computer housing. The touch pad may have a rectangular planar touch pad member that has a glass layer covered with ink and contains a capacitive touch sensor array. Force sensors may be mounted under each of the four corners of the rectangular planar touch pad member. The force sensors may be used to measure how much force is applied to the surface of the planar touch pad member by a user. Processed force sensor signals may indicate the presence of button activity such as press and release events. In response to detected button activity or other activity in the device, actuator drive signals may be generated for controlling the actuator. The user may supply settings to adjust signal processing and tactile feedback parameters. | 01-09-2014 |
20140035824 | Device, Method, and Graphical User Interface for Entering Characters - A device with a display and a touch-sensitive keyboard with one or more character keys: displays a text entry area; detects a first input on the touch-sensitive keyboard; in accordance with a determination that the first input corresponds to activation of a character key, enters a first character corresponding to the character key into the text entry area; in accordance with a determination that the first input corresponds to a character drawn on the touch-sensitive keyboard: determines one or more candidate characters for the drawn character, and displays a candidate character selection interface that includes at least one of the candidate characters; while displaying the candidate character selection interface, detects a second input that selects a respective candidate character within the candidate character selection interface; and in response to detecting the second input, enters the selected respective candidate character into the text entry area. | 02-06-2014 |
20140092064 | Touch Pad with Force Sensors and Actuator Feedback - Electronic devices may use touch pads that have touch sensor arrays, force sensors, and actuators for providing tactile feedback. A touch pad may be mounted in a computer housing. The touch pad may have a rectangular planar touch pad member that has a glass layer covered with ink and contains a capacitive touch sensor array. Force sensors may be mounted under each of the four corners of the rectangular planar touch pad member. The force sensors may be used to measure how much force is applied to the surface of the planar touch pad member by a user. Processed force sensor signals may indicate the presence of button activity such as press and release events. In response to detected button activity or other activity in the device, actuator drive signals may be generated for controlling the actuator. The user may supply settings to adjust signal processing and tactile feedback parameters. | 04-03-2014 |
20140218372 | INTELLIGENT DIGITAL ASSISTANT IN A DESKTOP ENVIRONMENT - Methods and systems related to interfaces for interacting with a digital assistant in a desktop environment are disclosed. In some embodiments, a digital assistant is invoked on a user device by a gesture following a predetermined motion pattern on a touch-sensitive surface of the user device. In some embodiments, a user device selectively invokes a dictation mode or a command mode to process a speech input depending on whether an input focus of the user device is within a text input area displayed on the user device. In some embodiments, a digital assistant performs various operations in response to one or more objects being dragged and dropped onto an iconic representation of the digital assistant displayed on a graphical user interface. In some embodiments, a digital assistant is invoked to cooperate with the user to complete a task that the user has already started on a user device. | 08-07-2014 |
20140351707 | DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR MANIPULATING WORKSPACE VIEWS - In some embodiments, a multifunction device with a display and a touch-sensitive surface creates a plurality of workspace views. A respective workspace view is configured to contain content assigned by a user to the respective workspace view. The content includes application windows. The device displays a first workspace view in the plurality of workspace views on the display without displaying other workspace views in the plurality of workspace views and detects a first multifinger gesture on the touch-sensitive surface. In response to detecting the first multifinger gesture on the touch-sensitive surface, the device replaces display of the first workspace view with concurrent display of the plurality of workspace views. | 11-27-2014 |
20150033170 | TOUCH SCREEN DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR MOVING ON-SCREEN OBJECTS WITHOUT USING A CURSOR - In some embodiments, an electronic device with a touch screen display: detects a single finger contact on the touch screen display; creates a touch area that corresponds to the single finger contact; determines a representative point within the touch area; determines if the touch area overlaps an object displayed on the touch screen display, which includes determining if one or more portions of the touch area other than the representative point overlap the object; connects the object with the touch area if the touch area overlaps the object, where connecting maintains the overlap of the object and the touch area; after connecting the object with the touch area, detects movement of the single finger contact; determines movement of the touch area that corresponds to movement of the single finger contact; and moves the object connected with the touch area in accordance with the determined movement of the touch area. | 01-29-2015 |
20150062052 | Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture - An electronic device displays a user interface in a first display state. The device detects a first portion of a gesture on a touch-sensitive surface, including detecting intensity of a respective contact of the gesture. In response to detecting the first portion of the gesture, the device displays an intermediate display state between the first display state and a second display state. In response to detecting the end of the gesture: if intensity of the respective contact had reached a predefined intensity threshold prior to the end of the gesture, the device displays the second display state; otherwise, the device redisplays the first display state. After displaying an animated transition between the first display state and the second display state, the device, optionally, detects an increase of the contact intensity. In response, the device displays a continuation of the animation in accordance with the increasing intensity of the respective contact. | 03-05-2015 |
20150067495 | Device, Method, and Graphical User Interface for Providing Feedback for Changing Activation States of a User Interface Object - An electronic device with a touch-sensitive surface, a display, and one or more sensors to detect intensity of contacts with the touch-sensitive surface displays a user interface object having a plurality of activation states; detects a contact on the touch-sensitive surface; and detects an increase of intensity of the contact from a first intensity to a second intensity. In response to detecting the increase in intensity, the device: changes activation states M times, and generates a tactile output on the touch-sensitive surface corresponding to each change in activation state. The device detects a decrease of intensity of the contact from the second intensity to the first intensity; and in response to detecting the decrease in intensity, the device: changes activation states N times, and generates a tactile output on the touch-sensitive surface corresponding to each change in activation state, where N is different from M. | 03-05-2015 |
20150067496 | Device, Method, and Graphical User Interface for Providing Tactile Feedback for Operations Performed in a User Interface - An electronic device with a touch-sensitive surface and a display displays a user interface object on the display, detects a contact on the touch-sensitive surface, and detects a first movement of the contact across the touch-sensitive surface, the first movement corresponding to performing an operation on the user interface object, and, in response to detecting the first movement, the device performs the operation and generates a first tactile output on the touch-sensitive surface. The device also detects a second movement of the contact across the touch-sensitive surface, the second movement corresponding to reversing the operation on the user interface object, and in response to detecting the second movement, the device reverses the operation and generates a second tactile output on the touch-sensitive surface, where the second tactile output is different from the first tactile output. | 03-05-2015 |
20150067513 | Device, Method, and Graphical User Interface for Facilitating User Interaction with Controls in a User Interface - An electronic device, with a touch-sensitive surface and a display, includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays, on the display, a first control for controlling a first operation. The device detects, on the touch-sensitive surface, a first input that corresponds to the first control; and in response to detecting the first input: in accordance with a determination that the first input meets first control-activation criteria but does not include a contact with a maximum intensity above a respective intensity threshold, the device performs the first operation; and in accordance with a determination that the first input includes a contact with an intensity above the respective intensity threshold, the device displays a second control for performing a second operation associated with the first operation. | 03-05-2015 |
20150067560 | Device, Method, and Graphical User Interface for Manipulating Framed Graphical Objects - An electronic device with a touch-sensitive surface, a display, and one or more sensors to detect intensity of contacts with the touch-sensitive surface displays a graphical object inside of a frame on the display, and detects a gesture. Detecting the gesture includes: detecting a contact on the touch-sensitive surface while a focus selector is over the graphical object, and detecting movement of the contact across the touch-sensitive surface. In response to detecting the gesture: in accordance with a determination that the contact meets predefined intensity criteria, the device removes the graphical object from the frame; and in accordance with a determination that the contact does not meet the predefined intensity criteria, the device adjusts an appearance of the graphical object inside of the frame. | 03-05-2015 |
20150067563 | Device, Method, and Graphical User Interface for Moving and Dropping a User Interface Object - An electronic device detects a contact associated with a focus selector that controls movement of a respective user interface object; and while continuously detecting the contact, the device detects first movement of the contact. In response to detecting the first movement of the contact, the device moves the focus selector and the respective user interface object, and determines an intensity of the contact. The device detects second movement of the contact and in response to detecting the second movement of the contact: when the contact meets respective intensity criteria, the device moves the focus selector and the user interface object; and when the contact does not meet the respective intensity criteria, the device moves the focus selector without moving the user interface object. | 03-05-2015 |
20150067596 | Device, Method, and Graphical User Interface for Displaying Additional Information in Response to a User Contact - An electronic device, with a touch-sensitive surface and a display, includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device detects a contact on the touch-sensitive surface while a focus selector corresponding to the contact is at a respective location on the display associated with additional information not initially displayed on the display. While the focus selector is at the respective location, upon determining that the contact has an intensity above a respective intensity threshold before a predefined delay time has elapsed with the focus selector at the respective location, the device displays the additional information associated with the respective location without waiting until the predefined delay time has elapsed; and upon determining that the contact has an intensity below the respective intensity threshold, the device waits until the predefined delay time has elapsed to display the additional information associated with the respective location. | 03-05-2015 |
20150067601 | Device, Method, and Graphical User Interface for Displaying Content Associated with a Corresponding Affordance - An electronic device with a display, a touch-sensitive surface, and sensors to detect intensity of contacts with the touch-sensitive surface displays, on the display, an affordance corresponding to respective content at a respective size and detects a gesture that includes an increase in intensity of a contact followed by a subsequent decrease in intensity of the contact. In response to the increase in intensity, the device decreases a size of the affordance below the respective size. In response to the subsequent decrease in intensity: when a maximum intensity of the contact is above a content-display intensity threshold, the device ceases to display the affordance and displays at least a portion of the respective content; and when a maximum intensity of the contact is below the content-display intensity threshold, the device increases the size of the affordance to the respective size and forgoes displaying the respective content. | 03-05-2015 |
20150067602 | Device, Method, and Graphical User Interface for Selecting User Interface Objects - An electronic device with a display, touch-sensitive surface and one or more sensors to detect intensity of contacts with the touch-sensitive surface displays a first user interface object and detects first movement of the contact that corresponds to movement of a focus selector toward the first user interface object. In response to detecting the first movement, the device moves the focus selector to the first user interface object; and determines an intensity of the contact. After detecting the first movement, the device detects second movement of the contact. In response to detecting the second movement of the contact, when the contact meets selection criteria based on an intensity of the contact, the device moves the focus selector and the first user interface object; and when the contact does not meet the selection criteria, the device moves the focus selector without moving the first user interface object. | 03-05-2015 |
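Several of the listed applications rely on the same signal-processing building block: application 20080297487, for example, drives IR proximity sensors at a specific stimulation frequency and recovers the reflected signal using synchronous demodulation. The sketch below illustrates that technique in its generic I/Q (lock-in) form; the sample rate, stimulation frequency, signal amplitudes, and the `synchronous_demodulate` helper are all illustrative assumptions, not details taken from the patent.

```python
import math

def synchronous_demodulate(samples, freq_hz, sample_rate_hz):
    """Recover the amplitude of the component of `samples` at `freq_hz`
    by mixing with in-phase and quadrature references and averaging.
    Illustrative helper; not from the patent text."""
    n = len(samples)
    i_acc = q_acc = 0.0
    for k, x in enumerate(samples):
        phase = 2 * math.pi * freq_hz * k / sample_rate_hz
        i_acc += x * math.cos(phase)  # in-phase mix
        q_acc += x * math.sin(phase)  # quadrature mix
    # Averaging x*cos over a whole number of cycles yields A/2 for A*cos,
    # so scale by 2; combining I and Q makes the result phase-insensitive.
    i_avg = 2 * i_acc / n
    q_avg = 2 * q_acc / n
    return math.hypot(i_avg, q_avg)

# Simulated reflected signal: a 0.5-amplitude tone at the assumed
# 10 kHz stimulation frequency plus a 3 kHz out-of-band interferer.
fs = 100_000  # samples per second (assumed)
sig = [0.5 * math.cos(2 * math.pi * 10_000 * k / fs)
       + 0.2 * math.cos(2 * math.pi * 3_000 * k / fs)
       for k in range(1000)]
amp = synchronous_demodulate(sig, 10_000, fs)  # ~0.5: interferer rejected
```

Because the 3 kHz interferer is orthogonal to the 10 kHz reference over a whole number of cycles, it averages to zero and only the component at the stimulation frequency survives, which is why driving each sensor at a known frequency makes the reflected-signal measurement robust to ambient light and other out-of-band noise.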