IMERJ LLC Patent Applications
Patent application number | Title | Published |
20140380204 | REPOSITIONING APPLICATIONS IN A STACK - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. A number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by user gestures or moved off of the screens by other user gestures and therefore hidden. The hidden desktops and screens however can be re-displayed by yet another gesture. The user gestures prevent the user from having to open and close the running desktops and applications, or to execute other user commands that otherwise result in a laborious effort by the user to manage the multiple desktops and applications. One user gesture or input enables a user to change an order of the window stack by simply re-launching a selected desktop or application. | 12-25-2014 |
20130290067 | METHOD AND SYSTEM FOR ASSESSING RISK - The present disclosure relates to a risk module that determines an important set of a plurality of potential risk events for an organization, each member of the important set having no more than a selected probability of occurring but at least a selected significance of impact on the organization; determines whether a mitigation strategy exists for each member of the important set; when a mitigation strategy exists for a selected member of the important set, determines a corresponding mitigated significance of impact for that member; and determines a more important set of the plurality of potential risk events. | 10-31-2013 |
20130156095 | NETWORKED IMAGE/VIDEO PROCESSING SYSTEM - A distributed image/video processing system is disclosed herein wherein one or more digital image/video recorders (e.g., digital cameras, video recorders, or smart phones) are in network communication with a central network site for transmitting image or video data thereto. The recorders process their image/video data dependent upon an estimate of the network bandwidth that is available for transmitting image or video data to the central network site. | 06-20-2013 |
20130156092 | NETWORKED IMAGE/VIDEO PROCESSING SYSTEM AND NETWORK SITE THEREFOR - A distributed image/video processing system is disclosed herein wherein one or more digital image/video recorders (e.g., digital cameras, video recorders, or smart phones) are in network communication with a central network site for transmitting image or video data thereto. The recorders process their image/video data dependent upon an estimate of the network bandwidth that is available for transmitting image or video data to the central network site. | 06-20-2013 |
20130156091 | NETWORKED IMAGE/VIDEO PROCESSING SYSTEM FOR ENHANCING PHOTOS AND VIDEOS - A distributed image/video processing system is disclosed herein wherein one or more digital image/video recorders (e.g., digital cameras, video recorders, or smart phones) are in network communication with a central network site for transmitting image or video data thereto. The recorders process their image/video data dependent upon an estimate of the network bandwidth that is available for transmitting image or video data to the central network site. | 06-20-2013 |
20130086293 | SYSTEMS AND METHODS FOR DOCKING PORTABLE ELECTRONIC DEVICES - Systems and methods for docking portable electronic devices. A master device may be docked to a slave device to control the operation of the slave device. The slave device may have a form factor different than that of the master device. For example, the slave device may be a tablet and the master device may be a handheld device such as a smart phone. The slave device may include a retention mechanism to retain the master device in a docked position with respect to the slave device. When in the docked position, the master device may be in operative communication with one or more hardware components of the slave device to control the operation thereof. The slave device may lack the ability to exploit the full functionality of the one or more hardware components of the slave device without communication with the master device. | 04-04-2013 |
20130080957 | DESKTOP APPLICATION MANAGER: CARD DRAGGING OF DUAL SCREEN CARDS - SMARTPAD - Methods and devices for selecting a card from an application stack, wherein the card represents a corresponding application that a user would like to make active or bring focus to. The selecting includes one or more of a dragging and a tapping action, with these actions being triggers for transitioning the device to an optional drag state or tapped state, respectively. Transitioning through this state executes the activating of a corresponding application or other action on the device to facilitate window/application/desktop management. The selecting further allows a user to specify a touch screen (or portion thereof) on which a particular application should be launched. | 03-28-2013 |
20130080956 | DESKTOP APPLICATION MANAGER: CARD DRAGGING OF DUAL SCREEN CARDS - Methods and devices for selecting a card from an application stack, wherein the card represents a corresponding application that a user would like to make active or bring focus to. The selecting includes one or more of a dragging and a tapping action, with these actions being triggers for transitioning the device to an optional drag state or tapped state, respectively. Transitioning through this state executes the activating of a corresponding application or other action on the device to facilitate window/application/desktop management. The selecting further allows a user to specify a touch screen (or portion thereof) on which a particular application should be launched. | 03-28-2013 |
20130080937 | BROWSER FULL SCREEN VIEW - Methods and devices for minimizing and maximizing display outputs associated with applications are provided. In a preferred embodiment, an application presented with two or more pages can be maximized to present one of the pages on multiple screens of the device. The page that has been maximized is presented as a single, continuous image. The page that is dismissed can be recalled by a user input which returns display of the application on both of the screens, such as the first page of the application on a first screen and the second page of the application on the second screen. The maximized page of the application may include a web browser page in which the user may navigate the web page on either the first or second screen, or on both the first and second screens, as desired. | 03-28-2013 |
20130080929 | MINIMIZING AND MAXIMIZING BETWEEN PORTRAIT DUAL DISPLAY AND PORTRAIT SINGLE DISPLAY - Methods and devices for minimizing and maximizing displayed output associated with applications are provided. More particularly, an application presented as two or more pages in a portrait mode can be minimized to present one of the two or more pages following a minimization operation. The page that continues to be displayed can comprise a primary or preferred page, while the page that is dismissed can comprise a secondary or ancillary page. A maximization operation received with respect to a page of an application results in the display of an additional page associated with that application. Maximization can include controlling the respective screens on which first and second pages of the maximized application are displayed. | 03-28-2013 |
20130076793 | DESKTOP APPLICATION MANAGER: TAPPING DUAL-SCREEN CARDS - Methods and devices for selecting a card from an application stack, wherein the card represents a corresponding application that a user would like to make active or bring focus to. The selecting includes one or more of a dragging and a tapping action, with these actions being triggers for transitioning the device to an optional drag state or tapped state, respectively. Transitioning through this state executes the activating of a corresponding application or other action on the device to facilitate window/application/desktop management. The selecting further allows a user to specify a touch screen (or portion thereof) on which a particular application should be launched. | 03-28-2013 |
20130076663 | SMARTPAD SCREEN MODES - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, a gesture capture region(s) and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 03-28-2013 |
20130076662 | SMARTPAD SCREEN MANAGEMENT - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, a gesture capture region(s) and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 03-28-2013 |
20130076658 | USER FEEDBACK TO INDICATE TRANSITIONS BETWEEN OPEN AND CLOSED STATES - A dual screen user device and methods for generating user feedback to indicate transitional states of the device are disclosed. Feedback is provided to a user of the device concerning transitional states or changes in status of the device, such as whether the device is open or closed, whether a peripheral device has been connected to or disconnected from the device and whether the device has been powered up or down. In a preferred embodiment, one or more vibrators are used to generate vibration and/or audible signals to the user to indicate to the user the particular change in status of the device as it occurs. | 03-28-2013 |
20130076655 | STATE OF SCREEN INFO: EASEL - A multi-display device is adapted to turn on and off certain device functionality based on one or more of device state and triggers. These triggers include a transition trigger, an open trigger and a closed trigger. Furthermore, and based on one or more of these triggers and device state, the device can provide feedback to a user, such as visual feedback, audio feedback and vibration feedback to assist a user with determining when the device is changing state. The operation of the primary screen, secondary screen, system and feedback options are also described relative to the beginning and endpoint of the device transition. Furthermore, the trigger corresponding to a transitional trigger where primary and secondary screens have a certain angle orientation and the trigger corresponding to a trigger point where the primary and secondary screens have a second angle orientation relative to one another are described. | 03-28-2013 |
20130076654 | HANDSET STATES AND STATE DIAGRAMS: OPEN, CLOSED TRANSITIONAL AND EASEL - A multi-display device is adapted to turn on and off certain device functionality based on one or more of device state and triggers. These triggers include a transition trigger, an open trigger and a closed trigger. Furthermore, and based on one or more of these triggers and device state, the device can provide feedback to a user, such as visual feedback, audio feedback and vibration feedback to assist a user with determining when the device is changing state. The operation of the primary screen, secondary screen, system and feedback options are also described relative to the beginning and endpoint of the device transition. Furthermore, the trigger corresponding to a transitional trigger where primary and secondary screens have a certain angle orientation and the trigger corresponding to a trigger point where the primary and secondary screens have a second angle orientation relative to one another are described. | 03-28-2013 |
20130076591 | DETAIL ON TRIGGERS: TRANSITIONAL STATES - A multi-display device is adapted to turn on and off certain device functionality based on one or more of device state and triggers. These triggers include a transition trigger, an open trigger and a closed trigger. Furthermore, and based on one or more of these triggers and device state, the device can provide feedback to a user, such as visual feedback, audio feedback and vibration feedback to assist a user with determining when the device is changing state. The operation of the primary screen, secondary screen, system and feedback options are also described relative to the beginning and endpoint of the device transition. Furthermore, the trigger corresponding to a transitional trigger where primary and secondary screens have a certain angle orientation and the trigger corresponding to a trigger point where the primary and secondary screens have a second angle orientation relative to one another are described. | 03-28-2013 |
20130024521 | MULTIPLE MESSAGING COMMUNICATION OPTIMIZATION - Systems and methods for use with a communication system. A plurality of devices employing a plurality of messaging modalities may be used to send or receive message data. In any regard, message data may comprise a plurality of portions of message data in machine readable form. The plurality of portions of message data may be indexed. The indexing may include storing representations (e.g., vectors or other mathematical constructs) representing the content of the plurality of portions of message data in a message data index. The representations of the content of the plurality of portions of message data may be compared to one another to determine a relationship between the various representations. If the relationship between a first representation (e.g., a first vector) and a second representation (e.g., a second vector) exceeds a predetermined threshold, the portions of message data corresponding to the first and second vectors may be determined to be related. A notification and/or alert may be presented to a user regarding the related portions of message data. In one embodiment, the related portions of message data or messages from which the portions derive may be accessed by a user. | 01-24-2013 |
20130021266 | METHODS OF DISPLAYING A SECOND VIEW - Methods of displaying information on an electronic device having a first touchscreen display and a second touchscreen display are disclosed. As the user enters inputs on the first touchscreen display, the second touchscreen displays the results of the user's inputs on the first touchscreen display. Either touchscreen can be used as an input device. Display information generated by an application can be the same on both displays. Alternatively, the first and second displays comprise a single, larger screen. A single application can address the displays as distinct devices. A touchscreen display can be dedicated to a single application, or a single instance of an application such that the two displays can run different applications, or different instances of the same application. In conjunction with a wireless communications module, the electronic device can be used as a portable teleconferencing device. | 01-24-2013 |
20130021265 | SECOND VIEW - Systems for displaying information on an electronic device have a first touchscreen display and a second touchscreen display. As the user enters inputs on the first touchscreen display, the second touchscreen displays the results of the user's inputs on the first touchscreen display. Either touchscreen can be used as an input device. Display information generated by an application can be the same on both displays. Alternatively, the first and second displays comprise a single, larger screen. A single application can address the displays as distinct devices. A touchscreen display can be dedicated to a single application, or a single instance of an application such that the two displays can run different applications, or different instances of the same application. In conjunction with a wireless communications module, the electronic device can be used as a portable teleconferencing device. | 01-24-2013 |
20130021262 | SYSTEMS AND METHODS FOR RECEIVING GESTURE INPUTS SPANNING MULTIPLE INPUT DEVICES - Systems and methods for operating a handheld computing device having more than one input device. An input event may be received at one of the input devices. The location of the input event may be translated into a location with respect to a global virtual space. This translated location may in turn be passed to an application executing on the device. Additionally, systems and methods are provided for recognition of a gesture input spanning more than one independent input device. | 01-24-2013 |
20130005469 | DUAL SCREEN GAME MODULE - Systems for, and methods of, dual screen gaming modules are described herein. A first screen can be configured in whole or in part as a display for the game or a controller for a game. Similarly, a second screen can be configured as a display or a controller for a game. The two screens are configured to appear as one contiguous screen. Controller functions including size, placement and sensitivity to pressure can be user configured, pre-programmed or learned by the module. | 01-03-2013 |
20130002568 | FULL SCREEN MODE - Systems for, and methods of, dual screen smartphones are described herein. The dual screens are hingedly coupled and can be rotated into an open position forming one contiguous screen. A processor is configured to dismiss any action bar or status bar in response to a prompt thereby maximizing media across the entirety of any of the dual screens or contiguous screen formed by their intersection. An overlay command presented by the processor allows a user to minimize, tile, zoom, pan or dismiss the media when it is maximized. | 01-03-2013 |
20120290946 | MULTI-SCREEN EMAIL CLIENT - An email client having multiple screens that may be displayed in different corresponding ones of a plurality of different display portions of a handheld electronic device. The screens of the email client may be related by way of a dependency relationship and/or may provide for control between the various screens. In one embodiment, the email client includes a folder management screen, a message listing screen, a message detail screen, and an attachment screen. Additionally, the email client may be responsive to received gesture inputs to navigate with respect to the screens and/or perform actions with respect to one or more elements (e.g., messages) of the various screens. | 11-15-2012 |
20120214552 | WINDOWS POSITION CONTROL FOR PHONE APPLICATIONS - Methods and devices for selectively presenting a user interface for a phone application are provided. More particularly, a change in the display mode of a multiple screen device can be determined. More particularly, a presentation of a user interface for a dialer of a phone application can be retained after receiving a flick in an off-screen gesture area of the device. However, a second, child window is provided in a second screen in response to receiving the flick. | 08-23-2012 |
20120174028 | OPENING CHILD WINDOWS IN DUAL DISPLAY COMMUNICATION DEVICES - The present disclosure is directed to methodologies and devices for handling maximizing and minimizing of hierarchically related windows. | 07-05-2012 |
20120144323 | Desktop Reveal By Moving a Logical Display Stack With Gestures - A dual-screen user device and methods are disclosed for revealing a combination of desktops on single and multiple screens. A determined number of desktops and/or running applications are displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be shifted between the screens by user gestures, and/or selected desktop displays. Applications can be moved off of the screens by other user gestures and therefore hidden. Hidden desktops and screens can be re-displayed by other gestures. Desktops and applications are arranged in a window stack that represents a logical order therefore providing a user with an intuitive ability to manage multiple applications and desktops running simultaneously. The user gestures prevent the user from having to open and close the running desktops and applications that otherwise may require laborious efforts by the user to manage the multiple running desktops and applications. | 06-07-2012 |
20120143944 | INTEGRATED HANDSET BROWSER SETTINGS - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a second user environment. A seamless cross-environment workflow is provided in a multi-operating system computing environment. Two or more application programs, running in independent operating systems, share user interaction state information including user data, user settings, and/or application context information. Interaction state information may be shared for applications that are used primarily to access and edit local user content as well as applications that communicate to a remote server or access and navigate other remote content (i.e., Internet-based application, browser). | 06-07-2012 |
20120124490 | FULL-SCREEN ANNUNCIATOR - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Selected desktops and/or running applications are displayed on dual screen displays. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and therefore hidden. Hidden desktops and screens can be re-displayed by other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. User gestures rearrange the order of the applications and desktops in the window stack. One embodiment provides an annunciator display or window extending across both screens in a dual screen configuration of the device. The annunciator window provides alerts, notifications, and statuses of the device in an increased area thereby enhancing viewability of information in the window. | 05-17-2012 |
20120120627 | DUAL SCREEN FOLDING DISPLAY HINGE - Dual screen display devices are disclosed. The device is able to be a mechanical hinge capable of joining multiple screens to form a single display or a substantially continuous display. The device is also capable of splitting a jointed display into separated screens. In some embodiments, the device comprises detaining mechanisms allowing the device to instantly interchange among several pre-defined angles or positions. | 05-17-2012 |
20120119888 | UNIVERSAL REMOTE CONTROL WITH AUTOMATED SETUP - A controller that automatically identifies one or more peripheral devices which need to be programmed for use with the controller. In some embodiments, the controller is able to visually identify a peripheral device from an image, obtain the configuration information for the peripheral device, and program itself in the background according to the configuration information. These tasks are advantageously performed by the remote control, without user input. The controller can be programmed to support a plurality of peripheral devices. When the controller is used to control a peripheral device, the controller may first present a selection list on the display screen. The selection list may include all the peripheral devices that the remote is communicatively coupled with. Upon the user selecting the desired peripheral device to be controlled, the remote control may dynamically output a customized user interface associated with the selected peripheral device. | 05-17-2012 |
20120117495 | DRAGGING AN APPLICATION TO A SCREEN USING THE APPLICATION MANAGER - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Selected desktops and/or running applications are displayed on dual screen displays. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and therefore hidden. Hidden desktops and screens can be re-displayed by yet other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. One user gesture launches an applications management window that provides visual indications of all applications and desktops running at the time. Other gestures rearrange the order of the applications and desktops in the window stack. One particular gesture drags a selected application or desktop appearing in the applications management window to a selected screen. | 05-10-2012 |
20120117290 | SYSTEMS AND METHODS RELATING TO USER INTERFACES FOR DOCKING PORTABLE ELECTRONIC - Systems and methods related to the user interface of docking portable electronic devices. A master device may be docked with a slave device to control operation of the slave device. The master device may be operable to display a user interface. The user interface of the master device may be adapted to be used with the slave device that may include different display and/or input devices than that of the master device. In one embodiment, the master device may be a handheld device such as a smart phone and the slave device may be a tablet device. | 05-10-2012 |
20120110497 | CHANGING STACK WHEN SWAPPING - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Specifically, a determined number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by a user gesture in the form of a pinch gesture. This single user gesture prevents the user from having to open and close the running desktops and applications, or to execute multiple other user commands that otherwise would result in a laborious effort by the user to manage the multiple desktops and applications. | 05-03-2012 |
20120110486 | UNIVERSAL CLIPBOARD - A multi-screen user device and a universal clipboard application are described. Specifically, the universal clipboard application can be open on one screen of the multi-screen device while another application, from which data is either being copied to the clipboard application or pasted from the clipboard application, is open on another screen of the multi-screen device. Inputs received on the screen displaying the universal clipboard application can cause content to be copied to an application being displayed on the other screen of the multi-screen device, thereby providing an intuitive user interface for the universal clipboard application. | 05-03-2012 |
20120105363 | METHOD AND SYSTEM FOR VIEWING STACKED SCREEN DISPLAYS USING GESTURES - An intuitive technique for inputting user gestures into a handheld computing device is disclosed allowing a user to better manipulate different types of screen display presentations, such as desktops and application windows, when performing tasks thereon, wherein a window stack for application windows and/or desktops can be navigated and sequentially displayed according to the window stack ordering without disturbing or changing this ordering. | 05-03-2012 |
20120089992 | USER INTERACTION SUPPORT ACROSS CROSS-ENVIRONMENT APPLICATIONS - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system associated with a first user environment and a desktop operating system associated with a second user environment running concurrently and independently on a mobile computing device. User interaction support includes handling input events initially received in the shared kernel by accepting the input events in the desktop operating system and translating, mapping, and/or passing the input events through a virtual input device to the mobile operating system such that applications of the mobile operating system receive the input events as if coming from a user interaction space of the mobile operating system. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-12-2012 |
20120089906 | CROSS-ENVIRONMENT APPLICATION COMPATIBILITY - A seamless cross-environment workflow is provided in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system and a desktop operating system running concurrently and independently on a mobile computing device. Two or more application programs, running in independent operating systems, share user interaction state information including user data, user settings, and/or application context information. Interaction state information may be shared for applications that are used primarily to access and edit local user content as well as applications that communicate to a remote server or access and navigate other remote content (e.g., Internet-based application, browser, etc.). The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-12-2012 |
20120086717 | INSTANT REMOTE RENDERING - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system and a desktop operating system running concurrently and independently on a mobile computing device. Real-time or instant display of an application running in the mobile operating system within an environment of the desktop operating system is provided by rendering application graphics for the application within the desktop operating system. A console application of the desktop operating system may access surface information for the application from shared memory and render the application within a console window of the computing environment associated with the desktop operating system. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-12-2012 |
20120086716 | USER INTERACTION ACROSS CROSS-ENVIRONMENT APPLICATIONS THROUGH AN EXTENDED GRAPHICS CONTEXT - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system and a desktop operating system running concurrently and independently on a shared kernel of a mobile computing device. User interaction support includes handling input events initially received in the shared kernel by accepting the input events in the desktop operating system and translating, mapping, and/or passing the input events through a virtual input device to the mobile operating system such that applications of the mobile operating system receive the input events as if coming from a user interaction space of the mobile operating system. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-12-2012 |
20120084798 | CROSS-ENVIRONMENT REDIRECTION - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system and a desktop operating system running concurrently and independently on a mobile computing device. Full user interaction support is provided for redirected and/or mirrored applications that are rendered using an extended graphics context. An extended input queue handles input events from virtual input devices for remotely displayed applications. Remotely displayed applications are mapped to separate motion spaces within the input queue. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120084793 | CROSS-ENVIRONMENT EVENT NOTIFICATION - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a second user environment. Cross-environment notification and event handling allows the user to be notified of and respond to events occurring within the mobile operating system through the user environment associated with the desktop operating system. Events that may trigger cross-environment notification may be local events and/or remote events. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120084792 | CROSS-ENVIRONMENT COMMUNICATION USING APPLICATION SPACE API - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a mobile user experience while the desktop operating system provides a full desktop user experience when the mobile computing device is docked to a secondary terminal environment. Applications of the desktop operating system communicate with applications and services of the mobile operating system through a cross-environment communication framework. The cross-environment communication framework may include application programming interfaces through which categories of applications can communicate across a multiple operating system computing environment through category-specific remote communication calls. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120084791 | Cross-Environment Communication Framework - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a secondary terminal environment. Applications of the desktop operating system communicate with applications and services of the mobile operating system through a cross-environment communication framework. The cross-environment communication framework may include interfaces to remotable objects allowing processes in the mobile operating system and processes in the desktop operating system to share memory in a thread-safe manner. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120084739 | FOCUS CHANGE UPON USE OF GESTURE TO MOVE IMAGE - Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, the gesture indicates a request to move an image displayed on the multi-screen device. In response, the image is moved and the focus is placed on the moved image. | 04-05-2012 |
20120084727 | MODAL LAUNCHING - Embodiments are described for handling display of modal windows in a multi-screen device. In embodiments, a modal window will be launched and displayed in a display which receives the input that resulted in the display of the modal window. The other portions of a first display or second display, not displaying the modal window, are made inactive. In other embodiments, the modal window occupies only a first display and the second display remains active. | 04-05-2012 |
20120084726 | MINIMIZING APPLICATION WINDOWS - Embodiments are described for minimizing application windows displayed on one or more displays of a multi-screen device. A device may display a dual screen application which is displayed across both screens. Depending upon which screen is used to launch (or reopen) an application, the dual screen application may be minimized to the other screen. | 04-05-2012 |
20120084725 | MANAGING HIERARCHICALLY RELATED WINDOWS IN A SINGLE DISPLAY - The present disclosure is directed to methodologies and devices for handling the display of hierarchically related windows in a single-screen communication device. | 04-05-2012 |
20120084724 | SLEEP STATE FOR HIDDEN WINDOWS - Systems and methods are provided for changing a user interface for a multi-screen device. The user interface can change based on the movement of a window. The system can receive a user interface event that modifies the display of windows in the user interface. Upon receiving the user interface event, the system determines if a window has been covered or uncovered. If a window has been covered, the window is placed in a sleep state. If a window is uncovered, the window is activated from a sleep state. A sleep state is a window state where an application associated with the window does not receive user interface inputs and/or does not render the window. | 04-05-2012 |
20120084723 | METHOD AND APPARATUS FOR SHOWING STORED WINDOW DISPLAY - Systems and methods are provided for changing a user interface for a multi-screen device. The user interface can change based on the movement of a window. The system can receive a user interface event that modifies the display of windows in the user interface. Upon receiving the user interface event, the system determines if a window has been covered or uncovered. If a window has been covered, the window is placed in a sleep state. If a window is uncovered, the window is activated from a sleep state. A sleep state is a window state where an application associated with the window does not receive user interface inputs and/or does not render the window. Moreover, in a sleep state an image representing the window is maintained in memory. | 04-05-2012 |
20120084722 | MULTIPLE CHILD WINDOWS IN DUAL DISPLAY COMMUNICATION DEVICES - The present disclosure is directed to methodologies and devices for handling maximizing and minimizing of hierarchically related windows. | 04-05-2012 |
20120084721 | WINDOW STACK MODIFICATION IN RESPONSE TO ORIENTATION CHANGE - A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the control of data displayed by at least one of the multiple screens of the multi-screen user device is conditioned upon the relative orientation of the multiple screens, whether the device orientation is changed from a first state to a second state, and user input received. | 04-05-2012 |
20120084720 | MANAGING EXPOSE VIEWS IN DUAL DISPLAY COMMUNICATION DEVICES - The present disclosure is directed to methodologies and devices for handling maximizing and minimizing of exposé views. | 04-05-2012 |
20120084718 | CHANGING THE SCREEN STACK UPON APPLICATION OPEN - Systems and methods are provided for opening a full screen window in a window stack for a multi-screen device. The window stack can change based on the opening of a window. The system can receive a gesture indicating an application with a new window is to be executed or a new window is to be opened in the device. Upon receiving the gesture, the system determines that the new window is to occupy substantially all of a composite display that spans substantially all of the two screens of the device. Then, the system can determine that the full screen window is to be associated with the composite display and create a logic data structure associated with the opened window to describe the position of the opened window in the window stack. | 04-05-2012 |
20120084716 | CHANGING THE SCREEN STACK UPON DESKTOP REVEAL - Systems and methods are provided for revealing a desktop in a window stack for a multi-screen device. The window stack can change based on the revealing of a desktop. The system can receive a gesture indicating an application with the desktop, which was previously created in the stack, is to be revealed on the display of the device. Upon receiving the gesture, the system determines that the desktop is to occupy substantially all of a composite display that spans substantially all of the two or more touch sensitive displays of the device. Then, the system can determine that the desktop is to be associated with the composite display and change a logic data structure associated with the desktop to describe the position of the desktop on the top of the window stack. | 04-05-2012 |
20120084715 | REPOSITIONING APPLICATIONS IN A STACK - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. A number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by user gestures or moved off of the screens by other user gestures and therefore hidden. The hidden desktops and screens however can be re-displayed by yet another gesture. The user gestures prevent the user from having to open and close the running desktops and applications, or to execute other user commands that otherwise result in a laborious effort by the user to manage the multiple desktops and applications. One user gesture or input enables a user to change an order of the window stack by simply re-launching a selected desktop or application. | 04-05-2012 |
20120084714 | WINDOW STACK MODELS FOR MULTI-SCREEN DISPLAYS - Systems and methods are provided for creating a window stack for a multi-screen device. The stack is an arrangement of an active window and at least one other active or inactive window for at least one of the two displays. The system can receive activation of a window in the device. Upon activation of the window, the system can determine a display associated with the active window and can determine a position in the window stack for the active window. Then, the system can generate a logic data structure for the active window to describe the position of the active window in the window stack. | 04-05-2012 |
20120084712 | KEEPING FOCUS AT THE TOP OF THE DEVICE WHEN IN LANDSCAPE ORIENTATION - Systems and methods are provided for adjusting focus during an orientation change. A full screen window has focus before the orientation change. After the device is oriented in the landscape orientation, the focus is maintained on the window. However, a configurable area associated with at least one screen that displays the full screen window is changed to display the configurable area on the screen that is at the top of the device. | 04-05-2012 |
20120084710 | REPOSITIONING WINDOWS IN THE POP-UP WINDOW - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Selected desktops and/or running applications are displayed on dual screen displays. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and therefore hidden. Hidden desktops and screens can be re-displayed by yet other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. One user gesture launches an applications management window that provides visual indications of all of the applications and desktops running at the time, including applications/desktops displayed on the screens. Other gestures can rearrange the order of all of the applications and desktops in the window stack. | 04-05-2012 |
20120084709 | FILLING STACK OPENING IN DISPLAY - The present disclosure is directed to methodologies and devices for quitting an active application and controlling displayed images as a result of quitting the application. | 04-05-2012 |
20120084706 | LAUNCHED APPLICATION INSERTED INTO THE STACK - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Specifically, a determined number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by user gestures, or moved off of the screens by other user gestures and hidden. The hidden desktops and screens can be re-displayed by yet another gesture. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications. Desktops and applications can be selectively launched and added to the window stack. The user can also select where the desktops/applications are to be inserted and where they are first to be displayed after being launched. | 04-05-2012 |
20120084701 | KEYBOARD MAXIMIZATION - Methods and devices for selectively presenting a virtual keyboard are provided. More particularly, a change in the operating mode of a multiple screen device from a multiple screen operating mode to a single screen operating mode, or from a single screen operating mode to a multiple screen operating mode, can be determined. Moreover, a change in the operating mode can effect a change in a presentation of a virtual keyboard. More particularly, a presentation of a virtual keyboard can be retained, where the number of screens of the device in view of the user is changed, provided an application with a keyboard focus remains in view of the user after the change in operating mode. | 04-05-2012 |
20120084700 | KEYBOARD DISMISSED ON CLOSURE OF DEVICE - Methods and devices for selectively presenting a virtual keyboard are provided. More particularly, a change in the operating mode of a multiple screen device between a multiple screen operating mode and a single screen operating mode can be determined. Moreover, a change in the operating mode can effect a change in a presentation of a virtual keyboard. More particularly, a presentation of a virtual keyboard can be discontinued in response to a change in the operating mode of the device, where an application that had keyboard focus prior to the change is, after the change, no longer in view of the user. | 04-05-2012 |
20120084699 | KEYBOARD FILLS BOTTOM SCREEN ON ROTATION OF A MULTIPLE SCREEN DEVICE - Methods and devices for presenting a virtual keyboard are provided. More particularly, in connection with a multiple screen device, a virtual keyboard can be presented in a first mode using portions of both of the screens. In a second mode, the virtual keyboard can be presented using all of one of the screens. Movement between the different modes can be effected by rotating the device between a dual portrait orientation, corresponding to the first mode, and a dual landscape orientation, corresponding to the second mode. More particularly, with the screens of the device in a landscape orientation, one screen can be devoted to present the virtual keyboard while the other screen remains available to present other information. | 04-05-2012 |
20120084698 | SMARTPAD SPLIT SCREEN WITH KEYBOARD - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, a gesture capture region(s) and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120084694 | METHOD AND SYSTEM FOR PERFORMING DRAG AND DROP OPERATIONS ON A DEVICE VIA USER GESTURES - A multi-screen user device and methods for performing a drag and drop operation using finger gestures are disclosed. A first finger gesture is used to select a display area from which data is to be copied. Subsequently, a drag finger gesture is used to identify where the data is to be pasted. The drag may extend across a non-display boundary between a first and second display screen of the multi-screen device. | 04-05-2012 |
20120084693 | MODALS IN DUAL DISPLAY COMMUNICATION DEVICES - The present disclosure is directed to methodologies and devices for handling modals in a set of related windows. | 04-05-2012 |
20120084687 | FOCUS CHANGE UPON USE OF GESTURE - Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, a first image displayed on a first touch sensitive display of a first screen may be currently in focus. In embodiments, the gesture is a tap on a second touch sensitive display of the device. In response to the gesture, focus is changed from the first image on the first touch sensitive display to the second touch sensitive display. | 04-05-2012 |
20120084686 | KEEPING FOCUS DURING DESKTOP REVEAL - Systems and methods are provided for adjusting focus during a desktop reveal. A window has focus before the desktop is revealed. After the window is returned and the desktop hidden, the focus is again placed on the window. Further, a configurable area associated with the screen that displays the window is maintained during the desktop reveal and the return of the window. | 04-05-2012 |
20120084682 | MAINTAINING FOCUS UPON SWAPPING OF IMAGES - Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, the gesture indicates that two images, one that is in focus, swap positions. In response to receiving the gesture, the image in focus is moved from a first display of a first screen to a second display of a second screen. After the images are swapped, the focus is maintained on the image that originally had the focus. | 04-05-2012 |
20120084681 | APPLICATION LAUNCH - Embodiments are described for handling the launching of applications in a multi-screen device. In embodiments, a first touch sensitive display of a first screen receives input to launch an application. In response, the application is launched and a window of the first application is displayed on the first display. A second touch sensitive display of a second screen receives input to launch a second application. In response, the second application is launched and a second window of the second application is displayed on the second display. In embodiments, when an application is launched, it displays the view of the application (whether on the first touch sensitive display or the second touch sensitive display) that was displayed when the application was last closed. | 04-05-2012 |
20120084680 | GESTURE CAPTURE FOR MANIPULATION OF PRESENTATIONS ON ONE OR MORE DEVICE DISPLAYS - An intuitive technique for inputting user gestures into a handheld computing device is disclosed allowing a user to better manipulate different types of screen display presentations, such as desktops and application windows, when performing tasks thereon, e.g., minimization, maximization, moving between display screens, and increasing/decreasing a display thereof across multiple display screens. For manipulating an application window on a device display screen for performing tasks as identified above, user gestures are input to a corresponding gesture capture area for this display screen, wherein this capture area is separate from this display screen. | 04-05-2012 |
20120084679 | KEYBOARD OPERATION ON APPLICATION LAUNCH - Methods and devices for selectively presenting a virtual keyboard are provided. More particularly, upon the receipt of instructions to launch an application, a determination can be made as to whether the application is associated with instructions to receive keyboard focus on launch. If the application is to receive keyboard focus on launch, a virtual keyboard is presented together with the newly launched application. Where an application is not set to receive keyboard focus on launch, a virtual keyboard that is presented when instructions to launch the application are received can be dismissed. | 04-05-2012 |
20120084678 | FOCUS CHANGE DISMISSES VIRTUAL KEYBOARD ON A MULTIPLE SCREEN DEVICE - Methods and devices for providing a virtual keyboard in connection with a multiple screen device are provided. More particularly, information displayed on the screen of a multiple screen device having a current focus of the user is identified, and is presented by a top screen. The virtual keyboard is presented by the bottom screen. The virtual keyboard can be dismissed in response to detecting a change in the focus of the user. | 04-05-2012 |
20120084677 | METHOD AND APPARATUS FOR MOVING DISPLAY DURING A DEVICE FLIP - Systems and methods for moving a display during a device flip are provided. More particularly, in response to input or instructions selecting a secondary closed mode, operation of the device can be altered from a normal operating mode. In the secondary closed mode, when a device is moved from an open mode or state to a closed mode or state, the active window while the device was in the open mode is presented by a secondary screen, rather than a primary screen. The secondary closed mode is exited when the device is again placed in an open state or mode. | 04-05-2012 |
20120084676 | DUAL SCREEN APPLICATION VISUAL INDICATOR - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. A determined number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by user gestures, and can be moved off of the screens by other user gestures and therefore hidden. The hidden desktops and screens however can be re-displayed by yet another gesture. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications, providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. Visual indicators can be used on the displayed applications and desktops enabling a user to maximize a display on multiple screens or to minimize them on a single selected screen. | 04-05-2012 |
20120084675 | ANNUNCIATOR DRAWER - A dual-screen user device and methods for revealing a combination of selected desktops and applications on single and dual screens are disclosed. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and therefore hidden. Hidden desktops and applications can be re-displayed by other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications, providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. One embodiment provides an annunciator window extending across both screens in a dual screen configuration. The annunciator window provides alerts, notifications, and statuses of the device in an increased area, thereby enhancing viewability of the information in the window. The annunciator window can be expanded over a selected screen to view the full contents of the window without having to minimize or close running applications. | 04-05-2012 |
20120084674 | ALLOWING MULTIPLE ORIENTATIONS IN DUAL SCREEN VIEW - A dual-screen user device and methods for revealing a combination of selected desktops and applications on single and dual screens are disclosed. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and therefore hidden. Hidden desktops and applications can be re-displayed by other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications, providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. One embodiment provides the user with the ability to selectively change the orientation of one display while leaving the other display unaffected when two applications or desktops are displayed on the respective screens. The user can therefore selectively change, on a screen-by-screen basis, the orientation of any desktops or applications displayed. | 04-05-2012 |
20120084673 | DRAG/FLICK GESTURES IN USER INTERFACE - The disclosed method and device are directed to navigation, by a dual display communication device, through display objects. | 04-05-2012 |
20120084542 | Multi-Operating System - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a secondary terminal environment. The mobile computing device may be a smartphone running the Android mobile OS and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120084481 | AUTO-WAKING OF A SUSPENDED OS IN A DOCKABLE SYSTEM - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a secondary terminal environment. The desktop operating system may be suspended when the mobile computing device is not docked with a secondary terminal environment and resumed when the mobile computing device is docked with a secondary terminal environment that provides a desktop computing experience. The mobile computing device may be a smartphone running the Android mobile OS and a full desktop Linux OS distribution on a modified Android kernel. | 04-05-2012 |
20120084480 | AUTO-CONFIGURATION OF A DOCKED SYSTEM IN A MULTI-OS ENVIRONMENT - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a secondary terminal environment. The mobile computing device configures the mobile operating system and/or the desktop operating system to take advantage of a docked secondary terminal environment. The mobile computing device may be a smartphone running the Android mobile OS and a full desktop Linux OS distribution on a modified Android kernel. | 04-05-2012 |
20120083319 | RECEIVING CALLS IN DIFFERENT MODES - Embodiments are described for handling receipt of a call in a multi-screen device. In embodiments, the device may be in a closed mode in which a primary screen is being used. A message regarding the incoming call is displayed on the primary screen so that the user can decide whether to answer the call from the primary screen. If the device is being used in a closed secondary screen mode (with the user interacting with the secondary screen) when the call is received, a notice is displayed prompting the user to turn the phone around to the primary screen so that the user can decide whether to answer the call. | 04-05-2012 |
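A short sketch of the call-notification rule in the abstract above. DeviceMode and incomingCallPrompt are names chosen for this illustration only and do not appear in the patent.

```kotlin
// Hypothetical sketch: which prompt is shown depends on the closed mode in use.
enum class DeviceMode { CLOSED_PRIMARY, CLOSED_SECONDARY }

fun incomingCallPrompt(mode: DeviceMode, caller: String): String =
    when (mode) {
        // Closed mode, user on the primary screen: offer the answer decision there.
        DeviceMode.CLOSED_PRIMARY ->
            "Incoming call from $caller: answer or decline on this screen"
        // Closed secondary-screen mode: ask the user to flip to the primary screen.
        DeviceMode.CLOSED_SECONDARY ->
            "Incoming call from $caller: turn the device around to the primary screen to answer"
    }

fun main() {
    println(incomingCallPrompt(DeviceMode.CLOSED_SECONDARY, "Alice"))
}
```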
20120081854 | SMARTPAD SPLIT SCREEN DESKTOP - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081403 | SMARTPAD SPLIT SCREEN - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081401 | FOCUS CHANGES DUE TO GRAVITY DROP - Systems and methods are provided for adjusting focus during an orientation change. A window is displayed on a single display before the orientation change. After the device is rotated to the landscape orientation, the window expands into a full screen window displayed on two displays, and focus is changed to that window. In addition, a configurable area associated with at least one screen displaying the full screen window is moved so that the configurable area appears on the screen at the top of the device. | 04-05-2012 |
20120081400 | DUAL-SCREEN VIEW IN RESPONSE TO ROTATION - A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, a gesture sequence is disclosed which enables a user to toggle or shift through applications that are displayed by the multi-screen user device. The gesture sequence may correspond to various full or partial rotations of the multi-screen user device. | 04-05-2012 |
20120081399 | VISIBLE CARD STACK - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081398 | SMARTPAD SPLIT SCREEN - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081397 | ROTATION GRAVITY DROP - A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the multi-screen user device is conditioned upon the relative position of the multiple screens. A gravity-drop display feature is also disclosed in which data from a first application on a first screen is automatically displayed on a second screen when the device is rotated. | 04-05-2012 |
20120081396 | EXTENDED GRAPHICS CONTEXT WITH COMMON COMPOSITING - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system and a desktop operating system running concurrently and independently on a mobile computing device. The mobile operating system may be associated with a first user environment and the desktop operating system may be associated with a second user environment. Real-time display of applications running in the mobile operating system within the second user environment is provided by rendering the application through an extended graphics context of the mobile operating system. Application graphics for multiple applications are rendered into portions of a single graphics frame. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120081383 | CROSS-ENVIRONMENT USER INTERFACE MIRRORING USING REMOTE RENDERING - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The multi-operating system computing environment may include a mobile operating system and a desktop operating system running concurrently and independently on a mobile computing device. The seamless computing experience includes mirroring the active user interaction space of the mobile operating system to a display of a user environment associated with the desktop operating system. The user interface is rendered by the desktop operating system by accessing surface information of the active user interaction space directly from shared memory. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120081380 | CROSS-ENVIRONMENT USER INTERFACE MIRRORING - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a second user environment. Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The seamless computing experience includes mirroring the active user interaction space of the mobile operating system to a display of a user environment associated with the desktop operating system. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120081354 | EXTENDED GRAPHICS CONTEXT WITH DIVIDED COMPOSITING - A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a second user environment. Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. Real-time display of applications running in the mobile operating system within an environment of the desktop operating system is provided by rendering the application through an extended graphics context of the mobile operating system. Application graphics for multiple applications are rendered into separate graphics frames. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120081353 | APPLICATION MIRRORING USING MULTIPLE GRAPHICS CONTEXTS - Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. Cross-environment rendering allows applications running in a mobile operating system to be displayed in real time within an environment of a desktop operating system. The mobile operating system and the desktop operating system may be running concurrently and independently on a shared kernel of a mobile computing device. A graphics server of the mobile operating system tears down and rebuilds the rendering context for each application as it composites the surface information. The rendering context may be established to match the resolution of the associated display, so that graphics will be appropriately rendered for that resolution. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel. | 04-05-2012 |
20120081323 | APPLICATION LAUNCH - Embodiments are described for handling the launching of applications in a multi-screen device. In embodiments, a first touch sensitive display of a first screen receives input to launch an application. In response, the application is launched. A determination is made as to whether the first touch sensitive display already has windows in its stack. If there are no windows in the stack of the first touch sensitive display, a new window of the first application is displayed on the first touch sensitive display. If there are windows in the stack, a determination is made whether a second display has windows in its stack. If not, the new window is displayed on the second display. If the second display also has windows in its stack, the new window will be displayed on the first touch sensitive display. | 04-05-2012 |
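The launch-placement decision in the abstract above reduces to a small rule. The sketch below is only an illustration of that rule as described; Display, AppWindow, and chooseTargetDisplay are hypothetical names, not the patent's terminology or a real API.

```kotlin
// Hypothetical sketch of the window-placement rule at application launch.
data class AppWindow(val appName: String)

class Display(val id: Int) {
    val stack: MutableList<AppWindow> = mutableListOf()
}

// Rule as described: keep the new window on the display that received the launch
// input if its stack is empty; otherwise prefer an empty second display; if both
// displays already hold windows, stay on the first display.
fun chooseTargetDisplay(first: Display, second: Display): Display = when {
    first.stack.isEmpty() -> first
    second.stack.isEmpty() -> second
    else -> first
}

fun launchApplication(appName: String, first: Display, second: Display): Display {
    val target = chooseTargetDisplay(first, second)
    target.stack.add(AppWindow(appName))
    return target
}

fun main() {
    val primary = Display(1)
    val secondary = Display(2)
    println(launchApplication("browser", primary, secondary).id)  // 1: both stacks empty
    println(launchApplication("email", primary, secondary).id)    // 2: primary occupied, secondary empty
    println(launchApplication("maps", primary, secondary).id)     // 1: both occupied
}
```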
20120081322 | FOCUS CHANGE UPON APPLICATION LAUNCH - Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, a first image displayed on a first touch sensitive display of a first screen may be currently in focus. In embodiments, the gesture is a tap on a second touch sensitive display of the device. In response to the gesture, an application is launched, which displays a second image on a second display of a second screen. Focus is then changed from the first image on the first touch sensitive display to the second image on the second touch sensitive display. | 04-05-2012 |
20120081319 | MODIFYING THE DISPLAY STACK UPON DEVICE OPEN - Systems and methods are provided for displaying a second window for a multi-screen device in response to opening the device. The window stack can change based on the change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. A previously created, but inactive, window in the stack can be displayed on one of the two or more displays comprising the device when opened. The previously created window becomes active and is displayed on the second of the displays after the device is opened. | 04-05-2012 |
20120081318 | DISPLAYING THE DESKTOP UPON DEVICE OPEN - Systems and methods are provided for displaying a desktop for a multi-screen device in response to opening the device. The window stack can change based on the change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. A previously created window in the stack can expand over the area of the two or more displays comprising the device when opened. A desktop expands to fill the display area and is displayed on the second of the displays after the device is opened. | 04-05-2012 |
20120081317 | METHOD AND SYSTEM FOR PERFORMING COPY-PASTE OPERATIONS ON A DEVICE VIA USER GESTURES - A multi-screen user device and methods for performing a copy-paste operation using finger gestures are disclosed. A first finger gesture is used to select a display area from which data is to be copied. Subsequently, a drag finger gesture is used to identify where the data is to be pasted. The drag may extend across a non-display boundary between a first and a second display screen of the multi-screen device. | 04-05-2012 |
20120081316 | OFF-SCREEN GESTURE DISMISSABLE KEYBOARD - Methods and devices for discontinuing the presentation of a virtual keyboard or dismissing a virtual keyboard in response to the receipt of input are provided. More particularly, input entered outside of an area of a screen comprising a touch screen display can cause the presentation of a virtual keyboard to be discontinued. The discontinuance of the virtual keyboard display can be performed simultaneously with an operation indicated by the received input. | 04-05-2012 |
20120081315 | KEYBOARD SPANNING MULTIPLE SCREENS - Methods and devices for presenting a virtual keyboard are provided. More particularly, in connection with a multiple screen device, a virtual keyboard can be presented using portions of both of the screens. More particularly, with the screens of the device in a portrait orientation, the virtual keyboard can span the two screens, such that a first portion of the first screen and a first portion of the second screen operate cooperatively to present the virtual keyboard. | 04-05-2012 |
20120081314 | SMARTPAD SPLIT SCREEN DESKTOP - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081313 | SMARTPAD SPLIT SCREEN DESKTOP - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081312 | SMARTPAD SPLIT SCREEN - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081311 | SMARTPAD ORIENTATION - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081310 | PINCH GESTURE TO SWAP WINDOWS - The disclosed method and device are directed to a communication device that receives, by a gesture capture region and/or a touch sensitive display, a gesture while a first touch sensitive display is displaying a first displayed image and a second touch sensitive display is displaying a second displayed image and, in response, ceases to display the first displayed image on the first touch sensitive display and commences to display it on the second touch sensitive display, while ceasing to display the second displayed image on the second touch sensitive display and commencing to display it on the first touch sensitive display. | 04-05-2012 |
20120081309 | DISPLAYED IMAGE TRANSITION INDICATOR - A dual display communication device includes a gesture capture region to receive a gesture, a first touch sensitive display to receive a gesture and display displayed images (such as a desktop or window of an application), and a second touch sensitive display to receive a gesture and display displayed images. Middleware receives a gesture, the gesture indicating that a displayed image is to be moved from the first touch sensitive display to the second touch sensitive display, such as to maximize a window to cover portions of both displays simultaneously; in response and prior to movement of the displayed image to the second touch sensitive display, moves a transition indicator from the first touch sensitive display to the second touch sensitive display to a selected position to be occupied by the displayed image; and thereafter moves the displayed image from the first touch sensitive display to the second touch sensitive display to the selected position. | 04-05-2012 |
20120081308 | LONG DRAG GESTURE IN USER INTERFACE - The disclosed method and device are directed to navigation, by a dual display communication device, through display objects. | 04-05-2012 |
20120081307 | FLICK MOVE GESTURE IN USER INTERFACE - The disclosed method and device are directed to navigation, by a dual display communication device, through display objects. | 04-05-2012 |
20120081306 | DRAG MOVE GESTURE IN USER INTERFACE - The disclosed method and device are directed to navigation, by a dual display communication device, through display objects. | 04-05-2012 |
20120081305 | SWIPEABLE KEY LINE - Methods and devices for the display of virtual keyboards are provided. More particularly, virtual keyboards that include selectable sets of virtual keys that can be selected or changed through direct input by a user are provided. The user input can include a touch screen input or gesture applied to an area of a display in which a selectable set of virtual keys is presented. The touch screen input or swipe can cause the displayed selectable set of virtual keys to be replaced by an alternate selectable set of virtual keys. | 04-05-2012 |
20120081304 | HARDWARE BUTTONS ACTIVATED BASED ON FOCUS - Methods and devices for providing one or more control buttons in connection with a multiple screen device are provided. More particularly, the screen of a multiple screen device having a current focus is identified, and one or more control buttons are provided as part of or in association with the identified screen. When a change in focus from a previously identified screen to a different screen is detected, the presentation of the one or more control buttons can also change. In particular, control buttons are presented as part of a screen having the current focus, while control buttons are not provided on or in association with a screen that does not have the current focus. | 04-05-2012 |
20120081302 | MULTI-SCREEN DISPLAY CONTROL - A dual-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the dual-screen user device is conditioned upon the type of user gesture or combination of user gestures detected. The display controls described herein can correlate user inputs received in a gesture capture region to one or more display actions, which may include maximization, minimization, or reformatting instructions. | 04-05-2012 |
20120081293 | GRAVITY DROP RULES AND KEYBOARD DISPLAY ON A MULTIPLE SCREEN DEVICE - Methods and devices for presenting or dismissing a virtual keyboard are provided. More particularly, in connection with a multiple screen device, a virtual keyboard can be presented in a first mode using portions of both of the screens. In a second mode, the virtual keyboard can be presented using all of one of the screens, or can be dismissed. Movement between the different modes can be effected by rotating the device between a dual portrait orientation, corresponding to the first mode, and a dual landscape orientation, corresponding to the second mode. More particularly, depending on whether the device is rotated away from or towards information having focus, the display of the virtual keyboard is continued or discontinued. | 04-05-2012 |
20120081292 | DESKTOP REVEAL - A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad. | 04-05-2012 |
20120081289 | KEYBOARD FILLING ONE SCREEN OR SPANNING MULTIPLE SCREENS OF A MULTIPLE SCREEN DEVICE - Methods and devices for presenting a virtual keyboard are provided. More particularly, in connection with a multiple screen device, a virtual keyboard can be presented in a first mode using all of one of the screens. In a second mode, the virtual keyboard can be presented using portions of both of the screens. More particularly, with the screens of the device in a landscape orientation, one screen can be devoted to present the virtual keyboard while the other screen remains available to present other information. In a portrait orientation, the virtual keyboard can span the two screens, such that a first portion of the first screen and a first portion of the second screen operate cooperatively to present the virtual keyboard. | 04-05-2012 |
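The two keyboard modes described in the abstract above map cleanly onto the device orientation. The sketch below is only an illustrative reading of that mapping; Orientation, KeyboardLayout, and keyboardLayoutFor are names invented for this example, not the patent's terms or a real API.

```kotlin
// Hypothetical sketch: keyboard layout mode selected from device orientation.
enum class Orientation { PORTRAIT, LANDSCAPE }

sealed class KeyboardLayout {
    // First mode: one screen is devoted entirely to the keyboard,
    // leaving the other screen free for application content.
    data class FillsOneScreen(val screenIndex: Int) : KeyboardLayout()

    // Second mode: the keyboard spans a portion of each of the two screens,
    // which operate cooperatively to present it.
    object SpansBothScreens : KeyboardLayout()
}

fun keyboardLayoutFor(orientation: Orientation): KeyboardLayout = when (orientation) {
    Orientation.LANDSCAPE -> KeyboardLayout.FillsOneScreen(screenIndex = 1)
    Orientation.PORTRAIT -> KeyboardLayout.SpansBothScreens
}

fun main() {
    println(keyboardLayoutFor(Orientation.LANDSCAPE))  // one screen devoted to the keyboard
    println(keyboardLayoutFor(Orientation.PORTRAIT))   // keyboard spans both screens
}
```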
20120081280 | SINGLE-SCREEN VIEW IN RESPONSE TO ROTATION - A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, a gesture sequence is disclosed which enables a user to toggle or shift through applications that are displayed by the multi-screen user device. The gesture sequence may correspond to various full or partial rotations of the multi-screen user device. | 04-05-2012 |
20120081271 | APPLICATION DISPLAY TRANSITIONS BETWEEN SINGLE AND MULTIPLE DISPLAYS - A multi-screen user device and methods for controlling data displayed are disclosed. The data displayed by the multiple screens of the device is dependent on the physical orientation of the device, whether the content for an application is displayed across a plurality of the multiple screens, and whether the data being displayed for the application originated from a single-screen application or a multi-screen application. | 04-05-2012 |
20120081270 | DUAL SCREEN APPLICATION BEHAVIOUR - A multi-screen user device and methods for logically controlling the display behavior of applications and other displayable data are disclosed. Specifically, when a dual-screen application is displayed on the primary and secondary screens of a device, the dual-screen application can be minimized to a single screen by closing the device. When the device is reopened, the minimized dual-screen application is maintained in a minimized state on the screen to which it was minimized. Specific rules can be implemented to determine whether the dual-screen application will be maximized to run on the primary and secondary screens of the device upon reopening. | 04-05-2012 |
20120081269 | GRAVITY DROP - A dual-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the dual-screen user device is conditioned upon the relative position of the multiple screens and whether the data being displayed originated from a single-screen application or a multi-screen application. | 04-05-2012 |
20120081268 | LAUNCHING APPLICATIONS INTO REVEALED DESKTOP - A dual-screen user device and methods for launching applications from a revealed desktop onto a logically chosen screen are disclosed. Specifically, a user reveals the desktop and then launches a selected application from one of two desktops displayed on the primary and secondary screens of a device. When the application is launched, it is displayed on a specific screen depending on the input received and the logical rules determining the display output. As the application is displayed on that screen, the desktop is removed from display and the opposite screen can display other data. | 04-05-2012 |
20120081267 | DESKTOP REVEAL EXPANSION - A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Specifically, a determined number of desktops is displayed on at least one of the screens of the device, conditioned upon the input received and the state of the device. Where a screen of the device is determined to be inactive, the desktop is not displayed on that screen but is stored by the device in a virtually displayed state. Upon receiving input indicating that the inactive screen has become active, the device displays the desktop on that screen. | 04-05-2012 |
20120079323 | HIGH SPEED PARALLEL DATA EXCHANGE WITH TRANSFER RECOVERY - Systems and methods for transfer of data are disclosed that include establishing two separate connections: a high speed connection and a high integrity connection. Blocks of data are exchanged over the high speed connection while the high integrity connection facilitates communication of descriptor data regarding the data received over the high speed connection. As such, the data transfer speed of the high speed connection is utilized while communication via the high integrity connection allows for data reliability features not provided by the high speed connection. | 03-29-2012 |
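The dual-connection scheme in this last abstract lends itself to a short illustration. The sketch below is only a conceptual reading of the abstract: the connection interfaces, the CRC32-based descriptors, and the resend-until-acknowledged loop are assumptions made for this example, not the patented protocol.

```kotlin
// Conceptual sketch: bulk blocks over a fast channel, descriptors/acks over a reliable one.
import java.util.zip.CRC32

data class BlockDescriptor(val blockId: Int, val checksum: Long)

interface HighSpeedConnection {        // fast bulk channel; may drop or corrupt blocks
    fun sendBlock(blockId: Int, payload: ByteArray)
}

interface HighIntegrityConnection {    // slower but reliable control channel
    fun sendDescriptor(descriptor: BlockDescriptor)
    fun receiveAck(): BlockDescriptor? // next verified-block acknowledgment, or null if none pending
}

fun checksumOf(payload: ByteArray): Long = CRC32().apply { update(payload) }.value

// Sender side: push every pending block over the fast channel, describe it over the
// reliable channel, then resend anything the receiver never acknowledged as verified.
fun transfer(blocks: List<ByteArray>, fast: HighSpeedConnection, control: HighIntegrityConnection) {
    val pending = blocks.indices.toMutableSet()
    while (pending.isNotEmpty()) {
        for (id in pending) {
            fast.sendBlock(id, blocks[id])
            control.sendDescriptor(BlockDescriptor(id, checksumOf(blocks[id])))
        }
        generateSequence { control.receiveAck() }.forEach { pending.remove(it.blockId) }
    }
}

fun main() {
    // Minimal in-memory loopback: the "fast" channel drops the first copy of block 1
    // to exercise the recovery path; the control channel never loses acknowledgments.
    val delivered = HashMap<Int, ByteArray>()
    val acks = ArrayDeque<BlockDescriptor>()
    var dropOnce = true

    val fast = object : HighSpeedConnection {
        override fun sendBlock(blockId: Int, payload: ByteArray) {
            if (blockId == 1 && dropOnce) { dropOnce = false; return }  // simulated loss
            delivered[blockId] = payload
        }
    }
    val control = object : HighIntegrityConnection {
        override fun sendDescriptor(descriptor: BlockDescriptor) {
            val payload = delivered[descriptor.blockId] ?: return      // block never arrived
            if (checksumOf(payload) == descriptor.checksum) acks.addLast(descriptor)
        }
        override fun receiveAck(): BlockDescriptor? = acks.removeFirstOrNull()
    }

    transfer(listOf("alpha", "beta", "gamma").map { it.toByteArray() }, fast, control)
    println("received blocks: " + delivered.keys.sorted())  // [0, 1, 2]
}
```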