Patent application number | Description | Published |
--- | --- | --- |
20090146803 | Monitoring and Notification Apparatus - The disclosure relates to monitoring and notification apparatus capable of monitoring events at various locations. The apparatus includes a sound-receiving unit which receives audio content from various locations. A user can select which of the locations is monitored at any one time. In one embodiment, this selection is made depending on the orientation of the sound-receiving unit. | 06-11-2009 |
20090147649 | Sound Playback and Editing Through Physical Interaction - The disclosure relates to sound playback and editing apparatus. The editing apparatus allows the user to modify recorded sound intuitively by converting a quality of the user's physical interactions with the apparatus into instructions for processing the sound. For example, in one embodiment the user can mix sound files by ‘mixing’, i.e. shaking, physical representations of those sound files (such as the recording medium on which the files are stored), alone or together. | 06-11-2009 |
20090180623 | Communication Devices - The disclosure relates to communication devices which monitor an audio environment at a remote location and convey to a user a representation of that audio environment. The “representation” may be an abstraction of the audio environment at the remote location, or it may be a measure of decibels or some other quality or parameter of the audio environment. In some embodiments, the communication devices are two-way devices which allow users at remote locations to share an audio environment. In some embodiments, the communication devices are one-way devices. In some embodiments, the communication devices may have the form of a window and be arranged to present sound in a manner that mimics sound received through a window. In such embodiments, the more open the window is, the more sound is relayed by the communication device. | 07-16-2009 |
20110007078 | Creating Animations - Animation creation is described, for example, to enable children to create, record and play back stories. In an embodiment, one or more children are able to create animation components such as characters and backgrounds using a multi-touch panel display together with an image capture device. For example, a graphical user interface is provided at the multi-touch panel display to enable the animation components to be edited. In an example, children narrate a story whilst manipulating animation components using the multi-touch display panel and the sound and visual display is recorded. In embodiments image analysis is carried out automatically and used to autonomously modify story components during a narration. In examples, various types of handheld view-finding frames are provided for use with the image capture device. In embodiments saved stories can be restored from memory and retold from any point with different manipulations and narration. | 01-13-2011 |
20130162653 | Creating Animations - Animation creation is described, for example, to enable children to create, record and play back stories. In an embodiment, one or more children are able to create animation components such as characters and backgrounds using a multi-touch panel display together with an image capture device. For example, a graphical user interface is provided at the multi-touch panel display to enable the animation components to be edited. In an example, children narrate a story whilst manipulating animation components using the multi-touch display panel and the sound and visual display is recorded. In embodiments image analysis is carried out automatically and used to autonomously modify story components during a narration. In examples, various types of handheld view-finding frames are provided for use with the image capture device. In embodiments saved stories can be restored from memory and retold from any point with different manipulations and narration. | 06-27-2013 |
20150199844 | Tangibilization of Geocoded Data - Data points that include geolocation data are obtained. Frequency values are determined that represent the frequencies of the sets of data points associated with the respective geolocations, and the frequency values are normalized. A georepresentation of the data points is then generated as a tangible 3-D model: the geolocation data determines the location perspective of the data points for mapping them onto the 3-D model, and the normalized frequency values determine sensory attributes of the model at the locations of the mapped data points, with each sensory attribute representing a range of frequency values. | 07-16-2015 |
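The counting and normalization steps described in the last entry (20150199844) can be sketched as follows. This is a minimal illustration of the general technique, not the patented method; all function names, the bucket thresholds, and the attribute labels are hypothetical choices made for this sketch.

```python
from collections import Counter

def normalized_frequencies(points):
    """Count how many data points fall at each geolocation and
    normalize the counts to the range (0, 1].

    `points` is a list of (lat, lon) tuples; returns a dict mapping
    each geolocation to its normalized frequency value.
    """
    counts = Counter(points)
    max_count = max(counts.values())
    return {loc: c / max_count for loc, c in counts.items()}

def sensory_attribute(norm_freq):
    """Map a normalized frequency value to a sensory-attribute bucket,
    each bucket representing a range of frequency values (hypothetical
    thresholds; on a tangible 3-D model these might correspond to
    relief height or surface texture)."""
    if norm_freq <= 0.33:
        return "low"
    if norm_freq <= 0.66:
        return "medium"
    return "high"

# Two points at one geolocation, one at another:
points = [(51.5, -0.1), (51.5, -0.1), (48.9, 2.4)]
freqs = normalized_frequencies(points)
attrs = {loc: sensory_attribute(f) for loc, f in freqs.items()}
```

In this sketch, the geolocation seen twice normalizes to 1.0 and maps to the "high" bucket, while the geolocation seen once normalizes to 0.5 and maps to "medium"; a fabrication step (not shown) would then render each bucket as a distinct tactile feature on the physical model.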