Synchronization

Subclass of:

386 - Television signal processing for dynamic recording or reproducing

386200000 - WITH INTERFACE BETWEEN RECORDING/REPRODUCING DEVICE AND AT LEAST ONE OTHER LOCAL DEVICE

Patent class list (only non-empty classes are listed)

Deeper subclasses:

Class / Patent application number | Description | Number of patent applications / Date published
386219000 Digital playback device to display device 12
386207000 Synchronization correction 9
386210000 Camera source to digital recording device 7
386203000 With variable delay 4
20140254999DATA ACCESS CONTROL METHOD AND APPARATUS - One embodiment of the present invention discloses a data access method comprising: (a) receiving a plurality of data units consisting of filtered data units and un-filtered data units; (b) filtering the filtered data units; (c) storing the un-filtered data units; (d) recovering the timings of the un-filtered data units stored in the step (c) according to the received timings of the data units received in the step (a); (e) inserting replacement data units to replace the filtered data units, wherein each of the replacement data units has the same timing as each of the filtered data units; and (f) outputting the un-filtered data units and the replacement data units according to the timings of the un-filtered data units and the timing of the replacement data unit.09-11-2014
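The steps (a) through (f) amount to replacing the filtered units with placeholders that preserve the original timings. A minimal Python sketch, using a hypothetical DataUnit structure (the application does not define a data model):

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class DataUnit:
        timing: float             # received timing of the unit
        payload: Optional[bytes]  # None for replacement units
        filtered: bool = False    # True if this unit is to be filtered out

    def access_control(received: List[DataUnit]) -> List[DataUnit]:
        output = []
        for unit in received:
            if unit.filtered:
                # (b)/(e): filter the unit but insert a replacement that keeps its timing
                output.append(DataUnit(timing=unit.timing, payload=None))
            else:
                # (c)/(d): store the un-filtered unit with its received timing
                output.append(DataUnit(timing=unit.timing, payload=unit.payload))
        # (f): output all units according to their timings
        return sorted(output, key=lambda u: u.timing)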
20140376873VIDEO-AUDIO PROCESSING DEVICE AND VIDEO-AUDIO PROCESSING METHOD - A video-audio processing device including a video output unit, an audio output unit, an audio transmitting unit, a controlling unit which switches an operation mode between (a) a first mode in which the audio signal is outputted from the audio output unit and the audio signal is transmitted from the audio transmitting unit and (b) a second mode in which the video signal is outputted from the video output unit and the audio signal is transmitted from the audio transmitting unit, a receiving unit which receives an input of delay information indicating an audio delay amount during the first mode, an audio delaying unit which delays the audio signal according to the audio delay amount, and a video delaying unit which delays the video signal by a video delay amount that is in accordance with the audio delay amount during the second mode.12-25-2014
20150063774INTERCONNECTED MULTIMEDIA SYSTEMS WITH SYNCHRONIZED PLAYBACK OF MEDIA STREAMS - Synchronous playback of time-based media received from one or more locations remote from a primary editing/mixing studio is achieved by time-stamping media samples with a local presentation time before streaming them to the primary studio. At the primary studio, samples having the same presentation timestamp are played back at the same time, independently of the samples' arrival time at the playback system. Media stored locally to the playback system may also be included as part of the synchronous playback using locally computed presentation times. In order to accommodate media streaming transmission delays, the playback system negotiates a suitable delay with the remote systems such that samples corresponding to a given presentation time are received at the playback system from remote locations prior to playback of media corresponding to the given presentation time.03-05-2015
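A rough sketch of the presentation-timestamp scheme described above, using invented names (Sample, negotiate_delay, pop_due_samples) rather than anything published in the application:

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Sample:
        presentation_time: float                            # stamped at the remote studio
        payload: bytes = field(compare=False, default=b"")

    def negotiate_delay(measured_transit_times, margin=0.2):
        # Choose a delay no smaller than the worst observed transit time, so a
        # sample always arrives before its (delayed) presentation time.
        return max(measured_transit_times) * (1.0 + margin)

    def pop_due_samples(queue, now, delay):
        # Samples sharing a presentation_time play together, regardless of when
        # each one arrived from its remote location.
        due = []
        while queue and queue[0].presentation_time + delay <= now:
            due.append(heapq.heappop(queue))
        return due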
20150319405Method for Synchronizing A/V Streams - A device may communicate a delay request value to downstream devices in order to synchronize A/V streams. For example, rather than adding the delay required to synchronize audio and video streams, an A/V splitter may instead communicate to downstream devices, asking them to add the amount of delay desired.11-05-2015
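The idea reduces to sending the required delay downstream instead of buffering locally; a toy illustration with an invented message format:

    def make_delay_requests(audio_latency_ms, video_latency_ms):
        # Ask the faster path to wait so both paths line up at the sinks.
        skew = video_latency_ms - audio_latency_ms
        if skew > 0:
            return {"audio_sink": {"add_delay_ms": skew},
                    "video_sink": {"add_delay_ms": 0}}
        return {"audio_sink": {"add_delay_ms": 0},
                "video_sink": {"add_delay_ms": -skew}}

    print(make_delay_requests(audio_latency_ms=20, video_latency_ms=120))
    # {'audio_sink': {'add_delay_ms': 100}, 'video_sink': {'add_delay_ms': 0}}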
386216000 Playback device to digital recorder 3
20150304708DIGITAL DECODER HAVING A SO-CALLED "PLAYBACK" MODE OF OPERATION AND COMPRISING TWO BUFFER MEMORIES - A digital decoder for a television receiver comprises an input (E) for receiving a digital audio/video signal (SAV), means for demodulating and decoding the input signal into an output signal intended for the television receiver, and a first buffer memory (10-22-2015
20160086632AUTOMATED CAPTURE OF IMPAIRED VIDEO - Disclosed are a method, system, and apparatus for storing a video segment for analysis. Accordingly, a digital video player receives a digital video, detects a defect in the digital video, and stores a segment of the digital video that contains the defect. The system includes a digital video player that includes a receiver, a decoder coupled to the receiver, and a memory coupled to the decoder. The digital video player optionally comprises a transmitter. The apparatus implements the method for storing a video segment for analysis.03-24-2016
20170236552IMAGING DEVICE AND PLAYBACK DEVICE08-17-2017
386202000 External synchronization for phase or frequency correction 3
20110110642SYSTEM FOR MODIFYING THE TIME-BASE OF A VIDEO SIGNAL - A storage device 05-12-2011
20150326911Synchronisation of Audio and Video Playback - A method of playing audio content to a viewer in synchronisation with a video content. The method comprises receiving and digitising an ultrasonic signal comprising ultrasonic synchronisation signal(s) which encode a respective timecode through modulation of ultrasonic carrier signal(s), and identifying digitised ultrasonic synchronisation signals which are decoded to determine the corresponding timecode by sampling the digitised ultrasonic signal and eliminating estimated background noise component(s) from the frequency spectrum of the samples, binarising the corrected frequency spectrum corresponding to the timecode-carrying part of the digitised ultrasonic synchronisation signal to generate a representation of the timecode, and determining the corresponding timecode from the representation. A stored audio content is played back from a playback point determined based on the timecode.11-12-2015
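A very simplified sketch of the decoding chain (noise-floor removal, binarisation, timecode extraction), assuming an on/off-keyed ultrasonic carrier; the real modulation scheme is not specified in the abstract. Requires numpy.

    import numpy as np

    def decode_timecode(samples, rate, carrier_hz, bit_duration, n_bits):
        bits = []
        spb = int(rate * bit_duration)                 # samples per bit
        for i in range(n_bits):
            chunk = samples[i * spb:(i + 1) * spb]
            spectrum = np.abs(np.fft.rfft(chunk))
            freqs = np.fft.rfftfreq(len(chunk), 1.0 / rate)
            noise_floor = np.median(spectrum)          # crude background estimate
            corrected = np.clip(spectrum - noise_floor, 0.0, None)
            # "Binarise": is there energy at the carrier bin well above the floor?
            carrier_bin = int(np.argmin(np.abs(freqs - carrier_hz)))
            bits.append(1 if corrected[carrier_bin] > 3.0 * noise_floor else 0)
        # Interpret the bit string as a binary timecode (assumption).
        return int("".join(map(str, bits)), 2)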
20160172006CLOCK RECOVERY FOR VIDEO ENCODING/TRANSCODING APPLICATIONS06-16-2016
386208000 Sync block 2
20140186003File Format for Synchronized Media - Metadata defining decoding and rendering instructions for media content to be co-rendered in a media presentation is divided and distributed as track fragments provided in different media container files. Track fragment adjustment information is included in at least one such track fragment in order to define rendering timing relationships between media content portions defined by the track fragments in a current media container file. The rendering timing relationships enable a correct time alignment of the playback of the media content to be co-rendered to achieve a synchronized media presentation. The track fragment adjustment information is particularly advantageous in connection with tuning in or a random access in a stream of media container files comprising fragmented metadata.07-03-2014
20150380056Video Channel Display Method and Apparatus - Methods for video display using a computing system. The computing system includes a main computing module and an ancillary computing module. The main computing module may transmit a synchronization control information block to the ancillary computing module. The synchronization control information block includes a frame number of a current frame and the reference time associated with the main computing module. The ancillary computing module receives the synchronization control information block and selects a frame pack having the same frame number contained in the synchronization control information block as the current frame. The ancillary computing module may obtain the reference time of the current frame based on a local time of the ancillary computing module. The main computing module and the ancillary computing module may decode one or more parts of the frame, respectively. Further, the decoded parts of the frame may be combined and displayed.12-31-2015
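A small sketch of the synchronization control information block and the frame-pack selection described above; field names are assumptions for illustration:

    from dataclasses import dataclass

    @dataclass
    class SyncControlBlock:
        frame_number: int      # current frame handled by the main computing module
        reference_time: float  # reference time associated with the main module

    def pick_frame_pack(frame_packs, block, local_clock_offset):
        # The ancillary module selects the pack carrying the same frame number and
        # derives the frame's reference time from its own local clock.
        pack = next((p for p in frame_packs
                     if p["frame_number"] == block.frame_number), None)
        local_reference_time = block.reference_time + local_clock_offset
        return pack, local_reference_time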
386204000 With phase lock loop (e.g., ProcAmp, PLL, etc.) 1
20110194831DEVICE AND METHOD FOR CONTROLLING CLOCK RECOVERY - A clock recovery device includes a PLL circuit and a tuning circuit. The PLL circuit includes a first frequency divider, a second frequency divider, and a clock recovery unit. The first frequency divider divides a first frequency of the input clock by a first divisor to generate a reference signal. The second frequency divider divides a second frequency of the output clock by a second divisor to generate a feedback signal. The clock recovery unit is coupled to the first frequency divider and the second frequency divider, for re-building and providing the output clock according to the reference signal and the feedback signal. The tuning circuit is coupled to the PLL circuit, for tuning at least one of the first divisor and the second divisor of the PLL circuit according to a buffer status information of a data buffer.08-11-2011
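When the PLL is locked, the reference and feedback signals run at the same frequency, so f_out = f_in * N2 / N1; the tuning circuit then only has to nudge a divisor when the data buffer drifts. The nudging rule below is an illustrative assumption:

    def recovered_clock_hz(f_in_hz, n1, n2):
        # Locked condition: f_in / n1 == f_out / n2  =>  f_out = f_in * n2 / n1
        return f_in_hz * n2 / n1

    def tune_divisor(n2, buffer_fill, low=0.25, high=0.75):
        if buffer_fill > high:   # data piling up: speed the output clock up
            return n2 + 1
        if buffer_fill < low:    # data draining: slow the output clock down
            return n2 - 1
        return n2

    print(recovered_clock_hz(27_000_000, n1=1000, n2=1001))  # ~27.027 MHz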
386222000 Controlling speed of disk 1
20100310229VIDEO PROCESSING APPARATUS AND VIDEO PROCESSING METHOD - An apparatus, which controls a playback speed of a video content to enable the sound included therein to be in a well-audible range, is provided. A playback speed range is calculated based on sound characteristic information obtained by analyzing audio data of a video content and a predetermined sound parameter. A specific playback speed in the playback speed range is calculated based on a selected playback speed, and a video content is played back at the playback speed.12-09-2010
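A hypothetical sketch of clamping a selected speed into a range where the audio stays audible; the mapping from sound characteristics to a range is reduced here to a simple speech/non-speech flag:

    def playback_speed_range(contains_speech, max_speech_rate=1.5, max_other_rate=2.5):
        return (0.5, max_speech_rate if contains_speech else max_other_rate)

    def effective_speed(selected_speed, contains_speech):
        lo, hi = playback_speed_range(contains_speech)
        return min(max(selected_speed, lo), hi)

    print(effective_speed(2.0, contains_speech=True))   # 1.5
    print(effective_speed(2.0, contains_speech=False))  # 2.0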
386206000 For disk/disc trick play 1
20110064374RECORDING MEDIUM, REPRODUCTION DEVICE, PROGRAM, REPRODUCTION METHOD - A BD-ROM stores PlayList information. The PlayList information defines a playback section of each of a plurality of AV clips and includes MainPath information and SubPath information. The MainPath information designates one of the AV clips as a Main Clip and defines a portion of the Main Clip as a primary playback section. The SubPath information designates another one of the AV clips as a SubClip and defines a portion of the SubClip as a secondary playback section that is to be played back in synchronism with the primary playback section. The BD-ROM stores, the one of the AV clips designated as the SubClip along with an EP_map. The EP_map shows a plurality of entry points on the SubClip in a one-to-one correspondence with entry times on the SubClip timeline.03-17-2011
386205000 For tape trick play 1
20120039578System and method for distributed trick play resolution using user preferences - A media system and method for distributed trick play resolution using user preferences. The method of distributed trick play resolution in a distributed media group network, includes: determining trick play preferences at each of a plurality of peer nodes in the distributed media group network with respect to a media item; and resolving conflicting trick play preferences between the peer nodes based on the determined trick play preferences. Other embodiments are disclosed.02-16-2012
Entries
Document | Title | Date
20100329630SYSTEM AND METHOD FOR AN EARLY START OF AUDIO-VIDEO RENDERING - The present invention relates to a method at a receiver for playing a stream comprising a set of video samples and a set of audio samples, said audio samples and said video samples being adapted to be rendered at a standard rendering speed in a synchronized manner, comprising at the receiver the steps of starting the reception of the stream, starting the rendering of the video samples at a speed slower than the standard rendering speed and accelerating the rendering speed up to the standard rendering speed. The invention also concerns methods for an early rendering of audio samples, when the stream comprises a set of audio samples, wherein the audio and video samples are adapted to be rendered at a standard rendering speed in a synchronized manner.12-30-2010
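The "start slow, then accelerate" rendering can be pictured as a speed ramp; the linear ramp and its parameters below are assumptions, not values from the application:

    def rendering_speed(t, ramp_seconds=5.0, initial_speed=0.75):
        # Speed factor applied t seconds after playback starts: begin below 1.0
        # so the receive buffer can fill, then ramp linearly up to standard speed.
        if t >= ramp_seconds:
            return 1.0
        return initial_speed + (1.0 - initial_speed) * (t / ramp_seconds)

    for t in (0.0, 2.5, 5.0):
        print(t, rendering_speed(t))  # 0.75, 0.875, 1.0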
20110013877Audiovisual (AV) Device and Control Method Thereof - According to one embodiment, video image and audio signals are transmitted to a plurality of electronic devices each having at least one of a video image display function and an audio reproducing function. Information indicating the fact that at least one of a video image and audio has been set in a mute state is acquired from each of the electronic devices. Based on the information, a processing operation is applied to the video image and audio signals supplied to each of the electronic devices.01-20-2011
20110033166SYSTEM AND METHOD FOR INTERNET ACCESS TO A PERSONAL TELEVISION SERVICE - A communication system and a family of methods for remote access to personal television service are disclosed. According to this invention, a remote personal TV service center provides centralized program guide information. A user may access the personal TV service center through a digital video recorder which is connected to the personal TV service center via telephone modem or a network server. A user may also access the personal TV service center through a remote computer terminal or a personal digital assistant which is connected to a computer network. The user selects program events and programs the digital video recorder by using a graphical user interface on the front panel of the digital video recorder in the case of local programming, or using a similar GUI which is incorporated into the Web pages presented to remote users by a Web server in the case of remote programming. The media stream stored in one digital video recorder may be transferred to another digital video recorder. For data security protection during data transfer, all communications are authenticated and encrypted.02-10-2011
20110052136PATTERN-BASED MONITORING OF MEDIA SYNCHRONIZATION - Reference media data and monitored media data are accessed. Media data may be accessed as streams of media data, as media data stored in a memory, or any combination thereof. A first pattern of first media content (e.g., a video event) and a second pattern of second media content (e.g., an audio event) are identified in the reference media data, and their corresponding counterparts are identified in the monitored media data as a third pattern of first media content (e.g., a video event) and a fourth pattern of second media content (e.g., an audio event). After these patterns are identified, a first time interval is determined between two of the patterns, and a second time interval is determined between two of the patterns. A difference between the two time intervals is then determined and stored in a memory. This difference may be presented as a media synchronization error.03-03-2011
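The measurement reduces to comparing the event spacing in the reference material with the same spacing in the monitored output; a minimal sketch:

    def sync_error(ref_video_t, ref_audio_t, mon_video_t, mon_audio_t):
        reference_interval = ref_audio_t - ref_video_t
        monitored_interval = mon_audio_t - mon_video_t
        return monitored_interval - reference_interval  # > 0: audio lags further behind

    # Audio trails video by 0.10 s in the reference but by 0.18 s in the
    # monitored stream: roughly 80 ms of added lip-sync error.
    print(sync_error(10.0, 10.10, 42.0, 42.18))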
20110064373METHOD AND SYSTEM FOR LOOK DATA DEFINITION AND TRANSMISSION OVER A HIGH DEFINITION MULTIMEDIA INTERFACE - A method and system are provided for look data definition and transmission over a high definition multi-media interface (HDMI). The method includes generating metadata for video content. The metadata is used for altering the video content before display thereof by accounting for variations between different display devices and variations between different creative intents by a content creator. The method further includes preparing the video content and the metadata for transmission over a high definition multimedia interface.03-17-2011
20110103763SYSTEM AND METHOD FOR IDENTIFYING, PROVIDING, AND PRESENTING SUPPLEMENTAL CONTENT ON A MOBILE DEVICE - The present invention is embodied in a system for synchronizing a mobile device with video output by a video output device. In one embodiment, the system comprises a data reader, a remote control, and a router. The data reader is configured to be connected to a video output device for reading digital codes associated with video or audio signals output by the video output device. The remote control has a user input mechanism and is configured to transmit an activation signal to the data reader after activation of the user input mechanism. The router is configured to be connected to a wide area network for transmitting data from the data reader to a remote data server. The router is also configured to establish a local data connection between the data reader and a mobile device. The data reader, after receiving an activation signal from the remote control, is further configured to transmit data indicative of a digital code or series of digital codes read by the data reader to the remote data server via the router. Additionally, the data reader, after receiving a request signal from the mobile device, is further configured to transmit data indicative of a digital code or series of digital codes read by the data reader to the mobile device via the router.05-05-2011
20110110641METHOD FOR REAL-SENSE BROADCASTING SERVICE USING DEVICE COOPERATION, PRODUCTION APPARATUS AND PLAY APPARATUS FOR REAL-SENSE BROADCASTING CONTENT THEREOF - Provided is a method for a real-sense broadcasting service using device cooperation. The method for the real-sense broadcasting service may map and control synchronization of real-sense reproduction devices around a user, deviating from an existing real-sense broadcasting based on an image and a sound, and may reproduce a real-sense effect using cooperation of a device group with respect to a particular effect.05-12-2011
20110129194DIGITAL VIDEO RECORDING/PLAYBACK APPARATUS - A digital video recording/playback apparatus is provided in which maximum delay can be easily changed in accordance with the bit rate of inputted video data. The digital video recording/playback apparatus includes a memory unit, a playback unit, a synchronization unit, a synchronization signal counter, and a controller. The synchronization signal counter counts a synchronization signal to generate a frame counter value. The controller outputs a reading start instruction for the video data to the memory unit, and a decoding start instruction for the video data to the playback unit. The controller is configured to determine a first process completion time limit represented by the frame counter value as a process completion time limit for the memory unit and to determine a second process completion time limit represented by the frame counter value as a process completion time limit for the playback unit.06-02-2011
20110211802DISPLAY CONTROLLING APPARATUS FOR PLAYING BACK MOVING IMAGE AND CONTROL METHOD FOR THE SAME - Information regarding frames of a moving image, which are included in a predetermined segment that includes the current playback position and a segment that has not been played back yet, is displayed in a time-series indicator region along with the moving image that is being played back. Furthermore, a segment indicator that indicates the start position and end position of a fixed segment that includes the current playback position is displayed in the time-series indicator. For example, in the case where moving image data at a position corresponding to the segment indicator is generated as a separated moving image file in accordance with a moving image copy instruction that has been given, the user can become aware ahead of time what kind of moving image file would be generated if such an instruction were given.09-01-2011
20110280540BROADCAST MANAGEMENT SYSTEM - A broadcast management system creates, manages, and streams a broadcast of an event from videos captured from multiple cameras. A video capture system comprising multiple cameras captures videos of the event and transmits the videos to a broadcast management server. The broadcast management server generates a website or other graphical interface that simultaneously displays the captured videos in a time-synchronized manner. A broadcast manager user creates a broadcast by selecting which video to output to the broadcast at any given time. A broadcast map is stored for each broadcast that includes all of the broadcast decisions made by the broadcast manager user such that the broadcast can be recreated at a later time by applying the broadcast map to the raw videos. Using a viewer client, viewers can browse or search for broadcasts and select a broadcast for viewing.11-17-2011
20120014658PROGRAM, INFORMATION STORAGE MEDIUM, IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND DATA STRUCTURE - To provide a program, an information storage medium, an image processing device, an image processing method, and a data structure capable of collaboration between replay data and a motion picture produced by an image capturing unit during a period corresponding to the replay data. A replay data obtaining unit obtains replay data for reproducing the status of execution of a program. A captured motion picture obtaining unit obtains a captured motion picture that is produced by the image capturing unit during an image capturing period including at least a part of a reproduction period in which the status of execution of the program is reproduced based on replay data. A data holding unit holds replay data and a captured motion picture so as to be correlated to each other.01-19-2012
20120057842Method and Apparatus for Remote Voice-Over or Music Production and Management - A desktop application and supporting web site for recording voice-over or music sessions is introduced, wherein multiple participants in a collaborative session may be in separate remote locations. The application includes providing a high quality data format for transferring audiovisual data, recordings and the like, and a lower-quality, real-time data format for intercommunicating verbal instructions that relate to, but are not part of, the recording session. Peer-to-peer and server-client implementations may be optimized regarding delivery time versus take quality. The desktop application also provides mechanisms for playing back sound and video for participants' reference during a recording, delivering high quality data format take files to a remote network location or computer, along with synchronized presentations of textual, audio and visual material corresponding to the session.03-08-2012
20120128315DISPLAY SYSTEM AND IMAGE REPRODUCTION DEVICE - Image reproduction devices (05-24-2012
20120148208VIDEO-AUDIO PROCESSING APPARATUS AND VIDEO-AUDIO PROCESSING METHOD - A video-audio processing method includes acquiring encoded audio data, decoding the acquired encoded audio data and thereby creating audio data; causing an audio output unit to output the created audio data, capturing a video image of an object in synchronization with an output of the audio data by the audio output unit and thereby creating first video data, encoding the created first video data and thereby creating first encoded video data, holding the first encoded video data, and multiplexing the encoded audio data and the first encoded video data and thereby creating a first stream.06-14-2012
20120195567Identifying a Presentation Time Based on Rendition Periods and Presentation Rates - Techniques are provided for managing Presentation Time in a digital rendering system for presentation of temporally-ordered data when the digital rendering system includes a Variable Rate Presentation capability. In one embodiment, Presentation Time is converted to Data Time, and Data Time is reported instead of Presentation Time when only one time can be reported. In another embodiment, a predetermined one of Presentation Time and Data Time is returned in response to a request for a Current Time.08-02-2012
20120195568Enhancing a Rendering System to Distinguish Presentation Time from Data Time - Techniques are provided for managing Presentation Time in a digital rendering system for presentation of temporally-ordered data when the digital rendering system includes a Variable Rate Presentation capability. In one embodiment, Presentation Time is converted to Data Time, and Data Time is reported instead of Presentation Time when only one time can be reported. In another embodiment, a predetermined one of Presentation Time and Data Time is returned in response to a request for a Current Time.08-02-2012
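Both applications above turn on converting Presentation (wall-clock) Time to Data Time under variable-rate playback. A minimal conversion, assuming each rendition period is given as a (wall-clock duration, presentation rate) pair:

    def data_time(rendition_periods, presentation_time):
        # Accumulate each period's wall-clock duration scaled by its rate.
        elapsed, position = 0.0, 0.0
        for duration, rate in rendition_periods:
            if presentation_time <= elapsed + duration:
                return position + (presentation_time - elapsed) * rate
            elapsed += duration
            position += duration * rate
        return position

    # 10 s at normal speed then 5 s at 2x: 12 s of Presentation Time is 14 s of data.
    print(data_time([(10.0, 1.0), (5.0, 2.0)], 12.0))  # 14.0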
20120321271PROVIDING VIDEO PRESENTATION COMMENTARY - Embodiments are disclosed that relate to providing commentary for video content. For example, one disclosed embodiment provides a method comprising receiving and storing an input of commentary data from each of a plurality of commentary input devices, and also, for each input of commentary data, receiving and storing identification metadata identifying a commentator, for each input of commentary data, synchronization metadata that synchronizes the commentary data with the associated media content item is received and stored. The method further comprises receiving a request from a requesting media presentation device for commentary relevant to a specified media content item and a specified user, identifying relevant commentary data based upon social network information for the specified user, and sending the relevant commentary data to the requesting client device.12-20-2012
20130129303SYSTEMS AND METHODS FOR TRANSMISSION OF MEDIA CONTENT - A method provides a selection option to at least one portable device, the selection option relating to selection of the first audio content, and retrieves a selection from the at least one portable device based on the selection option. The method further retrieves a selection of the second audio content and synchronizes the first audio content, the second audio content, and the video content by embedding a synchronizing signal in the first audio content, the second audio content, and the video content. The method further outputs the second audio content and the video content to an output device according to the synchronizing signal. Responsive to the selection of the first audio content, the first audio content with the embedded synchronizing signal is transmitted to the at least one portable device, wherein the at least one portable device outputs the first audio content according to the synchronizing signal.05-23-2013
20130148938METHOD FOR PLAY SYNCHRONIZATION AND DEVICE USING THE SAME - A method for play synchronization and a device using the same are provided. A first device stores time-based content play information. The first device plays content according to the time-based content play information. The first device transmits the time-based content play information to a second device so that the second device plays the content according to the time-based content play information. Accordingly, the content played by one device can be played by another device concurrently.06-13-2013
20130177286NONINVASIVE ACCURATE AUDIO SYNCHRONIZATION - In various embodiments, a platform is provided for interactive user experiences. An application, running on device A, can be synchronised with the audio reproduced by a device B. Device A can listen to the audio of device B and obtain the timecode by processing the recorded audio. Therefore, an application, running on a portable device, can display trivia and information exactly at certain points of a show reproduced by a TV set located in the same room.07-11-2013
20130236155AUDIO GUIDING DEVICE AND AUDIO GUIDING METHOD - An exemplary audio guiding method includes obtaining signals transmitted by one booth/spot. The method analyzes and determines an identifier contained in the signal, determines an audio file corresponding to the determined identifier according to a stored first table, and obtains the determined audio file from a storage unit. Next, the method determines whether the obtained signals contain the exhibiting information of the exhibit. If yes, the method determines the playing time point of the obtained audio file corresponding to the exhibiting information of the exhibit according to the stored first table. The method then controls an audio playing unit to play the obtained audio file from the determined playing time point.09-12-2013
20130251329SYSTEM, METHOD, AND INFRASTRUCTURE FOR SYNCHRONIZED STREAMING OF CONTENT - Systems and methods for synchronizing the playback of network media across multiple content playback devices, termed herein as “playback devices”, “clients”, or “client devices”. In one implementation, client devices are controlled to parse and buffer media content separately. Once all clients are ready, a controller may cause the client devices to start in a synchronized fashion based on signals sent by the controller. The controller adjusts the timing of the signal so that the outputs are displayed in synchronization on each client device. In other implementations, device lag times may be measured. In still other implementations, a master device may synchronize playback of media content on slave devices. In yet other implementations, devices may buffer and join playback of media content occurring on other devices. In further implementations, the systems and methods may be expanded to include steps of processing authentication for service providers prior to arranging synchronized playback.09-26-2013
20130272672MULTILINGUAL SIMULTANEOUS FILM DUBBING VIA SMARTPHONE AND AUDIO WATERMARKS - Method and apparatus for providing alternative audio for combined video and audio content, the method comprising: determining a current playback position of the combined video and audio content; synchronising the alternative audio with the determined current playback position; and playing the alternative audio synchronised with the current playback position.10-17-2013
20130315554METHOD FOR DETERMINING THE MOVEMENTS OF AN OBJECT FROM A STREAM OF IMAGES - According to one aspect, the invention relates to a system for determining the movements of an object from a stream of images of said object. The system includes, in particular, a computer having a memory and a central processing unit, said central processing unit including: a reading unit (11-28-2013
20130330053IMAGE DISPLAY APPARATUS, MOBILE TERMINAL AND METHOD FOR OPERATING THE SAME - An image display apparatus, a mobile terminal and a method for operating the same are discussed. The method for operating the image display apparatus includes entering a wireless audio transmission mode, performing synchronization with a mobile terminal using a first wireless communication method, extracting audio data from multimedia data, and transmitting the extracted audio data to the mobile terminal using a second wireless communication method different from the first wireless communication method. By this configuration, it is possible to improve user convenience.12-12-2013
20130336626METHOD FOR SYNCHRONIZING AUDIO PLAYBACK OF DIGITAL MEDIA RENDERS, AND RELATED DIGITAL MEDIA CONTROLLER, DIGITAL AUDIO MEDIA RENDER AND DIGITAL MEDIA SERVER - An exemplary method for synchronizing audio playback of a plurality of digital media renders. The digital media renders include a digital audio/video (AV) media render and at least one digital audio media render. The method includes: detecting a relative time position difference between the digital AV media render and the at least one digital audio media render; and controlling audio playback of the digital audio media render to synchronize to audio playback of the digital AV media render according to the relative time position difference.12-19-2013
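A sketch of the correction loop this implies: measure how far each audio-only render has drifted from the AV render and skew it by the difference. The names and threshold are invented for the example:

    def correction_for(av_position_s, audio_position_s, threshold_s=0.02):
        drift = audio_position_s - av_position_s
        if abs(drift) <= threshold_s:
            return 0.0       # close enough, leave this render untouched
        return -drift        # skew the audio render's playback by this amount

    print(correction_for(av_position_s=30.00, audio_position_s=30.12))  # -0.12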
20140050454Multi Device Audio Capture - At a master device are registered one or more other devices associated with one or more audio channels for recording at least one acoustic signal from one or more sound sources. The at least one acoustic signal is recorded using at least one of the master device and one or more other devices, and the at least one recorded acoustic signal is either collected by at least one of the master device and the one or more other devices, or transmitted to another entity by at least one of the master device and the one or more other devices. In the examples the registration assigns audio and/or video channels to different microphones of the different devices. In one embodiment these different recordings are mixed at the master device and in another they are mixed at a web server into a multi-channel audio/sound (or audio-video) file.02-20-2014
20140056569CAPTURING MEDIA IN SYNCHRONIZED FASHION - Techniques for synchronizing audio and video content for presentation to a user at a same rate are provided. Streams of content from two or more sources of media, each media source having an associated clock, are synchronized by a synchronizing component and processor with respect to a master clock. As well, techniques are provided for ensuring that output devices are synchronized at preview startup. That is, such techniques ensure that the output devices start playing the media at the same time as well as at the same rate.02-27-2014
20140086549PROXIMITY-BASED VIDEO PLAYBACK SYNCHRONIZATION - A method and apparatus for video playback includes coordinating a display of a video playback on a first device so as to be synchronized to a display of the video at a second device in response to the first device departing a control territory associated with the second device.03-27-2014
20140093219Multiple Data Source Aggregation for Efficient Synchronous Multi-Device Media Consumption - Methods and systems for efficient synchronous playback among a group of devices using both locally and remotely stored data from one or more content providers are provided. A content identification and matching system can be used to identify local content on the group of devices and match the local content to content available from remotely available libraries. A control and content delivery system can be used to allow multiple devices to synchronously play content from local or remote sources as available.04-03-2014
20140105562METHOD AND APPARATUS FOR SYNCHRONIZING DATA STREAMS CONTAINING AUDIO, VIDEO AND/OR OTHER DATA - Several data streams contain video, audio and/or other data. Some of the data streams are pre-recorded in a multiplex on a storage medium while other data streams are located out of the data stream multiplex on the storage medium. The data streams are synchronized using a navigation file (List_of_PlayItems), which comprises descriptors (Play-Items, SubPlayItems) pointing to parts of said data streams, wherein said descriptors define the arrangement in time for said data streams by means of data sub stream paths.04-17-2014
20140133824SYSTEM AND METHOD FOR SIMULTANEOUS DISPLAY OF MULTIPLE GEO-TAGGED VIDEOS OF A PARTICULAR GEOGRAPHICAL LOCATION - A master-slave video playback interface in which a master video player is time based, and a slave video player displays a second video georeferenced to the first, e.g., the georeferenced frames of the second video in closest proximity to those being shown on the master player. The invention allows more efficient simultaneous viewing of multiple geo-tagged videos acquired at a singular geographical location, to compare videos, for example, collected on an oil or gas pipeline Right Of Way (ROW) at different times to monitor encroaching threats to pipeline integrity.05-15-2014
20140147088TRANSMISSION DEVICE, RECEIVING/PLAYING DEVICE, TRANSMISSION METHOD, AND RECEIVING/PLAYING METHOD - A transmission device including: a holder holding stream identification information associated with a first transmission stream among a plurality of transmission streams containing a plurality of types of information that are to be played back simultaneously by a receiving playback device, the stream identification information identifying, among the plurality of transmission streams, at least one transmission stream that is different from the first transmission stream; and a transmitter configured to transmit the stream identification information. The receiving side uses the stream identification information to identify at least one transmission stream that is to be played back simultaneously with the first transmission stream.05-29-2014
20140178027METHOD AND APPARATUS FOR RECORDING VIDEO IMAGE IN A PORTABLE TERMINAL HAVING DUAL CAMERA - A method for recording video is provided that includes outputting a first video stream captured by a first camera of portable terminal; entering into a comment recording mode while the first video stream is being captured; while in the comment recording mode: outputting the first video stream concurrently with a second video stream that is captured by a second video camera of the portable terminal, and generating sync data for a future synchronization of the first video stream with the second video stream; and exiting the comment recording mode and storing the generated sync data in a memory.06-26-2014
20140178028TERMINAL DEVICE AND METHOD FOR CONTROLLING THEREOF - A terminal device is provided. The terminal device includes a multimedia part configured to play back a content, a communicator configured to perform communication, a short-range wireless communication module configured to share communication connection information with an external device, and when tagged with the external device while a content is played back, a controller configured to control the communicator to be connected to the external device according to the communication connection information and transmit a synchronization signal and the played back content. Accordingly, the terminal device shares a synchronized content with an external device using a User Interface (UI) for controlling a plurality of external devices.06-26-2014
20140205259Screen recording for creating contents in mobile devices - The embodiments herein relate to screen recording and more particularly to creating and recording whiteboard contents along with audio and camera inputs in a mobile device. When the screen recording application is initialized, a canvas is created in the sandbox of the screen recording application in the mobile device. Further, a whiteboard content creation application is made to run on the canvas. The content creation application creates separate recording threads for recording the whiteboard inputs and the associated audio and video inputs. The inputs that are recorded using the created threads are then synchronized.07-24-2014
20140205260HAPTIC SENSATION RECORDING AND PLAYBACK - A system includes a video recorder configured to record video data, a sensor configured to sense movement of an object and output sensor data representative of the movement of the object, a transformer configured to transform the sensor data into a haptic output signal, a haptic output device configured to generate a haptic effect to a user based on the haptic output signal, a display configured to display a video, and a processor configured to synchronize the video data and the haptic output signal, and output the video data to the display and the haptic output signal to the haptic output device so that the haptic effect is synchronized with the video displayed on the display.07-24-2014
20140233905DETERMINE VIDEO TO PLAY WITH AUDIO - Example embodiments disclosed herein relate to determining video to play based on audio. Videos associated with the audio are determined. One of the videos to play with the audio is determined based on analysis.08-21-2014
20140270680System and Method for Synchronization of Selectably Presentable Media Streams - A system for synchronizing audio and video of selectably presentable multimedia content includes a memory for storing a plurality of selectably presentable multimedia content segments. Each content segment defines a portion of one or more content paths and includes a decision period during which a user may select a subsequent content segment as the content segment is playing. An assembly engine seamlessly assembles a subset of the content segments into one of the content paths, ultimately forming a multimedia presentation. A configuration manager determines an audio file and a video file to be played based on a content segment that is selected to be played immediately following the currently playing content segment. An audio engine processes the audio file for playback, and a video engine synchronizes playback of the video file with the playback of the audio file.09-18-2014
20140270681METHOD AND APPARATUS FOR ENCODING AND DECODING HAPTIC INFORMATION IN MULTI-MEDIA FILES - A method for encoding haptic information inside a multi-media file having content includes changing a portion of the content in the multi-media file, and adding the haptic information to the changed portion of the content, the haptic information corresponding to a haptic signal for generating a haptic effect upon playback of the multi-media file. A method for decoding haptic information from inside a multi-media file having content includes locating the haptic information inside the multi-media file, and generating a haptic signal based on the located haptic information during playback of the content of the multi-media file. A method includes receiving a multi-media signal comprising an audio signal and a haptic signal with a receiver of a haptic device, and outputting a haptic effect with a haptic output device of the haptic device based on the haptic signal in the multi-media signal.09-18-2014
20140321826SYNCHRONIZING EXTERNAL DATA TO VIDEO PLAYBACK - The subject disclosure is generally directed towards synchronizing live streaming videos with additional sources of contextually-related data. In one or more aspects, contextually-related data is marked with a timestamp, with a reference to each piece of data maintained in a manifest. The manifest is accessed to locate the contextually relevant information (e.g., on a companion device or the video playing device) in synchronization with the streaming video, such that a user is able to pause/stop/rewind/fast forward while maintaining the timeline synchronization between video and the data.10-30-2014
20140341525DEVICE AND METHOD FOR SYNCHRONIZING DIFFERENT PARTS OF A DIGITAL SERVICE - The invention relates to a reproduction device (11-20-2014
20140348483BROADCAST PAUSE AND RESUME FOR ENHANCED TELEVISION - Embodiments of the present invention provide for broadcast pause and resume for enhanced television. In some embodiments, software key frames identifying a state of a browser at a plurality of points in time may be used for synchronizing a series of graphics to a video stream. Other embodiments may be described and claimed.11-27-2014
20140355947SYSTEM AND METHOD FOR SYNCHRONIZING MULTI-CAMERA MOBILE VIDEO RECORDING DEVICES - System and method for synchronizing mobile recording devices for creation of a multi-camera video asset, including a mobile recording device, master and slave wireless media sync devices, a cloud storage system, a video registry, and a media management application. Exemplary embodiments provide improved timing precision over current methods. Precise time-code within each device is provided without constant inter-device communication. Video is captured on each mobile video capture device without knowledge of, or control by, other devices. A common audio signal is sent to the mobile video capture devices over the wireless network of sync devices. The audio waveform captured with the video is identical on each device, adding an additional accuracy factor, which works in combination with the time-code to improve synchronization of the multi-camera mobile video capture system. Each recording device registers its recording event on a network-based server, so that a list of recording devices may be assembled and a unique name may be added to each device's recording.12-04-2014
20140376872VIDEO AND TEXT INTEGRATION SYSTEM AND METHOD FOR OFFICIAL RECORDINGS - A system for providing an official recording of a legal proceeding includes one or more computer systems, a stenographic machine, a video recording device, and an audio recording device. Text file input from the stenographic machine is synchronized with video and audio data and delivered to appropriate parties. Starting and stopping of recording may be controlled based on outputs of the stenographic machine. A separate start/stop device may be coupled to a computer system recording video and audio and storing it locally. The start/stop device may provide visible and/or audible signals indicating when recording is occurring. Real-time concurrent display of textual data from the stenographic machine with video and/or audio data being received may also be provided.12-25-2014
20150016797MEDIA RECOGNITION AND SYNCHRONISATION TO A MOTION SIGNAL - The present document describes a device and method for synchronizing a motion signal corresponding to a media content with a media signal for the media content, the motion signal being for controlling a motion feedback system. The method comprises: receiving a portion of the media signal; obtaining a fingerprint corresponding to the received portion of the media signal; from reference fingerprints associated with time positions of at least one reference media content, identifying a reference time position of the media content corresponding to the obtained fingerprint; obtaining the motion signal associated with the identified reference time position of the media content; and outputting the motion signal synchronized with the media signal using the identified reference time position of the media content for controlling the motion feedback system.01-15-2015
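A skeleton of the lookup described above: match a fingerprint of the incoming media against reference fingerprints, then emit the motion samples stored for the matched time position. The fingerprint keys and data layout are placeholders, not the publication's method:

    def identify_position(fingerprint, reference_index):
        # reference_index maps fingerprint -> time position in seconds
        return reference_index.get(fingerprint)

    def motion_at_position(motion_track, position_s, frame_rate=30.0):
        # motion_track: per-frame motion samples for the feedback system
        index = int(position_s * frame_rate)
        return motion_track[index:index + 1]

    position = identify_position("abc123", {"abc123": 12.5})
    if position is not None:
        print(position, motion_at_position(list(range(1000)), position))  # 12.5 [375]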
20150023647DATA SYNCHRONOUS REPRODUCTION APPARATUS, DATA SYNCHRONOUS REPRODUCTION METHOD, AND DATA SYNCHRONIZATION CONTROL PROGRAM - A data synchronous reproduction apparatus capable of synchronously reproducing image frames and numerical data at high speed, including: a data storage section configured to store image data including multiple image frames of monitored objects captured by cameras, and process data including multiple numerical data acquired from the monitored objects in time series; a program storage section configured to store a data synchronous control program configured to reproduce the image data and synchronously reproduce the image data and the process data on the basis of frame numbers and a frame period of the image frames and a sampling interval or record numbers of the numerical data; and a CPU configured to execute the data synchronous control program.01-22-2015
20150030305APPARATUS AND METHOD FOR PROCESSING STAGE PERFORMANCE USING DIGITAL CHARACTERS - The present invention relates to an apparatus and method for processing a stage performance using digital characters. According to one embodiment of the present invention, an apparatus for processing a virtual video performance using a performance of an actor includes a motion input unit for receiving an input motion from the actor through a sensor attached to the body of the actor, a performance processor for creating a virtual space and reproducing a performance in real time according to a pre-stored scenario, a playable character (PC) played by the actor and acting based on the input motion, a non-playable character (NPC) acting independently without being controlled by the actor, an object, and a background being arranged and interacting with one another in the virtual space, and an output unit for generating a performance image from the performance reproduced by the performance processor and outputting the performance image to a display device.01-29-2015
20150043884INFORMATION PROCESSING DEVICE, SHOOTING APPARATUS AND INFORMATION PROCESSING METHOD - An information processing device is provided with: an image meaning judgment section classifying and judging an inputted image as having a particular meaning by classifying characteristics of the image itself and referring to a database; an audio meaning judgment section classifying and judging an inputted audio as having a particular meaning by classifying characteristics of the audio itself and referring to a database; and an association control section outputting the inputted image and the inputted audio acquired at different timings mutually in association with each other on the basis of each of judgment results of the image meaning judgment section and the audio meaning judgment section; and the information processing device is capable of, even if an image without a corresponding audio or an audio without a corresponding image is inputted, outputting the image and the audio in association with each other.02-12-2015
20150050003COMPUTER PROGRAM, METHOD, AND SYSTEM FOR MANAGING MULTIPLE DATA RECORDING DEVICES - A multiple recording device management system including an intermediate multiple recording device managing apparatus, a vehicle recording device mounted in a police vehicle and synced to the managing apparatus, and a personal recording device carried by a police officer and wirelessly synced to the managing apparatus. The managing apparatus is operable to detect when the vehicle recording device, personal recording device, or any other synced device in range has begun recording and to transmit a communication signal to any synced recording device in range indicating that the recording device should begin recording and to further transmit a time stamp to synced recording devices for corroborating recorded data.02-19-2015
20150055929CAMERA ARRAY INCLUDING CAMERA MODULES - The disclosure includes a camera array comprising camera modules, the camera modules comprising a master camera that includes a processor, a memory, a sensor, a lens, a status indicator, and a switch, the switch configured to instruct each of the camera modules to initiate a start operation to start recording video data using the lens and the sensor in the other camera modules and the switch configured to instruct each of the camera modules to initiate a stop operation to stop recording, the status indicator configured to indicate a status of at least one of the camera modules.02-26-2015
20150086172SYNCHRONIZATION OF EVENTS AND AUDIO OR VIDEO CONTENT DURING RECORDING AND PLAYBACK OF MULTIMEDIA CONTENT ITEMS - A computer-implemented method for simultaneously recording a media recording and an event recording includes recording a media recording, recording an event recording simultaneously with the media recording, the event recording encoding a plurality of events, an event being related to one or more user interactions with an input device associated with the media recording and recording the event recording includes for each of a plurality of events of the event recording generating data characterizing the particular event and generating a corresponding time stamp for the particular event by polling a system time of a computer device at the time the particular event takes place, the method further includes providing the data characterizing the particular event and the corresponding time stamp for storage.03-26-2015
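Timestamping each interaction event against the media recording's start can be as small as the sketch below; the log layout is an assumption:

    import time

    def record_event(event_log, event_data, start_time):
        event_log.append({
            "event": event_data,
            "timestamp": time.monotonic() - start_time,  # seconds since recording began
        })

    log, start = [], time.monotonic()
    record_event(log, {"type": "click", "x": 120, "y": 48}, start)
    print(log)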
20150086173Second Screen Locations Function. - Systems for, and methods of, displaying video information comprising: a second screen device obtaining current play position data of a video being played on a primary screen device (e.g., obtaining from the primary screen device an identification of a current play position of the video, or obtaining information to generate an acoustic fingerprint of the video); determining a current play position of the video playing on the primary screen device based upon the current play position data (e.g., identification of the current play position or the acoustic fingerprint); downloading information (e.g., video map, subtitles, moral principles, objectionable content, memorable content, performers, geographical maps, shopping, plot point, item, ratings, and trivia information) over a computer communications network into the memory of the second screen device; and displaying information on the second screen device synchronized with the contemporaneously played video on the primary screen device.03-26-2015
20150086174Second Screen Dilemma Function - Systems for, and methods of, displaying video information comprising: a second screen device obtaining current play position data of a video being played on a primary screen device (e.g., obtaining from the primary screen device an identification of a current play position of the video, or obtaining information to generate an acoustic fingerprint of the video); determining a current play position of the video playing on the primary screen device based upon the current play position data (e.g., identification of the current play position or the acoustic fingerprint); downloading information (e.g., video map, subtitles, moral principles, objectionable content, memorable content, performers, geographical maps, shopping, plot point, item, ratings, and trivia information) over a computer communications network into the memory of the second screen device; and displaying information on the second screen device synchronized with the contemporaneously played video on the primary screen device.03-26-2015
20150093093Second Screen Subtitles Function - Systems for, and methods of, displaying video information comprising: a second screen device obtaining current play position data of a video being played on a primary screen device (e.g., obtaining from the primary screen device an identification of a current play position of the video, or obtaining information to generate an acoustic fingerprint of the video); determining a current play position of the video playing on the primary screen device based upon the current play position data (e.g., identification of the current play position or the acoustic fingerprint); downloading information (e.g., video map, subtitles, moral principles, objectionable content, memorable content, performers, geographical maps, shopping, plot point, item, ratings, and trivia information) over a computer communications network into the memory of the second screen device; and displaying information on the second screen device synchronized with the contemporaneously played video on the primary screen device.04-02-2015
20150110455UTILITY AND METHOD FOR CAPTURING COMPUTER-GENERATED VIDEO OUTPUT - A video capture utility and method for a computer system. In one embodiment, the video capture utility includes: (1) a circular buffer allocated in a memory of the computer system to store at most a predefined video length, (2) a video output interceptor executable in a processor of the computer system and operable to receive and store video output most recently generated by an application program and (3) a video output extractor executable in the processor and operable to prompt contents of the circular buffer to be copied from the circular buffer to another location.04-23-2015
20150110456SYSTEM FOR PROVIDING VIDEO FOR VISUALLY IMPAIRED PERSON - Disclosed is a system for providing video for a visually impaired person. The system captures video of a subject to be captured using a video capturing unit, compresses the captured video, and employs wireless data transmission for transmitting the compressed video over a wireless communication network in real time. The system includes a video capturing unit for capturing video of a subject, and a video output unit for controlling the video capturing unit and displaying the video captured by the video capturing unit.04-23-2015
20150110457Second Screen Shopping Function - Systems for, and methods of, displaying video information comprising: a second screen device obtaining current play position data of a video being played on a primary screen device (e.g., obtaining from the primary screen device an identification of a current play position of the video, or obtaining information to generate an acoustic fingerprint of the video); determining a current play position of the video playing on the primary screen device based upon the current play position data (e.g., identification of the current play position or the acoustic fingerprint); downloading information (e.g., video map, subtitles, moral principles, objectionable content, memorable content, performers, geographical maps, shopping, plot point, item, ratings, and trivia information) over a computer communications network into the memory of the second screen device; and displaying information on the second screen device synchronized with the contemporaneously played video on the primary screen device.04-23-2015
20150110458Second Screen Trivia Function - Systems for, and methods of, displaying video information comprising: a second screen device obtaining current play position data of a video being played on a primary screen device (e.g., obtaining from the primary screen device an identification of a current play position of the video, or obtaining information to generate an acoustic fingerprint of the video); determining a current play position of the video playing on the primary screen device based upon the current play position data (e.g., identification of the current play position or the acoustic fingerprint); downloading information (e.g., video map, subtitles, moral principles, objectionable content, memorable content, performers, geographical maps, shopping, plot point, item, ratings, and trivia information) over a computer communications network into the memory of the second screen device; and displaying information on the second screen device synchronized with the contemporaneously played video on the primary screen device.04-23-2015
20150304696TIME SYNCHRONIZATION METHOD AND SYSTEM - A time synchronization method is provided. Firstly, an update time of a second playing device is calculated according to a first system time of transmitting a first packet from a first playing device and a second system time of receiving the first packet by the second playing device. Then, a first difference value is calculated according to the first system time of transmitting a second packet from the first playing device and the update time corresponding to the reception of the second packet by the second playing device. If plural first difference values are all smaller than a first predetermined value, a second difference value is calculated. If the second difference value is smaller than a second predetermined value, a synchronization time is obtained according to the second difference value and the update time.10-22-2015
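The calculation in this abstract can be illustrated with a small numerical sketch. The packet timestamps, the two thresholds, and the choice of the mean residual as the "second difference value" are assumptions made for illustration only; the application does not specify them.

    # Hypothetical packet timestamps (seconds). t_send comes from the first
    # (reference) playing device, t_recv from the second device's local clock.
    packets = [
        {"t_send": 100.000, "t_recv": 250.012},   # first packet: establishes offset
        {"t_send": 101.000, "t_recv": 251.013},
        {"t_send": 102.000, "t_recv": 252.011},
    ]

    THRESH_1 = 0.050   # first predetermined value
    THRESH_2 = 0.020   # second predetermined value

    # Step 1: derive an "update time" base from the first packet.
    offset = packets[0]["t_send"] - packets[0]["t_recv"]

    def update_time(local_clock):
        # second device's clock mapped onto the first device's timeline
        return local_clock + offset

    # Step 2: first difference values for the later packets.
    diffs = [p["t_send"] - update_time(p["t_recv"]) for p in packets[1:]]

    if all(abs(d) < THRESH_1 for d in diffs):
        # Step 3: second difference value (here assumed to be the mean residual).
        second_diff = sum(diffs) / len(diffs)
        if abs(second_diff) < THRESH_2:
            sync_time = update_time(packets[-1]["t_recv"]) + second_diff
            print(f"synchronized time: {sync_time:.3f}")
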
20150325269Synchronisation of Audio and Video Playback - A method of playing audio content to a viewer in synchronisation with a video content. The method comprises receiving and digitising an ultrasonic signal comprising ultrasonic synchronisation signal(s), the ultrasonic synchronisation signal(s) comprising a timecode-carrying part that encodes a respective timecode through modulation of ultrasonic carrier signal(s), the ultrasonic synchronisation signal(s) comprising further an ultrasonic marker signal conterminous with the timecode-carrying part. The timecode-carrying part is identified based on a received ultrasonic marker signal and decoded to determine the corresponding timecode. The stored audio content is played back from a playback point determined based on the timecode.11-12-2015
20150334363METHOD FOR PLAY SYNCHRONIZATION AND DEVICE USING THE SAME - Methods and apparatuses are provided for providing contents. A plurality of contents is reproduced at the first device. The plurality of contents is transmitted from the first device to a second device to reproduce, in the second device, content in synchronization with the first device. Data is received from an external device during the reproducing of the plurality of contents at the first device. The received data is reproduced along with at least one of the plurality of contents being reproduced at the first device. The received data is transmitted from the first device to the second device. The transmitted data is used to enable the second device to reproduce the transmitted data together with the at least one of the plurality of contents.11-19-2015
20150340066SYSTEMS AND METHODS FOR CREATING AND ENHANCING VIDEOS - Embodiments of the present disclosure help to automatically generate video selected from multiple video sources using intelligent sensor processing, thereby providing viewers with a unique and rich viewing experience quickly and inexpensively.11-26-2015
20150348591SENSOR AND MEDIA EVENT DETECTION SYSTEM - Enables detection of events using motion capture sensors and potentially other sensors, such as electromagnetic field, temperature, humidity, wind, pressure, elevation, light, sound, or heart rate sensors, to confirm and post events and to differentiate similar types of motion events to determine the type of equipment or activity or the quality of the event, such as proficiency. Enables motion capture data and other sensor data to be utilized to curate text, images, video, and sound, and to post the results to social networks, for example in a dedicated feed. Embodiments of the system may also post or filter to social media sites using any other filter besides location and time, including for example the text in the social media posts. May use motion or other sensor data to define an event, eliminate false positive events, post true events, and/or correlate the events with social media to confirm the events, or post the events in a particular channel.12-03-2015
20150350621SOUND PROCESSING SYSTEM AND SOUND PROCESSING METHOD - A recorder receives, from a user, designation of a video to be reproduced. If, during reproduction or pause of the video, the recorder receives from the user via an operation unit the designation of one or more locations on the display screen where sound is to be emphasized, a signal processing unit performs an emphasis process on the audio data: using the audio data recorded in the recorder, it emphasizes audio in the directions from a microphone array toward the positions corresponding to the designated locations. A reproducing device reproduces the emphasis-processed audio data and the video data in synchronization with each other.12-03-2015
20150380049METHOD FOR DETERMINING BIT RATE AND APPARATUS THEREFOR - The present invention relates to a method for controlling a bit rate and an apparatus therefor, and more specifically to an apparatus for storing, in a memory, a bit rate that is changed according to a significance level, and a method for determining a bit rate that meets the distortion requirements imposed by the memory space limitation and the significance level of the image information, so as to minimize energy consumption.12-31-2015
20150380054METHOD AND APPARATUS FOR SYNCHRONIZING AUDIO AND VIDEO SIGNALS - A method, apparatus and computer program product are provided to synchronize audio signals with video images that are replayed with a modified motion. In a method, a trajectory is determined for each audio object of an audio signal. The method also determines each of the audio objects to be a transient or non-transient object. The method also causes a respective audio object to be differently extended depending upon whether the audio object is determined to be a transient object or a non-transient object, thereby synchronizing video signals that are to be played back with a predefined motion. The method causes the respective audio object to be differently extended by splitting the transient object into transient segments, inserting silent segments therebetween and maintaining the trajectories of the transient object and/or by repeating the non-transient object with a trajectory that varies based on the predefined motion of the video signals.12-31-2015
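One way to read this claim is that transient material keeps its attack intact and gains inserted silence, while non-transient material is repeated to fill the stretched duration. The sketch below illustrates only that distinction on plain sample lists; the segment length, stretch factor, and example signals are hypothetical, and trajectory handling is omitted.

    def stretch_transient(samples, factor, segment_len=256):
        """Keep transient segments intact and pad silent segments between them."""
        out = []
        silence = [0.0] * int(segment_len * (factor - 1))
        for i in range(0, len(samples), segment_len):
            out.extend(samples[i:i + segment_len])   # unchanged transient segment
            out.extend(silence)                      # inserted silent segment
        return out

    def stretch_non_transient(samples, factor):
        """Repeat (loop) steady-state material to fill the longer duration."""
        target = int(len(samples) * factor)
        out = []
        while len(out) < target:
            out.extend(samples)
        return out[:target]

    tone  = [0.3] * 1024                 # stand-in for a sustained (non-transient) object
    click = [1.0] * 64 + [0.0] * 448     # stand-in for a transient object
    slow  = 2.0                          # video replayed at half speed

    print(len(stretch_transient(click, slow)), len(stretch_non_transient(tone, slow)))
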
20160005438ASSOCIATING PLAYBACK DEVICES WITH PLAYBACK QUEUES - In an example implementation, a system groups a first playback device and a second playback device into first player group. The system associates the first player group with a first playback queue that includes items for playback by the first player group. The first and second playback devices are configured to play items from the first playback queue while in the first player group. The system associates a third playback device with a second playback queue that includes items for playback by the second playback device. Thereafter, the system groups the first, second and third playback devices to form a second player group such that each of the first, second, and third playback devices are configured to play items from the second playback queue. The system removes the first playback device from the second player group and automatically associates the first playback device with the first playback queue.01-07-2016
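The grouping and queue-association behaviour described here can be modelled with a few dictionaries. This is a hypothetical in-memory sketch of the stated sequence (two devices grouped on one queue, a third on another, all three regrouped, then the first removed and automatically re-associated); the names and data layout are illustrative, not the system's actual design.

    # Hypothetical in-memory model of players, groups and playback queues.
    queues  = {"Q1": ["song A", "song B"], "Q2": ["podcast X"]}
    players = {"kitchen": None, "living": None, "bedroom": None}   # player -> queue
    groups  = {}                                                   # group  -> players

    def form_group(name, members, queue):
        groups[name] = list(members)
        for p in members:
            players[p] = queue             # every member now plays from this queue

    def ungroup(player, fallback_queue):
        for members in groups.values():
            if player in members:
                members.remove(player)
        players[player] = fallback_queue   # automatically re-associate

    form_group("G1", ["kitchen", "living"], "Q1")               # first player group
    players["bedroom"] = "Q2"                                   # third device, own queue
    form_group("G2", ["kitchen", "living", "bedroom"], "Q2")    # second player group
    ungroup("kitchen", "Q1")                                    # removed device returns to Q1
    print(players)                         # kitchen -> Q1, living -> Q2, bedroom -> Q2
    print(queues[players["kitchen"]])      # items the removed device now plays
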
20160005439SYSTEMS AND METHODS FOR NETWORKED MEDIA SYNCHRONIZATION - This disclosure illustrates systems and methods for generating synchronized audio output signals. The synchronization may be performed based on an audio synchronization message received by an audio synchronization computing platform over a communication network. The audio synchronization computing platform may generate a synchronized audio output signal from an audio signal. The audio synchronization computing platform may provide the synchronized audio output signal to one or more audio playback devices over the same communication network. The communication network may be a single ethernet network.01-07-2016
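At its simplest, a synchronization message of this kind carries a target presentation time that every playback device waits for. The sketch below simulates that with threads in a single process sharing one monotonic clock; in the described system the message would travel over a communication network (for example a single Ethernet network) and the devices would need a common clock reference, which is not modelled here.

    import threading
    import time

    def make_sync_message(lead_time_s=0.25):
        """Hypothetical audio synchronization message: 'start playback at time T'."""
        return {"play_at": time.monotonic() + lead_time_s}

    def playback_device(name, message):
        """Each device waits until the agreed presentation time, then starts."""
        delay = message["play_at"] - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        print(f"{name} started at {time.monotonic():.3f}")

    msg = make_sync_message()
    threads = [threading.Thread(target=playback_device, args=(n, msg))
               for n in ("left speaker", "right speaker")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
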
20160021332Spatially Synchronized Video - A method and apparatus to spatially synchronize individual frames of a digital video recording with the spatial location of a recording camera at the time that the relevant digital video frame was recorded. The digital video is played back at a speed in relation to the rate that a playback device is traveling. The frame rate is based on movement of the recording device—frames per distance (fpd) moved. The spatially recorded digital video includes periodic position frames, with absolute or virtual position. Alternatively, position frames may be captured after a given change of distance.01-21-2016
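Frames-per-distance playback amounts to indexing frames by cumulative distance rather than elapsed time. A minimal sketch follows, assuming each frame is tagged with the distance at which it was recorded; the 0.5 m spacing and the lookup function are illustrative assumptions.

    import bisect

    # Hypothetical recording: frame i was captured after frame_distances[i] metres.
    frame_distances = [i * 0.5 for i in range(2000)]      # one frame every 0.5 m

    def frame_for_travelled(distance_m):
        """Pick the frame whose recorded position best matches how far the
        playback device has moved (frames per distance, not frames per second)."""
        i = bisect.bisect_right(frame_distances, distance_m) - 1
        return max(0, min(i, len(frame_distances) - 1))

    for travelled in (0.0, 12.3, 480.0):
        print(travelled, "m ->", frame_for_travelled(travelled))
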
20160042767INTEGRATING DATA FROM MULTIPLE DEVICES - A recording system for an emergency response unit includes a first data collection device configured to record a first video, audio or data segment with an incident identifier and transmit a message including the incident identifier. A second data collection device may receive the message and, as appropriate, record at least a second video, audio or data segment with the incident identifier, allowing the first segment and the second segment to be associated using the incident identifier. In other embodiments, a first recording device may begin recording video, audio or legal evidence data with an incident identifier, and a control system may receive a message including the incident identifier from the first recording device, identify one or more additional recording devices located within a certain distance of the first recording device, and obtain recordings from the one or more additional recording devices.02-11-2016
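The association mechanism here reduces to tagging every segment with the same incident identifier. The following single-process sketch is hypothetical; the device names, message format, and use of a UUID are assumptions for illustration, not the recording system's actual protocol.

    import uuid
    from collections import defaultdict

    recordings = defaultdict(list)       # incident_id -> segments from any device

    def start_recording(device, incident_id=None):
        """First device mints an incident identifier; others reuse a received one."""
        incident_id = incident_id or str(uuid.uuid4())
        recordings[incident_id].append(f"{device}: video segment")
        return {"incident_id": incident_id}       # message sent to nearby devices

    def on_message(device, message):
        """Second device records its own segment under the same identifier."""
        recordings[message["incident_id"]].append(f"{device}: audio segment")

    msg = start_recording("body camera 1")
    on_message("dashboard camera", msg)
    print(recordings[msg["incident_id"]])   # both segments associated by the id
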
20160055879SYSTEMS AND METHODS FOR AUTOMATICALLY PERFORMING MEDIA ACTIONS BASED ON STATUS OF EXTERNAL COMPONENTS - Systems and methods for automatically performing a media action based on the status of external components are provided. A media asset is determined to be presented on a user equipment device. Components external to the user equipment device are monitored to obtain status information pertaining to the components. A determination is made as to whether a media action is associated with the status information. The identified media action is performed for the media asset in response to determining that the media action is associated with the status information.02-25-2016
20160068002HYBRID PRINT-ELECTRONIC BOOK - The present invention is directed to a hybrid book. In one embodiment, the hybrid book includes a plurality of pages containing a plurality of fixed media, a data controller, digital storage, an electronic visual display, a display driver and a power source. The hybrid book is configured and adapted to receive visual media into the digital storage for playback on the electronic visual display. The power source may optionally be a battery and preferably a rechargeable battery. The hybrid book presents the plurality of fixed media and the visual media in combination. The data controller, digital storage, electronic visual display and display driver are preferably communicatively coupled together. The digital storage preferably is programmable, but not re-writeable or re-programmable. In other embodiments, the digital storage may be re-writeable or re-programmable. The pages may be fixed by a spine. The hybrid book may include a cover, and the electronic visual display may be attached to the cover. The hybrid book may be configured and adapted to present the plurality of fixed media simultaneously with the visual media. The hybrid book may include a plurality of input and/or output ports. A port may be included for a power source to recharge the battery or to connect an external power supply. Another port may be used to load electronic media into the digital storage.03-10-2016
20160071548Play Sequence Visualization and Analysis - A method for visualizing plays in a sporting event may include receiving a video stream of the sporting event and a measurement stream, asynchronous to the video stream, associated with objects in the sporting event. The method may further include displaying a synchronized presentation of the video stream and the measurement stream. The synchronization may be performed near the time of the displaying. Another method for visualizing plays in a sporting event may include receiving measurement information related to actions from one or more sporting events. The method may also include identifying plays from the actions using the measurement information and displaying a representation of the identified plays. A system for visualizing plays in a sporting event may include an integrated server and a synchronization mechanism. Another method for visualizing plays in a sporting event may include displaying a video of a play selected from a representation.03-10-2016
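Synchronizing an asynchronous measurement stream with video "near the time of the displaying" can be approximated by looking up, for each displayed frame, the most recent measurement at or before that frame's timestamp. The sketch below shows that lookup on made-up play data; the timestamps and labels are illustrative only.

    import bisect

    # Hypothetical streams: video frames at 30 fps, measurements arriving irregularly.
    video_ts       = [i / 30 for i in range(300)]               # seconds
    measurement_ts = [0.00, 0.07, 0.21, 0.55, 0.90, 1.40]
    measurements   = ["snap", "handoff", "block", "cut", "pass", "tackle"]

    def measurement_at(t):
        """Most recent measurement at or before video time t (resolved at display time)."""
        i = bisect.bisect_right(measurement_ts, t) - 1
        return measurements[i] if i >= 0 else None

    for frame in (0, 10, 30):
        t = video_ts[frame]
        print(f"frame {frame} (t={t:.2f}s): overlay '{measurement_at(t)}'")
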
20160139871METHOD AND APPARATUS FOR ASSOCIATING AN AUDIO SOUNDTRACK WITH ONE OR MORE VIDEO CLIPS - A method, apparatus and computer program product are provided to facilitate the association of a selected portion of an audio soundtrack with one or more video clips. In the context of a method, a visual representation of an audio soundtrack is caused to be displayed. The audio soundtrack is to be associated with one or more video clips to facilitate concurrent playback of at least a portion of the audio soundtrack and one or more video clips. The method also receives user input with respect to the audio soundtrack and, in response to the user input, adjusts a starting point of the audio soundtrack relative to the one or more video clips. The starting point may be adjusted by defining the starting point of the audio soundtrack based on the user input and also in a manner so as to coincide with a predefined feature of the audio soundtrack.05-19-2016
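Adjusting the starting point "so as to coincide with a predefined feature of the audio soundtrack" can be illustrated by snapping a user-chosen offset to the nearest detected beat. The beat times and the snapping rule below are hypothetical assumptions for illustration.

    # Hypothetical beat positions (seconds) detected in the audio soundtrack.
    beats = [0.0, 0.52, 1.04, 1.56, 2.08, 2.60]

    def snap_start_point(user_offset_s):
        """Define the soundtrack start from user input, then snap it so the start
        coincides with a predefined feature of the soundtrack (here: a beat)."""
        return min(beats, key=lambda b: abs(b - user_offset_s))

    print(snap_start_point(1.40))   # user dragged to 1.40 s -> snapped to 1.56 s
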
20160165297SEAMLESS PLAYBACK OF MEDIA CONTENT USING DIGITAL WATERMARKING - Methods and apparatus are described that provide a mechanism for transferring and/or synchronizing playback of media content from one media device to another in a seamless fashion. A media device 06-09-2016
20160180879SYSTEMS AND METHODS FOR RECORDING HAPTIC DATA FOR USE WITH MULTI-MEDIA DATA06-23-2016
20160180883METHOD AND SYSTEM FOR CAPTURING, SYNCHRONIZING, AND EDITING VIDEO FROM A PLURALITY OF CAMERAS IN THREE-DIMENSIONAL SPACE06-23-2016
20160180884METHOD AND SYSTEM FOR SYNCHRONIZATION OF MULTIPLE CONTENT STREAMS06-23-2016
20160196851METHODS AND DEVICES FOR DISTRIBUTED AUDIO/VIDEO SYNCHRONIZATION AND PLAYBACK USING AN A/V ORCHESTRATOR07-07-2016

Patent applications in class Synchronization

Patent applications in all subclasses Synchronization
