CA2845465A1 - Synchronizing multiple transmissions of content - Google Patents
Synchronizing multiple transmissions of content
- Publication number
- CA2845465A1
- Authority
- CA
- Canada
- Prior art keywords
- content
- screen
- interfaces
- interface
- synchronization
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/24—Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
- H04N21/2408—Monitoring of the upstream path of the transmission network, e.g. client requests
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4126—The peripheral being portable, e.g. PDAs or mobile phones
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/633—Control signals issued by server directed to the network components or client
- H04N21/6332—Control signals issued by server directed to the network components or client directed to client
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8545—Content authoring for generating interactive applications
Abstract
The disclosure relates generally to providing synchronized supplemental content. In one aspect, second user devices may be used to consume supplemental content associated with primary content presented on a first display. The supplemental content may be synchronized with the primary content. Such synchronization may be performed by detecting and reporting triggers within the primary content and transmitting synchronization signals to appropriate second user devices. Another aspect of the disclosure relates to determining which interfaces or applications will report the triggers, and fine-tuning how many interfaces or applications will report the triggers.
Description
SYNCHRONIZING MULTIPLE TRANSMISSIONS OF CONTENT
BACKGROUND
[01] Television viewing is no longer the static, isolated, or passive pastime that it used to be.
Today, viewers have the option of using a computing device, such as a tablet computer or smartphone, as a second screen to view a webpage related to a show they are watching, thereby keeping viewers engaged in a particular program. However, there is a demand for taking second screen experiences further. Specifically, there is a demand for supplemental content (e.g., second screen content) that is synchronized with the primary content users are watching. While users want synchronization between their supplemental content and other programs, they also want and rely on fast network speeds.
Thus, systems and methods for providing synchronized supplemental content with minimal impact on network bandwidth and other benefits are desired.
SUMMARY
[02] Some or all of the various features described herein may facilitate synchronization of supplemental content (e.g., second screen content) displayed on a second user device (e.g., a second screen device such as a tablet computer, smartphone, laptop, etc.) with primary content displayed on a first user device (e.g., a first screen device such as a television or video display), thereby providing a desirable second screen experience.
Some aspects described below allow for synchronization of supplemental content with linear or time shifted primary content while minimizing an impact on network bandwidth.
[03] In accordance with some aspects of the disclosure, a plurality of interfaces (e.g., gateways, user devices, set top boxes, etc.) may receive triggers from a trigger source for supplemental content (e.g., second screen content) to be displayed on second screen devices in synchronicity with primary content. A subset of those interfaces may be selected to represent the plurality, and the subset may report trigger receipt information back to the trigger source, allowing the source to adjust its timing of future triggers
for the plurality of interfaces. Because a subset of the interfaces (which may be a relatively small number in comparison to the total number of interfaces) is used to report the triggers, instead of all of the interfaces, the impact on the upstream bandwidth of a service provider's network may be minimized.
[04] In an illustrative embodiment, the disclosure teaches grouping or categorizing interfaces into different zones or groups, and teaches that the interfaces in a common zone or group are expected to receive the same primary content at approximately the same time. In contrast, different zones may receive similar primary content (e.g., a television program), but the primary content may include different advertisements or may be time shifted differently (e.g., have different delays). In each zone, a subset of interfaces is expected to detect and report triggers embedded within or otherwise transmitted with the primary content. The subset of interfaces may vary. In some embodiments, the interfaces themselves or an associated computing device may execute an algorithm to determine whether they should report the trigger. An algorithm may be based on statistical information received from a service provider or some other administrator. The statistical information may be specific to each zone (or group) because, for example, it may be expected that users in some zones are more likely to view certain content than users in another zone. An algorithm may take into account the channel for delivering the primary content, time of day, and other factors when determining whether the interface (or an application associated with the interface) should report a trigger.
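To make the sampling idea concrete, the per-interface reporting decision could be sketched as below. This is only an illustrative sketch, not the algorithm claimed in the disclosure: the field names (`zone_stats`, `target_sample_size`) and the deterministic hash-based draw are assumptions of this example.

```python
import hashlib

def should_report(interface_id: str, zone_stats: dict, channel: str, hour: int) -> bool:
    """Decide whether this interface reports triggers, using zone-level
    statistics (e.g., expected viewers per channel and hour) so that only
    a small, roughly fixed-size sample reports upstream."""
    # Hypothetical statistics: expected viewers keyed by (channel, hour),
    # plus a desired number of reporting interfaces per zone.
    expected_viewers = zone_stats.get((channel, hour), 1)
    target_reporters = zone_stats.get("target_sample_size", 100)
    # Reporting probability chosen so roughly target_reporters interfaces report.
    p = min(1.0, target_reporters / max(1, expected_viewers))
    # Deterministic hash mapped into [0, 1) so the same interface decides
    # consistently without any coordination with its neighbors.
    h = int(hashlib.sha256(interface_id.encode()).hexdigest(), 16)
    u = (h % 10**6) / 10**6
    return u < p
```

Because the draw is a pure function of the interface identifier, the administrator can predict roughly how many interfaces will report without tracking individual devices.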
[05] In another aspect, and in particular to further reduce network traffic, multipliers may be sent at more frequent intervals than the statistical information, which may include a heavier payload than the multiplier. A multiplier may be a factor used to adjust the subset of interfaces reporting triggers. Regardless of the method for determining whether a trigger should be reported, when the trigger is reported, a trigger detection signal may be sent upstream on the same lines or channel that the primary content is received on.
The trigger detection signal may eventually reach one or more servers dedicated for receiving and processing such signals. The server(s) may then push synchronization signals to certain second screen devices that it knows or believes are displaying second screen content associated with the primary content from which the trigger was detected.
The server may transmit synchronization signals to second screen devices in response to the trigger detection signals. The second screen devices may then use the synchronization signals to synchronize second screen content, which may be received from another source, e.g., another network or server, or already downloaded on the second screen device. Additionally, aspects of the present disclosure teach computing devices, having a processor and memory storing computer-executable instructions, and other apparatuses to perform the above steps and other steps for improving a second screen experience.
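The server-side push described above might be sketched as follows. The class and method names are illustrative assumptions; a real deployment would use persistent device registrations and a push notification channel rather than in-process callbacks.

```python
from collections import defaultdict

class SyncServer:
    """Minimal sketch of a server that receives trigger detection signals
    from sampled interfaces and pushes synchronization signals to second
    screen devices registered against the same primary content."""

    def __init__(self):
        # content_id -> list of callbacks standing in for registered
        # second screen devices (an assumed registration model).
        self.subscribers = defaultdict(list)

    def register(self, content_id, device_callback):
        self.subscribers[content_id].append(device_callback)

    def on_trigger_detected(self, content_id, trigger_timestamp):
        # Push a synchronization signal to every second screen device
        # known to be consuming supplemental content for this content.
        signal = {"content_id": content_id, "sync_to": trigger_timestamp}
        for push in self.subscribers[content_id]:
            push(signal)
        return len(self.subscribers[content_id])
```

The second screen device would use the `sync_to` value to align its locally held or separately streamed supplemental content with the primary content's timeline.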
[06] Other details and features will also be described in the sections that follow. This summary is not intended to identify critical or essential features of the inventions claimed herein, but instead merely summarizes certain features and variations thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[07] Some features herein are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements.
[08] Figure 1 illustrates an example communication network on which various features described herein may be used.
[09] Figure 2 illustrates an example computing device and software configuration that can be used to implement any of the methods, servers, entities, and computing devices described herein.
[10] Figure 3 illustrates a system architecture according to one or more illustrative aspects described herein.
[11] Figure 4 is a diagram illustrating an example embodiment of another aspect of the present disclosure.
[12] Figure 5 illustrates a system architecture according to one or more illustrative aspects described herein.
[13] Figure 6 is a flow diagram illustrating an example method according to one or more aspects of the disclosure.
[14] Figure 7 is a flow diagram illustrating an example method according to one or more aspects of the disclosure.
[15] Figure 8 is a flow diagram illustrating an example method according to one or more aspects of the disclosure.
[16] Figures 9A and 9B are flow diagrams illustrating example methods according to one or more aspects of the disclosure.
[17] Figure 10 is a flow diagram illustrating an example method according to one or more aspects of the disclosure.
DETAILED DESCRIPTION
[18] In the following description of various illustrative embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown, by way of illustration, various embodiments in which aspects of the disclosure may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional modifications may be made, without departing from the scope of the present disclosure.
[19] By way of introduction, the various features described herein may allow a user to consume primary content (e.g., watch a television program) on a first device (e.g., a television) and second screen content, which is synchronized with the primary content, on a second device (e.g., a smartphone, tablet, laptop, etc.). In one example, an interface (e.g., a set top box) associated with the first device may determine whether it should report a trigger detected within the primary content. As a result, a sample of interfaces may report detected triggers instead of all interfaces, thereby minimizing the impact of reporting triggers on the upstream bandwidth. This determination may be based on statistical information, a multiplier, and/or other data received from a service provider or other administrator. The system may monitor and update this information so that the reporting of the triggers may be further optimized. Based on detection signals received from the sample of interfaces, synchronization signals may be generated and transmitted (e.g., pushed) to second screen devices so that the second screen devices can synchronize second screen content with the primary content.
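The multiplier mentioned above, a small factor sent more frequently than the full statistical payload, could be applied as simply as the sketch below. The clamping bounds are assumptions of this example, not values from the disclosure.

```python
def adjusted_probability(base_probability: float, multiplier: float) -> float:
    """Apply a lightweight multiplier to the base reporting probability
    derived from the (heavier, less frequently sent) statistical
    information, clamped to a valid probability range."""
    return max(0.0, min(1.0, base_probability * multiplier))
```

For example, if the administrator observes twice as many detection signals as needed, it could broadcast a multiplier of 0.5 to halve the share of reporting interfaces without resending the full statistics.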
[20] FIG. 1 illustrates an example communication network 100 on which many of the various features described herein, such as the requesting and retrieval of primary content and second screen content may be implemented. Network 100 may be any type of information distribution network, such as satellite, telephone, cellular, wireless, etc. One example may be an optical fiber network, a coaxial cable network, or a hybrid fiber/coax distribution network. Such networks 100 use a series of interconnected communication links 101 (e.g., coaxial cables, optical fibers, wireless, etc.) to connect multiple premises 102 (e.g., businesses, homes, consumer dwellings, etc.) to a local office or headend 103.
The local office 103 may transmit downstream information signals onto the links 101, and each premises 102 may have a receiver used to receive and process those signals.
[21] There may be one link 101 originating from the local office 103, and it may be split a number of times to distribute the signal to various premises 102 in the vicinity (which may be many miles) of the local office 103. The links 101 may include components not illustrated, such as splitters, filters, amplifiers, etc. to help convey the signal clearly, but in general each split introduces a bit of signal degradation. Portions of the links 101 may also be implemented with fiber-optic cable, while other portions may be implemented with coaxial cable, other lines, or wireless communication paths. By running fiber optic cable along some portions, for example, signal degradation may be significantly minimized, allowing a single local office 103 to reach even farther with its network of links 101 than before.
[22] The local office 103 may include an interface, such as a termination system (TS) 104.
More specifically, the interface 104 may be a cable modem termination system (CMTS), which may be a computing device configured to manage communications between devices on the network of links 101 and backend devices such as servers 105-107 (to be discussed further below). The interface 104 may be as specified in a standard, such as the Data Over Cable Service Interface Specification (DOCSIS) standard, published by Cable Television Laboratories, Inc. (a.k.a. CableLabs), or it may be a similar or modified device instead. The interface 104 may be configured to place data on one or more downstream frequencies to be received by modems at the various premises 102, and to receive upstream communications from those modems on one or more upstream frequencies.
[23] The local office 103 may also include one or more network interfaces 108, which can permit the local office 103 to communicate with various other external networks 109.
These networks 109 may include, for example, networks of Internet devices, telephone networks, cellular telephone networks, fiber optic networks, local wireless networks (e.g., WiMAX), satellite networks, and any other desired network, and the network interface 108 may include the corresponding circuitry needed to communicate on the external networks 109, and to other devices on the network such as a cellular telephone network and its corresponding cell phones. For example, the network interface 108 may communicate with a wireless device 116 via the external network 109 so that the wireless
device 116 may receive supplemental content from the local office 103 or other computing devices connected to the external network 109.
[24] As noted above, the local office 103 may include a variety of servers 105-107 that may be configured to perform various functions. For example, the local office 103 may include a push notification server 105. The push notification server 105 may generate push notifications to deliver data and/or commands to the various premises 102 in the network (or more specifically, to the devices in the premises 102 that are configured to detect such notifications). The local office 103 may also include a content server 106.
The content server 106 may be one or more computing devices that are configured to provide content to users at their premises. This content may be, for example, video on demand movies, television programs, songs, text listings, etc. The content server 106 may include software to validate user identities and entitlements, to locate and retrieve requested content, to encrypt the content, and to initiate delivery (e.g., streaming) of the content to the requesting user(s) and/or device(s).
[25] The local office 103 may also include one or more application servers 107. An application server 107 may be a computing device configured to offer any desired service, and may run various languages and operating systems (e.g., servlets and JSP
pages running on Tomcat/MySQL, OSX, BSD, Ubuntu, Redhat, HTML5, JavaScript, AJAX and COMET). For example, an application server may be responsible for collecting television program listings information and generating a data download for electronic program guide listings. Another application server may be responsible for monitoring user viewing habits and collecting that information for use in selecting advertisements. Yet another application server may be responsible for formatting and inserting advertisements in a video stream being transmitted to the premises 102.
Although shown separately, one of ordinary skill in the art will appreciate that the push server 105, content server 106, and application server 107 may be combined.
Further, here the push server 105, content server 106, and application server 107 are shown generally, and it will be understood that they may each contain memory storing computer executable instructions to cause a processor to perform steps described herein and/or memory for storing data, such as information for identifying a user or second screen device.
[26] An example premises 102a, such as a home, may include an interface 120.
The interface 120 may include computer-executable instructions (e.g., an application) for performing one or more aspects of the disclosure, such as detecting triggers, determining whether to report triggers, and/or generating detection signals to report triggers. The interface 120 can include any communication circuitry needed to allow a device to communicate on one or more links 101 with other devices in the network. For example, the interface 120 may include a modem 110, which may include transmitters and receivers used to communicate on the links 101 and with the local office 103. The modem 110 may be, for example, a coaxial cable modem (for coaxial cable lines 101), a fiber interface node (for fiber optic lines 101), twisted-pair telephone modem, cellular telephone transceiver, satellite transceiver, local wi-fi router or access point, or any other desired modem device. Also, although only one modem is shown in FIG. 1, a plurality of modems operating in parallel may be implemented within the interface 120. Further, the interface 120 may include a gateway interface device 111. The modem 110 may be connected to, or be a part of, the gateway interface device 111. The gateway interface device 111 may be a computing device that communicates with the modem(s) 110 to allow one or more other devices in the premises 102a to communicate with the local office 103 and other devices beyond the local office 103. The gateway 111 may be a set-top box (STB), digital video recorder (DVR), computer server, or any other desired computing device.
The gateway 111 may also include (not shown) local network interfaces to provide communication signals to requesting entities/devices in the premises 102a, such as display devices 112 (e.g., televisions), additional STBs 113, personal computers 114, laptop computers 115, wireless devices 116 (e.g., wireless routers, wireless laptops, notebooks, tablets and netbooks, cordless phones (e.g., Digital Enhanced Cordless Telephone - DECT phones), mobile phones, mobile televisions, personal digital assistants (PDA), etc.), landline phones 117 (e.g., Voice over Internet Protocol - VoIP
phones), and any other desired devices. Examples of the local network interfaces include Multimedia Over Coax Alliance (MoCA) interfaces, Ethernet interfaces, universal serial bus (USB) interfaces, wireless interfaces (e.g., IEEE 802.11, IEEE 802.16), analog twisted pair interfaces, Bluetooth interfaces, and others.
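As a sketch of the trigger detection role attributed to the interface 120 above, an application might scan received content segments for embedded trigger markers and emit detection records for upstream reporting. The segment dictionary layout here is purely an assumed example; the disclosure does not specify a wire format for triggers.

```python
def extract_triggers(segments):
    """Illustrative only: scan content segments for embedded trigger
    markers and yield detection records suitable for reporting upstream
    on the same channel the primary content arrives on."""
    for seg in segments:
        marker = seg.get("trigger")  # assumed field; absent in most segments
        if marker is not None:
            yield {
                "content_id": seg["content_id"],
                "trigger_id": marker,
                "timestamp": seg["ts"],
            }
```

An interface would combine this with its reporting decision: detect every trigger, but only generate a detection signal when it has been selected into the reporting subset.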
[27] FIG. 2 illustrates general hardware elements that can be used to implement any of the various computing devices discussed herein. The computing device 200 may include one or more processors 201, which may execute instructions of a computer program to perform any of the features described herein. The instructions may be stored in any type of computer-readable medium or memory, to configure the operation of the processor 201. For example, instructions may be stored in a read-only memory (ROM) 202, random access memory (RAM) 203, removable media 204, such as a Universal Serial Bus (USB) drive, compact disk (CD) or digital versatile disk (DVD), floppy disk drive, or any other desired storage medium. Instructions may also be stored in an attached (or internal) hard drive 205. The computing device 200 may include one or more output devices, such as a display 206 (e.g., an external television), and may include one or more output device controllers 207, such as a video processor. There may also be one or more user input devices 208, such as a remote control, keyboard, mouse, touch screen, microphone, etc. The computing device 200 may also include one or more network interfaces, such as a network input/output (I/O) circuit 209 (e.g., a network card) to communicate with an external network 210. The network input/output circuit 209 may be a wired interface, wireless interface, or a combination of the two. In some embodiments, the network input/output circuit 209 may include a modem (e.g., a cable modem), and the external network 210 may include the communication links 101 discussed above, the external network 109, an in-home network, a provider's wireless, coaxial, fiber, or hybrid fiber/coaxial distribution system (e.g., a DOCSIS
network), or any other desired network.
[28] The FIG. 2 example is a hardware configuration. Modifications may be made to add, remove, combine, divide, etc. components of the computing device 200 as desired.
Additionally, the components illustrated may be implemented using basic computing devices and components, and the same components (e.g., processor 201, ROM
storage 202, display 206, etc.) may be used to implement any of the other computing devices and components described herein. For example, the various components herein may be implemented using computing devices having components such as a processor executing computer-executable instructions stored on a computer-readable medium, as illustrated in FIG. 2. Some or all of the entities described herein may be software based, and may co-exist in a common physical platform (e.g., a requesting entity can be a separate software process and program from a dependent entity, both of which may be executed as software on a common computing device). Additionally, the computing device 200 may include a supplemental content manager 201a, which can perform the various methods for realizing synchronization of the second screen content with the primary content described herein as a replacement for, or an augmentation to, any other processor 201 that the computing device 200 may include. That is, the supplemental content manager 201a may include a separate processor and/or set of computer-executable instructions stored on a computer-readable medium that, when executed by a processor, cause the processor (or the computing device 200 as a whole) to perform the various methods of the present disclosure, such as processing detection signals, monitoring detection signals, generating sampling information and multipliers, and generating synchronization signals. The supplemental content manager 201a may also include secure memory (not shown), which can store the various sampling information, multipliers, algorithms, and zone information described herein. The secure memory can be any desired type of memory, and can have enhanced security features to help restrict access (e.g., can only be accessed by the supplemental content manager 201a, can be internal to the supplemental content manager 201a, etc.).
Where the supplemental content manager 201a includes a separate set of computer-executable instructions, these instructions may be secured such that only authorized users may be allowed to modify, augment, or delete them.
[29] In some embodiments, the supplemental content manager 201a may be implemented as an application specific integrated circuit (ASIC). That is, the supplemental content manager 201a may be a chip designed specifically for performing the various processes described herein. Further, the ASIC may be implemented within or in communication with various computing devices provided herein.
[30] One or more aspects of the disclosure may be embodied in computer-usable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other data processing device. The computer executable instructions may be stored on one or more computer readable media such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. As will be appreciated by one of skill in the art, the functionality of the program modules may be combined or distributed as desired in various embodiments. In addition, the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), and the like. Particular data structures may be used to more effectively implement one or more aspects of the disclosure, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
[31] FIG. 3 is a diagram showing an example system architecture 300 on which various features described herein may be performed. The system 300 of FIG. 3 depicts a local office 103, a first premises 102a, a second premises 102b, one or more content distribution networks (CDN) 310 and 320, a network 330, and a second screen experience management computing device (e.g., a server) 340. As shown in FIG.
1, the local office 103 may connect to the first premises 102a and second premises 102b via links 101. The first premises 102a may include an interface 120 (e.g., a set top box), a
first screen device 301 (e.g., a television, a monitor, a projector, etc.), and one or more second screen devices 302 (e.g., a smartphone, tablet, laptop, etc.). As shown in FIG. 3, multiple users A and B may be located at the first premises 102a and each user may operate a second screen device 302 while consuming content via the first screen device 301. Meanwhile, the second premises 102b may include an interface 120, a first screen device 301, and a second screen device 302 used by a user C. Content, such as video content, may be transmitted from the local office 103 to the interfaces 120 of the first and second premises 102a-b, and presented through the first screen devices 301.
Thus, users A and B may consume content (e.g., view a stream of a video program that is multicast according to a schedule, or transmitted on demand, or playing back content that is locally recorded at the device 301 or an associated device such as at a DVR) at the premises 102a and user C may consume content at the premises 102b. Notably, while consuming content, each user may operate a respective second screen device 302 to consume second screen content related to the primary content presented on the first screen device 301 at their premises 102. For example, user A may operate a second screen device 302, such as a smartphone, to consume second screen content, such as a poll through which user A may vote for a contestant shown in the primary content presented on the first screen device 301. The second screen content may be any data that provides information or content to supplement primary content, which may be the video content (e.g., linear television program, on-demand movie, etc.) presented on a first screen device 301. For example, second screen content may include a link to a webpage of a product shown in an advertisement of the primary content, a video clip with bonus features, text and/or images with information about the content itself or about individuals or items shown in the primary content, advertisements, coupons, questions pertaining to the primary content, etc. The various second screen content may be generated by ordinary everyday consumers of the primary content, as well as by formal primary content sources. The appearance of the second screen content may be generated by the second screen device 302 using software that is previously stored, or it may be dynamically retrieved or received when it is needed, and the timing of when the second screen content appears (e.g., when a particular Internet link should appear, or when a particular image should be displayed) may be based on triggers or signals that are received along with the primary content stream.
[32] Referring to FIG. 3, users (e.g., people) may consume content at a premises 102a (e.g., a home, business, etc.). Consuming content may include, for example, watching and/or listening to a television program or internet video on a first screen device 301 (e.g., a television, computer monitor, projector, etc.). The first screen device 301 may receive the content from the interface 120, which is connected to the local office 103 and configured to accept the primary content. FIG. 3 also illustrates some examples of second screen devices 302, namely a smartphone and a laptop computer. Each second screen device 302 may be configured to store and/or execute a second screen experience application (e.g., a computer program) through which a user may select and consume second screen content. The second screen application may be downloaded from the local office 103 or another computing device 200 on the network 330, or retrieved from a computer readable medium (e.g., compact disc (CD), flash drive, etc.). The second screen application may also be a web browser for navigating to a website that provides the second screen experience described herein. Although FIG. 3 shows some example second screen devices 302, many other devices may be used as second screen devices 302. Indeed, even another television, similar in configuration to a first screen device 301, may be used as the second screen device 302. The second screen device 302 may also be a specially designed device (e.g., an enhanced television remote) for specific use in the embodiments disclosed herein.
[33] Further, each of the second screen devices 302 may be configured to bi-directionally communicate via a wired and/or wireless connection with the second screen experience management computing device 340 via the network 330. Specifically, the second screen devices 302 may be configured to access the network 330 (e.g., the Internet) to obtain second screen content and to transmit/receive signals via the network 330 to/from the
second screen experience management computing device 340. For example, a second screen device 302 may transmit information, such as requests for second screen content, through a wired connection, including the links 101 through which the primary content is supplied to a first screen device 301, to the local office 103 which then routes the transmission to the network 330 so that it may eventually reach the second screen experience management computing device 340. That is, the second screen device may connect to the interface 120 and communicate with the second screen experience management computing device 340 over the links 101 used to transmit the primary content downstream. Alternatively, a second screen device 302 may wirelessly communicate via, for example, a WiFi connection and/or cellular backhaul, to connect to the network 330 (e.g., the Internet) and ultimately to the second screen experience management computing device 340. Accordingly, although not shown, the network may include cell towers and/or wireless routers for communicating with the second screen devices 302.
[34] Although FIG. 3 depicts the second screen experience management computing device 340 as being separate from the local office 103, in some embodiments, the second screen experience management computing device 340 may be located at the local office 103. In such embodiments, the second screen devices 302 may still access the second screen experience management computing device 340 through the network 330. Further, even though the second screen experience management computing device 340 is shown as a single element, in some embodiments, it may include a number of computing devices 200, which may include the supplemental content manager 201a.
[35] Still referring to FIG. 3, the local office 103 may be a computing device, a termination system, node, etc. within the system architecture 300. The local office 103 may include a router 305, and a database 306 for storing user information (e.g., user profiles), primary content, second screen content, and/or computer-executable instructions for inserting triggers, transmitting multipliers, or any of the steps described herein. The router 305 of the local office 103 may forward requests for content from users and/or user devices (e.g., first screen devices 301, second screen devices 302, etc.) to one or more CDNs 310 and 320 and/or the second screen experience management computing device 340 that may supply the requested content and/or synchronization signals. Each of the CDNs 310 and 320 may include one or more routers 311 and 321, whose purpose is to receive requests from users (e.g., via their local offices) and route them to servers within its network that may store the requested content and be able to supply it in response to the request. A CDN 310 for a given piece of content might have a hierarchy of one primary source, and a plurality of lower-level servers that can store (e.g., cache) the content and respond to requests. The lower-level servers that ultimately service the request may be referred to as edge servers, such as one or more edge servers 312 and 322. The various servers may include one or more content databases 313 and 323, which store content that the respective CDN 310 and 320 manages. In some embodiments, the CDNs 310 and 320 may provide the same or similar content. In other embodiments, the CDNs 310 and 320 may offer different content from one another. Also, the CDNs 310 and 320 may be maintained/operated by the same or different content providers. Although only two CDNs 310 and 320 are shown, many CDNs may be included in the system architecture 300 of FIG. 3.
[36] FIG. 4 is a diagram illustrating an aspect of the present disclosure.
Specifically, FIG. 4 illustrates that a network, such as a service provider network including a plurality of interfaces 120 located at various premises 102, may be separated into a plurality of groups or zones (e.g., Zone 1, Zone 2, Zone 3, etc.). Each group or zone may include a plurality of interfaces 120 (e.g., set top boxes). The number of interfaces 120 in each group or zone may differ. In some examples, the zones may cover a specific geographical region. For example, the zones may correspond to zip codes or area codes.
In other examples, each zone may include multiple zip codes or area codes.
Conversely, multiple zones may make up a single zip code or area code. The zone groupings of interfaces may be done geographically, to group together interfaces that are in the same neighborhood, or served by the same local office 103 or trunk line. The zone groupings may also be determined based on signal propagation delay. For example, all of the interfaces that experience a 1500ms delay in receiving a downstream signal from the server 340 may be grouped in one group (regardless of geography), and all of the interfaces that experience a 2500ms delay may be grouped in another group.
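The delay-based grouping described above amounts to a simple bucketing step. The following Python sketch is illustrative only; the function name, bucket width, and data shapes are assumptions, since the disclosure does not prescribe a particular algorithm:

```python
from collections import defaultdict

def group_by_delay(delays_ms, bucket_ms=500):
    """Group interface IDs into zones by measured downstream propagation
    delay, quantized into bucket_ms-wide buckets (hypothetical helper)."""
    zones = defaultdict(list)
    for iface, delay in delays_ms.items():
        # Interfaces whose delays fall in the same bucket share a zone,
        # regardless of geography.
        zones[delay // bucket_ms].append(iface)
    return dict(zones)

# Interfaces with ~1500ms delays land in one zone; ~2500ms in another.
delays = {"stb-1": 1500, "stb-2": 1520, "stb-3": 2500, "stb-4": 2510}
zones = group_by_delay(delays)
```

Any quantization scheme would serve here; the essential point is that zone membership is derived from measured delay rather than from location.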
[37] The zones may also be defined based on the similarity in the content they receive. For example, members of each zone may receive the same primary content, including identical advertisements, that is received by other members in the zone. In other words, the local office 103 may deliver the same primary content to each of the interfaces 120 in the same zone so that users of the interfaces 120 are exposed to the same advertisements.
Interfaces 120 in different zones may receive different content. In particular, the advertisements delivered to the interfaces 120 of different zones may be different. For example, all of the interfaces 120 in Zone 1 may receive a television program with an advertisement for a car during a commercial break, while all of the interfaces 120 in Zone 2 may receive the same television program but the advertisement at the same commercial break may be for a clothing store. In this example, the television program is the same in both zones, though in other cases the television programs themselves may also differ. Further, the frequencies of the channels used to transmit the primary content may vary among the different zones. For example, in Zone 1, the primary content provided by NBC may be transmitted at a first frequency (e.g., 552 MHz), while in Zone 2, the primary content provided by NBC may be transmitted at a second frequency (e.g., 750 MHz). Similarly, the logical channels (e.g., channel 25) used to transmit the primary content may vary among the different zones. For example, in Zone 1, the primary content provided by NBC may be on channel 24, while in Zone 2, the primary content provided by NBC may be on channel 25.
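The zone-dependent frequencies and logical channels in the example above amount to a per-zone lineup lookup. A minimal sketch, assuming a hypothetical table keyed by zone and service name (the values mirror the example figures given in the text):

```python
# Hypothetical per-zone lineup: the same service may be carried on a
# different frequency and logical channel in each zone.
LINEUP = {
    "Zone 1": {"NBC": {"freq_mhz": 552.0, "logical_channel": 24}},
    "Zone 2": {"NBC": {"freq_mhz": 750.0, "logical_channel": 25}},
}

def tuning_params(zone, service):
    """Return the (frequency in MHz, logical channel) an interface in
    `zone` should tune to for `service` (illustrative names only)."""
    entry = LINEUP[zone][service]
    return entry["freq_mhz"], entry["logical_channel"]
```

An interface in Zone 1 and an interface in Zone 2 would thus tune differently to reach the same service.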
[38] It should be understood that interfaces 120 in different zones may also receive the same primary content including the same advertisements; however this might not always be the case. Further, while the content delivered to each of the interfaces 120 in the same zone may be the same, it should be understood that the precise time at which the content arrives at the interfaces 120 in the same zone may be different. For example, there may be delays in transmitting the content such that one interface in Zone 1 receives the content at a first time, whereas another interface in Zone 1 receives the same content at a second time a few seconds later.
[39] FIG. 5 is a high-level diagram showing an example system architecture 500 on which various features described herein may be performed. The system 500 may include a number of premises 102a, 102b, 102c, a local office 103, a network 530, a second screen content server 541, and a synchronization manager 542. At each of the premises there may be an interface 120a, 120b, 120c, a first screen device 301a, 301b, 301c, and a second screen device 302a, 302b, 302c. As shown in FIG. 5, the premises 102a and 102b may belong to the same zone Z. Accordingly, the interfaces 120a and 120b may receive the same primary content C1. Meanwhile, premises 102c, which is not within the same zone Z, may receive different primary content C2. Herein, the primary content C1 and primary content C2 may refer to the content shown on the first screen device 301, and may include, for example, both television programs and commercials. It should also be understood that the primary content C1 and primary content C2 may include similar content, but may be different in terms of their advertisement content or may be time shifted from one another. For example, the primary content C1 and primary content C2 may include the same television program at the same time on the same television channel, but have different advertisements during the commercial breaks. As another example, the primary content C1 may include a live television show (e.g., a football game) while the primary content C2 may include the same live television show but shifted in time so that there is a delay between when a user interface 120a receives the primary content C1 and when a user interface 120c receives the primary content C2.
[40] As mentioned, FIG. 5 shows the same content C1 being received by interfaces 120a and 120b. However, this is just one example scenario intended to illustrate that two interfaces (e.g., 120a and 120b) within the same zone Z may receive the same content at approximately the same time if they are tuned to the same channel. It should be understood that the interfaces 120a and 120b may still receive different content (e.g., different television shows) if they are tuned to different channels. In other words, each interface 120 may tune to whatever particular piece of content is desired from among the entire collection of primary content C1 or C2 that it receives.
[41] The primary content C1 and C2 (including, e.g., programming content and advertising content) may include triggers embedded within the content, or sent contemporaneously with the primary content, such as in a different logical data stream. The same content may have the same triggers. For example, primary content C1 delivered to interface 120a may include the same triggers as the primary content C1 delivered to the interface 120b.
Meanwhile, the primary content C2 may have different triggers than the primary content C1. For example, even where the primary content C1 and primary content C2 are similar (e.g., where C1 and C2 both include the same television show), the primary content C1 may have different commercials than the primary content C2, and therefore, the triggers in the different commercials may be different. These triggers may be embedded by content creators (not shown) at the time the primary content C1, C2 is created or by the local office 103 (or other content providers) before delivering the primary content to the appropriate interfaces 120. The triggers may be embedded within, or synchronized with, the content C1, C2 at constant or variable intervals. For example, the content C1, C2 may include a different trigger every five minutes. The triggers may include information identifying the associated primary content C1, C2 or any other information, such as information for supplementing the primary content C1, C2. The triggers also may include a unique or random identifier that may allow the trigger to be identified. Various formats for the triggers may be used. For example, the triggers may use the Enhanced TV Binary Interchange Format (EBIF) or comply with standards for digital program insertion of the Society of Cable Telecommunications Engineers (SCTE), such as SCTE 35, and may be sent in a data stream in synchronization with the primary content C1, C2. Alternatively, the trigger may be a vertical blanking interval (VBI) trigger embedded within the video stream of the primary content C1, C2 (e.g., using closed captioning fields). Further, in some embodiments, watermarks in the video and/or audio streams of the primary content C1, C2 may be used as triggers.
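The interval-based trigger placement described above (e.g., a different trigger every five minutes, each carrying a content identifier and a unique identifier) can be sketched as follows. The field names are assumptions; a real deployment would carry this information in EBIF or SCTE 35 messages alongside the stream rather than as Python dictionaries:

```python
import uuid

def make_triggers(content_id, duration_s, interval_s=300):
    """Generate trigger records at a constant interval (default: every
    five minutes) across a piece of primary content. Hypothetical
    record layout for illustration only."""
    triggers = []
    for offset in range(0, duration_s, interval_s):
        triggers.append({
            "content_id": content_id,          # identifies the primary content
            "offset_s": offset,                # position within the content
            "trigger_id": str(uuid.uuid4()),   # unique/random identifier
        })
    return triggers

# A 30-minute program with a trigger every 5 minutes yields 6 triggers.
trigs = make_triggers("C1-show-42", duration_s=1800)
```

Variable intervals would simply replace the fixed `interval_s` step with a list of offsets.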
[42] As shown in FIG. 5, the interface 120a may transmit a detection signal D
back upstream after detecting a trigger in the primary content C1. Although FIG. 5 shows the interface 120a receiving triggers and sending the detection signal D, another computing device associated with (or coupled to) the interface 120a may perform these functions. For example, the interface 120a might receive the triggers and another computing device associated with the interface 120a may generate and send the detection signal D. A
detection signal D may include information identifying a timing of when an interface 120 actually received an associated trigger, as well as, information identifying the portion of the primary content that was being consumed at the time the trigger was received and/or a trigger identifier. Notably, the interface 120b, which is in the same zone Z
and receives the same primary content Cl as the interface 120a, does not transmit a detection signal back upstream. This is because in the example of FIG. 5, the interface 120b was not selected to report the trigger. In other examples, the interface 120b may send the detection signal to report the trigger and the interface 120a might not. Still in other examples, both interfaces 120a and 120b may send detection signals to report the triggers. Since interface 120a might not know whether interface 120b will send the detection signal, and vice versa, both interfaces 120a and 120b might independently determine that they should send the detection signal to be sure that synchronization will be maintained. On the contrary, neither interface 120a and 120b may send the detection signal, and instead, synchronization of the corresponding second screen devices 302a and 302b might depend on a detection signal sent by another interface 120 (not shown in FIG.
5) within the same zone Z. The determination of whether an interface 120 should report the trigger may be made on a trigger-by-trigger basis. That is, for every trigger received, each interface 120 may determine whether it should transmit a detection signal to report the trigger.
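One way an interface could make this per-trigger reporting decision independently is sketched below. This is an assumption, not the disclosed method: the disclosure only says the determination may be made trigger by trigger, so a deterministic hash-based election is used here purely as an illustration of how reports could be spread across interfaces without coordination.

```python
import hashlib

def should_report(trigger_id, interface_id, one_in=4):
    """Per-trigger election: hash the trigger and interface identifiers
    together so the decision is deterministic for this pair but varies from
    trigger to trigger. On average, one interface in `one_in` reports any
    given trigger, conserving upstream bandwidth."""
    digest = hashlib.sha256(f"{trigger_id}:{interface_id}".encode()).digest()
    return digest[0] % one_in == 0
```

With many interfaces in a zone, it becomes very likely that at least one elects to report each trigger, which is all the zone needs for synchronization to be maintained.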
[43] Further, although the interface 120b in the example of FIG. 5 does not report the trigger, the second screen device 302b may still be synchronized with the primary content C1 streamed on the first screen device 301b because the second screen device 302b may benefit from the detection signal sent by the interface 120a, as described in more detail below. As long as there is another interface 120 within the same zone sending a detection signal in response to the primary content presented on the first screen device 301b, the second screen device 302b may be synchronized with the first screen device 301b without the interface 120b having to send a detection signal. In light of the above, one of ordinary skill in the art should realize the advantages this system may offer in terms of bandwidth. Because the interfaces 120a and 120b are not both sending detection signals upstream, the upstream bandwidth over the links 101 connecting the local office 103 to the interfaces 120 may be conserved.
[44] Additionally, one should understand that the interface 120c, which is in a different zone from interface 120a, might not benefit from the detection signal sent by the interface 120a. Because the primary content C2 may be different from the primary content C1 (e.g., primary content C2 may include different advertisements inserted into the programming content), the detection signal sent in response to the trigger in the content C1 might not help to synchronize second screen content on the second screen device 302c with the primary content on the first screen device 301c.
[45] Still referring to FIG. 5, the detection signal D may be transmitted to the synchronization manager 542. The synchronization manager 542 may be configured to receive detection signals D and generate and transmit synchronization signals S1, S2 to second screen devices 302. Each synchronization signal S1, S2 may include a number of signals transmitted in series in response to a single detection signal D. In some embodiments, the synchronization manager 542 may identify detection signals D and determine what primary content and/or what triggers are associated with the detection signals D. The synchronization manager 542 may then generate a specific synchronization signal S1 based on this determination so that second screen devices 302 can synchronize second screen content SSC with the primary content C1. For example, where the detection signal D is sent in response to a trigger embedded in an advertisement for a car, the synchronization manager 542 may generate and transmit a synchronization signal S1 to second screen devices 302a and 302b indicating that the interfaces 120a and 120b have recently outputted the advertisement for the car. As a result, the second screen devices 302a and 302b may synchronize second screen content SSC to display a portion of the second screen content SSC related to the car in synchronization with the primary content C1, which recently included the advertisement for the car. The synchronization signal S1 may include information identifying the primary content and/or information indicating a timing of primary content being presented, a timing of supplemental content to be presented, a portion of primary content being presented, and/or a portion of supplemental content to be presented.
[46] The synchronization signal S1 may be a multicast signal pushed to a plurality of second screen devices 302 listening for multicast signals. Each second screen device 302 may then determine whether the synchronization signal S1 is related to the second screen content SSC it is presenting. For example, where the interface 120a is streaming one television show and the interface 120b is streaming another television show, the synchronization signal S1, multicast to both the second screen device 302a and second screen device 302b, might only be received and used by the second screen device 302a because the detection signal D was sent from interface 120a associated with the second screen device 302a. In this example, the second screen device 302b might ignore the multicast synchronization signal S1 because its associated interface 120b is not streaming the same television show. Where the synchronization signal S1 is a multicast signal, it may be sent to all second screen devices 302 in the same zone, or all second screen devices 302 regardless of zones (the latter case is not depicted in FIG. 5).
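The filtering step performed by a listening device can be sketched in a few lines. The field names here are hypothetical stand-ins for whatever the multicast signal actually carries:

```python
def is_relevant(sync_signal, device_state):
    """A device listening for multicast synchronization signals compares the
    content identifier carried in the signal against the primary content its
    associated interface is presenting, and ignores non-matching signals."""
    return sync_signal["content_id"] == device_state["tuned_content_id"]

s1 = {"content_id": "C1", "segment": 20}
used = is_relevant(s1, {"tuned_content_id": "C1"})     # device 302a: uses S1
ignored = is_relevant(s1, {"tuned_content_id": "C3"})  # device 302b: ignores S1
```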
[47] Alternatively, the synchronization signal S1 may include a plurality of signals that are selectively pushed to particular second screen devices 302 that have identified themselves as presenting (or desiring to present) related second screen content SSC. For instance, each time a second screen device 302 is directed to present different second screen content SSC, the second screen device 302 may send a signal to the synchronization manager 542 informing the synchronization manager 542 of the second screen content that it is presenting. As a result, the synchronization manager 542 may learn which second screen devices 302 should be sent synchronization signals related to such second screen content SSC.
[48] The synchronization signal S2 may be different from the synchronization signal S1 because the second screen device 302c is associated with the interface 120c in a different zone than the zone Z which includes interfaces 120a and 120b. The synchronization signal S1 might not be useful to the second screen device 302c because the content C2 received by the interface 120c is not necessarily the same as the content C1.
The synchronization signal S2 may be transmitted by the same synchronization manager 542 that transmits synchronization signal S1. However, the synchronization manager may transmit the synchronization signal S2 in response to a detection signal (not shown) different from detection signal D, transmitted to the synchronization manager 542 by another interface 120 (not shown) in the same zone as the interface 120c.
Also, although not shown in FIG. 5, the synchronization signal S2 may be sent to other interfaces 120 within the same zone as interface 120c.
[49] Further, FIG. 5 illustrates the paths of transmission of second screen content SSC.
Although FIG. 5 illustrates that second screen devices 302 receive second screen content SSC from the second screen content server 541 via the network 530, in other examples, the second screen content SSC may be received from a plurality of sources via a plurality of other networks. The second screen content server 541 may be configured to transmit the second screen content SSC to one or more of the second screen devices 302 via the network 530, which may include a cellular backhaul, the Internet, and/or additional local and wide area networks. The second screen content SSC may be multicast or unicast to the second screen devices 302. In some cases, the second screen devices 302 may first transmit a request (not shown) to the second screen content server 541 for specific second screen content SSC. For example, when a user, via an application on a second screen device 302, directs the second screen device 302 to present specific second screen content SSC, the second screen device 302 may send a request identifying the desired second screen content to the second screen content server 541. In response to this request, the second screen content server 541 may transmit the second screen content SSC
back to the second screen device 302. In some embodiments, the request for second screen content may simply refer to the primary content for which second screen content is desired and the second screen content server 541 may detect what second screen content SSC
to supply in response. As shown in FIG. 5, the same second screen content SSC may be sent to second screen devices 302 in different zones. That is, there may be a single file of second screen content associated with a particular piece of primary content, and therefore, the job of synchronizing that second screen content may be left up to the second screen devices 302.
[50] Once a second screen device 302 receives second screen content SSC and a synchronization signal S1, S2, the second screen device 302 may perform synchronization. Specifically, the second screen device 302 may use a received synchronization signal S1, S2 to determine which segment or time point in the second screen content SSC to present. For example, if there are 30 segments of the second screen content SSC and the synchronization signal S1 indicates that the 20th segment should be presented, the second screen device 302 will analyze the second screen content SSC to present the 20th segment. As a result, the second screen content SSC
presented on the second screen device 302 may be synchronized with the primary content C1, C2 on the associated first screen device 301. This synchronization is illustrated by the dashed lines connecting the second screen devices 302 with their associated first screen devices 301.
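The 20th-of-30-segments example above amounts to a simple indexed lookup. A minimal sketch, with hypothetical names, including a guard for a signal that points outside the content:

```python
def select_segment(ssc_segments, sync_signal):
    """Return the segment of second screen content indicated by the
    synchronization signal (1-based index, as in the 20th-of-30 example)."""
    index = sync_signal["segment"]
    if not 1 <= index <= len(ssc_segments):
        raise ValueError("synchronization signal points outside the SSC")
    return ssc_segments[index - 1]

segments = [f"segment-{i}" for i in range(1, 31)]  # 30 segments of SSC
current = select_segment(segments, {"segment": 20})  # presents "segment-20"
```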
[51] While FIG. 5 shows the second screen content SSC and the synchronization signals S1, S2 as separate signals from each other, in some embodiments, they may be combined. In other words, instead of sending both a synchronization signal S1 and second screen content SSC, a combined signal may be sent that includes the second screen content SSC
time shifted according to the synchronization signal S1. For example, in response to the detection signal D sent by the interface 120a, the particular segment of second screen content SSC corresponding to the primary content C1 presented on the first screen device 301a may be transmitted down to the second screen device 302a. In this manner, the synchronization may be considered to have occurred in the cloud, and the second screen device 302 may present the synchronized second screen content without having to perform the synchronization itself. Further, it should be understood that when the second screen content SSC and synchronization signals S1, S2 are separate, the order in which they are received by the second screen device 302 may vary. That is, in some cases a second screen device may receive the second screen content SSC before the synchronization signal, and in other cases it may be reversed. In the reverse scenario, the second screen device 302 may buffer the synchronization signal until the second screen content SSC is acquired and then use the synchronization signal to perform synchronization.
[52] In accordance with the above, second screen devices 302 may synchronize second screen content SSC with linear primary content based on synchronization signals S1, S2 that are generated in response to detection signals D sent from other premises 102. In comparison, second screen devices 302 attempting to synchronize second screen content SSC with time-shifted primary content (e.g., video on-demand content) might only use synchronization signals S1, S2 that are generated in response to detection signals sent from an interface 120 at the same premises 102. Alternatively, second screen devices 302 may synchronize second screen content SSC with time-shifted primary content using audio recognition processes to analyze the audio of the time-shifted primary content or using triggers received, via a wired or wireless connection, directly from the interface 120 at the same premises 102. Accordingly, the system 500 may differentiate interfaces 120 presenting linear primary content from interfaces 120 presenting time-shifted programming content. For example, the synchronization manager 542 may be configured to determine whether a received detection signal D is generated in response to consumption of time-shifted primary content. If so, the synchronization manager 542 may transmit a unicast synchronization signal to the particular second screen device 302 being used to present second screen content SSC in synchronization with the time-shifted primary content.
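The linear-versus-time-shifted routing rule described in this paragraph can be sketched as follows. The data shapes are assumptions for illustration; the disclosure specifies only the behavior (zone-wide signals for linear content, a unicast signal for time-shifted content), not any particular representation:

```python
def route_sync_signal(detection, zone_members):
    """Linear content: multicast a synchronization signal to every second
    screen device in the reporting interface's zone, since they all receive
    the same primary content. Time-shifted content: unicast only to the
    device at the reporting premises, since no other premises shares its
    playback position."""
    if detection["time_shifted"]:
        return {"mode": "unicast", "targets": [detection["device_id"]]}
    return {"mode": "multicast", "targets": zone_members[detection["zone"]]}

zones = {"Z": ["302a", "302b"]}
linear = route_sync_signal(
    {"time_shifted": False, "zone": "Z", "device_id": "302a"}, zones)
vod = route_sync_signal(
    {"time_shifted": True, "zone": "Z", "device_id": "302c"}, zones)
```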
[53] FIG. 6 is a flow diagram illustrating an example method of the present application in which a second screen device 302 may be initially configured so that it can interface with the system of the present disclosure.
[54] In step 601, a user may use a web browser to access a web portal. This can be done using the second screen device 302 or another device with Internet access. Once at the web portal, a user may create an account in step 602. This may be done by entering a username and/or a password. By creating an account, a user may subscribe to a service of the present disclosure that provides a synchronized second screen experience.
[55] In step 603, a user can register his/her particular second screen device 302 with the service of the present disclosure so that synchronization signals and second screen content may be received by the second screen device 302. During this registration process, the user may associate a default interface 120, and therefore a default zone, with the second screen device 302 and/or with a username used to log into a second screen experience application on the second screen device 302. For example, a user in FIG. 5 might associate second screen device 302a with interface 120a. To accomplish this, a user may enter an interface 120 identifier (e.g., set top box serial number) or service provider account number into a second screen experience application of a second screen device 302, which may store the entered information on the second screen device 302 itself or in a computing device on the network 530. As a result, when the second screen device 302 is subsequently used to consume supplemental content, it can present the supplemental content for the appropriate zone based on the stored interface 120 identifier and/or service provider account number. It is contemplated that certain users might desire to synchronize second screen content on their second screen device 302 with primary content at a particular interface 120 (e.g., an interface 120 in their home) more often than other interfaces 120, so that the second screen device 302 may receive an appropriate synchronization signal. However, a user may later change the associated interface or account number, which may be desirable when the user's residence changes, or temporarily override the association, which may be desirable when the user is consuming supplemental content at another person's home. During step 603, a user may associate a number of second screen devices 302 with the same interface 120.
Alternatively, a username may be associated with an interface 120 so that any second screen device 302 running a second screen application which has been logged into using that username may be associated with that interface 120.
[56] At the completion of step 603, an application on the second screen device 302 may have stored an interface 120 identifier. Thus, when the second screen device 302 subsequently sends a request for content, a computing device, such as the second screen content server 541 or the synchronization manager 542, may determine where (e.g., in which zone) the second screen device 302 is located. As a result, the second screen device 302 may receive the appropriate synchronization signals S1, S2 or synchronized second screen content.
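The network-side zone lookup described here reduces to a registry keyed by the stored interface identifier. The identifiers and mapping below are hypothetical; the disclosure does not specify how the registry is stored:

```python
# Hypothetical registry populated during registration (step 603):
# interface identifier (e.g., a set top box serial number) -> zone.
INTERFACE_ZONES = {"STB-120a": "Z", "STB-120b": "Z", "STB-120c": "Z2"}

def zone_for_device(stored_interface_id):
    """Resolve the zone for a second screen device from the interface
    identifier it stored at registration time, so the server can pick the
    appropriate synchronization signal (S1 vs. S2)."""
    return INTERFACE_ZONES[stored_interface_id]
```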
[57] In step 604, a user may designate the type of device so different signal formats can be sent depending on the device. Step 604 may also allow the user to enter other user preferences, such as preferences related to types of supplemental content (e.g., coupons, bonus videos, trivia questions, etc.) or parental controls to restrict access to certain second screen content. It is contemplated that different users consuming the same primary content C1, C2 within the same zone (and even through the same interface 120) may wish to receive different supplemental content. In other words, the second screen experience may be customized for users by providing a variety of supplemental content for any given piece of primary content C1, C2. Accordingly, the user preferences received in step 604 may be used by second screen devices 302 to control what supplemental content is received and/or presented. For example, a second screen device 302 may receive multiple versions of second screen content SSC for the same primary content and determine which version to present based on user preferences entered in step 604. Therefore, the user preferences may also be used to select which synchronization signal to use or control how synchronization is performed so that the presented second screen content SSC is synchronized with the primary content. Once preferences are submitted, a user profile associated with the account created in step 602 may exist. This user profile may be updated by returning to the web portal and editing the preferences.
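The preference-driven selection among multiple versions of second screen content might look like the sketch below. The preference fields (content types, a parental-control rating ceiling) are assumed for illustration:

```python
def choose_version(versions, prefs):
    """Filter the received versions of second screen content by the user's
    preferred content types and a parental-control rating ceiling; return
    the first acceptable version, or None if all are filtered out."""
    allowed = [v for v in versions
               if v["type"] in prefs["types"] and v["rating"] <= prefs["max_rating"]]
    return allowed[0] if allowed else None

versions = [
    {"id": "v1", "type": "trivia",  "rating": 1},
    {"id": "v2", "type": "coupons", "rating": 1},
]
prefs = {"types": {"coupons"}, "max_rating": 2}  # taste + parental control
chosen = choose_version(versions, prefs)  # selects the coupons version
```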
[58] FIG. 7 illustrates a process of operating a second screen device 302 to realize a second screen experience. In particular, FIG. 7 describes steps that a second screen device 302 may perform to synchronize second screen content presented thereon with primary content presented on a first screen device 301. By way of these steps, a user may control a second screen device 302 so that the user may consume second screen content in synchronization with primary content that the user simultaneously consumes via a first screen device 301.
[59] The process of FIG. 7 may be performed after the process of FIG. 6 for registering the second screen device 302 is performed. In step 701, the second screen device 302 is turned on. This may include turning on, for example, a tablet, laptop, smartphone, etc.
At step 702, the first screen device 301 may be turned on. When a first screen device 301 (e.g., a television, computer, etc.) is turned on, it may begin presenting content, such as a video program (e.g., a television program). In some cases, the second screen device 302 may be used to turn on the first screen device 301.
[60] With the second screen device 302 turned on, synchronicity between the supplemental content on the second screen device 302 and the primary content on the first screen device 301 may be calibrated at step 703. For example, calibration may be performed using the second screen device 302 to account for a propagation delay of the primary content to the interface 120 and/or first screen device 301 or of the synchronization signal to the second screen device 302. The propagation delay may be a time elapsing from a time that the primary content is transmitted from, for example, the local office 103 to a time that the primary content reaches the first screen device 301 and/or interface 120, or a time elapsing from a time that a synchronization signal is transmitted from, for example, the synchronization manager 542 to a time that it reaches the second screen device 302.
In some situations, primary content may be received by different interfaces 120 at different times although it is transmitted to the interfaces 120 at the same time. Also, second screen devices 302 within the same zone may receive synchronization signals at different times. To carry out calibration to account for such propagation delays, a second screen device 302 may connect to the interface 120 and/or first screen device 301 via a wired and/or wireless connection. Through the connection, the first screen device 301 and/or interface 120 may inform the second screen device 302 of the propagation delay.
Specifically, the interface 120 and/or first screen device 301 may compute a propagation delay by comparing a time that primary content was transmitted with a time that the primary content was received and provide this information to the second screen device 302. Calibration may also be performed by a second screen device 302 by comparing a time when a dummy synchronization signal is transmitted to a time that it is received.
This information may be used when presenting the second screen content SSC according to a synchronization signal S1, S2. In some embodiments, a test may be performed to achieve calibration. For example, a user may make a selection on the second screen device 302 in response to some event (e.g., the start of a television program, start/end of a commercial break, etc.) associated with the primary content presented on the first screen device 301, which may allow the second screen device 302 to compute a propagation delay. Although the calibration at step 703 is shown as occurring after step 702 and before step 704, it should be understood that the calibration may take place at other times in the process and/or may be performed multiple times.
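The calibration arithmetic described in step 703 is a timestamp comparison plus an offset applied at presentation time. A minimal sketch with hypothetical names and illustrative timestamp values:

```python
def propagation_delay(sent_at, received_at):
    """Delay computed by comparing the time a signal (e.g., a dummy
    synchronization signal, or the primary content itself) was transmitted
    with the time it was received."""
    return received_at - sent_at

def adjusted_presentation_time(signal_time, delay):
    """Shift the second screen presentation by the measured delay so the
    SSC lines up with what the first screen is actually showing."""
    return signal_time + delay

# Illustrative values: a signal sent at t=100.0 s arrives at t=100.8 s,
# so subsequent presentations are shifted by 0.8 s.
delay = propagation_delay(sent_at=100.0, received_at=100.8)
```

Repeating the measurement, as the passage allows, would let the device track delays that drift over time.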
[61] At step 704, a user may operate the second screen device 302 to execute an application that is designed to provide an interface for users to consume second screen content SSC.
In some examples, simply turning on the second screen device 302 may trigger the application to run, and thus, provide a second screen experience. The application may be stored on the second screen device 302. Once the application begins to run, a user may log into a second screen experience service. Logging in may include providing a username and/or password or other identification information. By logging in to the second screen experience, a user may enter the second screen device 302 into a service network so that the correct synchronization signals S1, S2 and/or second screen content SSC may be provided to the second screen device 302.
[62] In step 705, a user may select second screen content SSC that the user wishes to download and/or consume. Specifically, a user may operate the application to specify second screen content SSC. The application may provide the user with a listing of second screen content SSC available for consumption. The listing may be sorted based on the primary content C1, C2 it is associated with. Accordingly, a user may enter or otherwise identify which primary content they are viewing in order to select the second screen content they would like to consume. That is, based on the selection of the primary content C1, C2 being consumed, the corresponding second screen content SSC may be selected. The second screen device 302 may communicate with the interface 120 and/or
first screen device 301 to identify this selection. Further, where the second screen device 302 is used to turn on the first screen device 301 or tune the first screen device 301 to a particular channel or service, the second screen device 302 may automatically (e.g., without further user input) select the appropriate second screen content SSC
based on the identification of the selected channel or service. Also, there might not be a requirement that the user is consuming the primary content C1, C2 associated with the second screen content SSC that the user selects. In other words, a user may select second screen content SSC even though he/she is not consuming the associated primary content C1, C2.
Another optional feature of the application may be the ability to provide a preview of the second screen content SSC that is available for download.
[63] Step 706 includes determining whether the location of the second screen content SSC is to be changed. As mentioned with respect to FIG. 6, the second screen device 302 may be registered with a specific interface 120 and/or location. Specifically, the second screen device 302 may store a location setting for identifying the location (e.g., zone) for which the second screen device 302 will receive second screen content SSC.
Because a user may frequently use the second screen device 302 to consume second screen content at his/her default location (e.g., a home), the location may be set to the default location whenever the user runs the application. The default location may be associated with a default zone, and therefore, the second screen device 302 may assume it is in the default zone unless it is instructed otherwise. However, it is contemplated that the second screen device 302 may be used at a location other than its default location. For example, a user may use his/her second screen device while at a friend's home in a different zone than the user's default zone, and therefore, may desire that his/her second screen content SSC be synchronized with the primary content C1, C2 delivered to the friend's home.
In this case, a user may choose to change the location/zone setting on his/her second screen device 302. In some examples, the different location may be automatically detected using the second screen device's global positioning system (GPS) receiver or using location information obtained via a network (e.g., a cellular network or the Internet). In other examples, the second screen device 302 may detect that it is in a different location by communicating with the interface 120 at the different location. That is, the second screen device 302 may connect with the interface 120 at the different location, and through one or more communications with the interface 120, the second screen device 302 may be informed of its current location (e.g., zone).
[64] If it is determined that the second screen device 302 is in a different zone than its location setting currently indicates, the location setting may be updated at step 707.
After the proper zone is indicated in the location setting, step 708 may be performed to transmit a request for second screen content SSC. The second screen content request may be transmitted by the second screen device 302 via a network 530 to a second screen content server 541 and/or a synchronization manager 542. The second screen content request may indicate the zone in which the second screen device 302 is located and/or the second screen content SSC desired by the user. In some cases, the second screen content request may include multiple signals. For example, one signal may indicate the zone in which the second screen device 302 is located and may be transmitted to the synchronization manager 542, while another signal may indicate the second screen content desired and may be sent to the second screen content server 541. In some cases, the zone might not be indicated, and the second screen content server 541, synchronization manager 542, and/or other computing device 200 on the network 530 may determine the zone associated with the second screen device 302 based on information stored in a profile associated with the second screen device 302. For example, the synchronization manager 542 may determine an identity of the second screen device 302 sending the second screen content request from an identifier in the request and determine the zone by referring to a profile set up for the second screen device as explained above with respect to FIG. 6.
[65] In response to the request for second screen content SSC, at step 709, the second screen content server 541 may transmit the appropriate second screen content SSC to the second screen device 302 that sent the request. Accordingly, at step 709, the second screen
device 302 may receive the second screen content SSC it requested. The second screen device 302 may display the second screen content SSC once it is received or may buffer the second screen content SSC until a synchronization signal S1, S2 is received.
[66] Additionally, the second screen device 302 may receive a synchronization signal S1, S2 from the synchronization manager 542 at step 710. In some examples, the synchronization signal S1, S2 may be addressed to a particular second screen device 302 that sent the request for second screen content SSC. In other examples, one or more of the synchronization signals S1, S2 may be a multicast signal so as to be received by a plurality of second screen devices 302. Further, in some examples, the synchronization signals S1, S2 may be sent by the synchronization manager 542 in response to a detection signal D received from an interface 120 and/or the second screen content request. In other words, a detection signal D and/or the second screen content request may trigger the synchronization manager 542 to transmit the synchronization signals S1, S2.
Subsequently, the second screen device 302 may synchronize the second screen content SSC with the primary content, which may be presented on a first screen device 301, according to a received synchronization signal S1, S2 at step 711. For example, referring to FIG. 5, the second screen device 302a may synchronize the second screen content SSC
with the synchronization signal S1 so that relevant portions of the second screen content SSC may be presented depending on the portions of the primary content being presented on the first screen device 301.
[67] Further, step 712 may be performed to determine whether there is a change in the second screen content SSC requested. The application on the second screen device 302 may check whether a user has requested different second screen content SSC. This determination may be made based on user-provided input (e.g., user input selecting different second screen content SSC from a list) or a detection that the second screen device 302 was used to change the primary content presented on the first screen device 301. If it is determined that new second screen content SSC has not been requested (No at step 712), the process may return to step 710 and may wait to receive another synchronization signal S1, S2 in order to maintain synchronization. Meanwhile, if it is determined that new second screen content SSC has been requested (Yes at step 712), the process may return to step 705 to select the new second screen content SSC.
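The device-side portion of this flow, updating the location setting (steps 706-707) and building a second screen content request (step 708), can be sketched as follows. This is a hypothetical illustration; the class, method, and field names are assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of the device-side steps of FIG. 7. All names
# are illustrative assumptions.

class SecondScreenClient:
    def __init__(self, default_zone):
        self.location_setting = default_zone  # default zone assumed on startup

    def update_zone(self, detected_zone):
        """Steps 706-707: if the zone detected via GPS, the network, or
        the local interface 120 differs from the stored setting, update it."""
        changed = detected_zone != self.location_setting
        if changed:
            self.location_setting = detected_zone
        return changed

    def build_request(self, content_id):
        """Step 708: a request indicating the desired second screen
        content SSC and the zone the device is located in."""
        return {"content": content_id, "zone": self.location_setting}

client = SecondScreenClient(default_zone=1)
client.update_zone(2)  # e.g., visiting a friend's home in zone 2
request = client.build_request("show-42-companion")
# request now carries both the desired SSC and the current zone
```

The request could equally be split into multiple signals (one to the synchronization manager 542, one to the second screen content server 541), as the disclosure notes.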
[68] FIG. 8 illustrates a method for providing a synchronized second screen experience. The method may begin with step 801 in which sampling information, representing statistical data, may be obtained. The sampling information may be based on statistical data collected that represents how many people consume certain primary content C1, C2. For example, polls and other ratings (e.g., Nielsen ratings) that provide an approximate number of people watching certain television shows at various times of day (e.g., clock times) may be used to generate the statistical data. Specifically, based on the ratings data, a percentage of detection signals D desired to be received from interfaces 120 for each piece of content C1, C2 at various times of day may be determined. In some embodiments, the sampling information may be generated using program guide information. For example, electronic program guide (EPG) information may be extracted and used to create the sampling information. Table 1 below shows a graphical representation of example sampling information that may be obtained and stored.
Table 1

                       Time of Day
              8:00 am    8:30 am    9:00 am
Channel 25      5%         9%        10%
Channel 26      7%        20%
Channel 27     15%         8%         1%
Channel 28     60%         4%
Channel 29      3%        12%         5%
Channel 30      7%        10%         6%
[69] Referring to Table 1, the 5% in the cell for Channel 25 at 8:00am indicates that 5% of the interfaces 120 expected to tune to Channel 25 at 8:00am are desired to provide detection signals D to report detected triggers. Notably, the percentages for each piece of primary content (e.g., television program) may be different so that a common number of detection signals may be received regardless of how many people are consuming the primary content. As discussed in more detail below, the interfaces 120 may use this sampling information to determine whether they should send detection signals D
in response to detected triggers.
[70] Further, the sampling information may be specific to a particular zone.
In other words, different zones may have different sampling information, and therefore, a table similar to Table 1 may be determined for each zone. Moreover, the sampling information may change as ratings change. For example, if the primary content on channel 29 at 8:00am becomes less popular, the percentage of 3% may be increased because the number of interfaces 120 tuned to that primary content may decrease, and a larger subset of those interfaces may be relied upon to provide the detection signals D for triggers within that primary content.
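One way to hold this per-zone sampling information is a lookup keyed by zone, channel, and time slot. The sketch below is a hypothetical representation (the names, dictionary layout, and the subset of Table 1 cells shown are illustrative); a missing cell is treated here as a reporting percentage of zero, which is an added assumption.

```python
# Hypothetical per-zone sampling table (an excerpt of Table 1).
# Values are the fraction of tuned interfaces 120 asked to report
# detected triggers.
SAMPLING = {
    "zone 2": {
        ("Channel 25", "8:00 am"): 0.05,
        ("Channel 25", "9:00 am"): 0.10,
        ("Channel 28", "8:00 am"): 0.60,
        ("Channel 29", "8:00 am"): 0.03,
    },
}

def reporting_percentage(zone, channel, time_of_day):
    # Missing cells default to 0.0 (no interface reports); this
    # default is an assumption for illustration.
    return SAMPLING.get(zone, {}).get((channel, time_of_day), 0.0)
```

A table like this would be refreshed as ratings change, as described above for the 3% entry on Channel 29.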
[71] It should be understood that the data represented by Table 1 is just one example form that the sampling information may take. The sampling information may take other forms from which similar information may be derived. For example, instead of including percentages in the cells of Table 1, the cells could indicate numbers of expected viewers, and from such numbers, the interfaces 120 can determine how often they should report triggers.
[72] It is recognized that the sampling information might not reflect actual viewership and/or that the popularity of primary content may fluctuate. That is, fewer or more people than expected may be watching a particular channel at a particular time, and therefore, the number of detection signals transmitted may differ from what is desired. Thus, it may become desirable to adjust the sampling information, thereby fine-tuning the number of detection signals transmitted upstream in response to detected triggers. To allow for such an adjustment, a multiplier may be determined. For example, if it is determined that the percentage indicated in the sampling information is too high, and as a result, more than enough detection signals D are being received, then a multiplier may be determined which, when multiplied by the percentage of the sampling information, would decrease the percentage. For example, if the percentage of 60% for channel 28 at 8:00am causes twice the desired number of detection signals D to be transmitted, a multiplier of 0.5 may be determined and sent to the interfaces 120 using the 60% so that those interfaces 120 will use an adjusted percentage of 30% (60% x 0.5) to determine whether to report detection signals D going forward. In contrast, if the percentage of 10% for channel 25 at 9:00am causes a quarter (1/4) of the desired number of detection signals D to be transmitted, a multiplier of 4 may be determined and sent to the interfaces 120 using the 10% so that those interfaces 120 will use an adjusted percentage of 40% (10% x 4) to determine whether to report detection signals D going forward.
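The multiplier arithmetic above can be captured in a few lines. This is a sketch; the clamp at 100% is an added assumption, not stated in the disclosure.

```python
# Sketch of the multiplier adjustment from paragraph [72]: the interface
# scales the sampling percentage by the most recently received multiplier.

def adjusted_percentage(sampling_pct, multiplier):
    # Clamp at 100%: an assumption, since a percentage cannot ask
    # for more than all interfaces to report.
    return min(sampling_pct * multiplier, 1.0)

# Channel 28 at 8:00am, twice the desired signals: 60% x 0.5 -> 30%
# Channel 25 at 9:00am, a quarter of the desired signals: 10% x 4 -> 40%
```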
[73] A multiplier may be determined for each zone. Therefore, if fewer detection signals than expected are received from interfaces 120 in a first zone, while more detection signals than expected are received from interfaces 120 in a second zone, the multiplier for the first zone may be higher than that of the second zone. Multipliers may include a single value and may be determined at shorter intervals than the sampling information. Thus, transmission of the multipliers may use less bandwidth than transmission of updated sampling information.
[74] Step 803 may include transmitting the sampling information and a multiplier from a local office 103, synchronization manager 542, or other computing device on the network 530 to one or more interfaces 120. The transmission at step 803 may be a multicast transmission in which the sampling information and multiplier are pushed to a plurality of interfaces 120. Further, this transmission may take place at a predetermined time. For example, the sampling information and multiplier may be transmitted downstream from the local office 103 to each interface 120 at a set time once a day. While transmission of the sampling information may be limited to conserve bandwidth, updated multipliers may be transmitted more frequently.
[75] At step 804, the sampling information and multiplier may be received by one or more interfaces 120. The interfaces 120 may store the sampling information and multiplier in a local cache.
[76] Step 805 illustrates that the local office 103 may also transmit the primary content C1, C2 having embedded triggers downstream to the interfaces 120. This primary content C1, C2 may originate at content providers. The primary content C1, C2 along with the embedded triggers may be received by the interfaces 120 at step 806. The interface 120 may include a receiver configured to tune to a particular frequency to pick up a particular channel among the primary content C1, C2.
[77] Upon receiving the primary content C1, C2, the interfaces 120 may analyze the primary content C1, C2 to detect triggers embedded within the content at step 807. The triggers may be embedded within primary content C1, C2 at periodic intervals or at random intervals. Once a trigger is detected, the interface 120 may proceed to step 808. At step 808, each interface 120 that detects a trigger may determine whether it should send a detection signal D in response to detecting that particular trigger. In order to reduce upstream bandwidth, it may be desirable that not all interfaces 120 transmit a detection signal D in response to every trigger. For each trigger, only some interfaces 120 within a zone might send a detection signal D. Each interface 120 may make the determination on a trigger-by-trigger basis as to whether it should transmit a detection signal D. If the interface 120 determines not to send the detection signal D (No at step 808), the interface 120 may continue to monitor the primary content C1, C2 to detect another trigger. That is, the process may return to step 807 to detect the next trigger when the interface 120 determines not to transmit a detection signal D in response to the most recently detected trigger. Further details regarding the determination at step 808 will be described below.
[78] If the interface 120 determines that it will transmit a detection signal D (Yes at step 808), the detection signal D may be generated and transmitted at step 809. The transmitted detection signal D may include an identifier to identify the primary content being presented on the first screen device 301, an identifier to identify the trigger that the detection signal D is being transmitted in response to, and/or time information indicating a time at which the trigger was detected (or a time at which the detection signal D is generated). The detection signal D may be transmitted to the synchronization manager 542 via the network 530 (e.g., the Internet). The detection signal D may be an IP packet (e.g., IPv4 or IPv6 packet) addressed to the synchronization manager 542.
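A detection signal D carrying the fields listed above might be serialized as in the following sketch. The JSON encoding and field names are illustrative assumptions; the disclosure specifies only the information carried, not a wire format.

```python
import json
import time

# Hypothetical serialization of a detection signal D per paragraph [78]:
# an identifier of the primary content, an identifier of the detected
# trigger, and the detection time.

def build_detection_signal(content_id, trigger_id, detected_at=None):
    payload = {
        "content": content_id,
        "trigger": trigger_id,
        "time": detected_at if detected_at is not None else time.time(),
    }
    # The serialized payload would form the body of an IP packet
    # addressed to the synchronization manager 542.
    return json.dumps(payload)
```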
[79] The detection signal D may be received by the synchronization manager 542 at step 810.
The synchronization manager 542 may decode the detection signal D to determine the zone that the interface 120, which transmitted the detection signal D, is located within.
Further, the synchronization manager 542 may decode the detection signal D to determine which trigger the detection signal D was sent in response to. For example, the synchronization manager 542 may determine an identifier from within the detection signal D that identifies a trigger. From this identifier, the synchronization manager 542 may determine the zone that the detection signal D came from and/or the content that the trigger was embedded within. As shown in FIG. 8, receipt of the detection signal may trigger a monitoring process of FIG. 10 described in more detail below.
[80] After receiving a detection signal D, the synchronization manager 542 may identify one or more second screen devices 302 to which a synchronization signal S1, S2 should be transmitted at step 811. Specifically, the synchronization manager 542 may determine which second screen devices 302 are in the same zone as the zone from which the detection signal D, received in step 810, is sent. This determination may be based on information stored in a database of the synchronization manager 542. The information may include a listing of all second screen devices 302 registered with a particular service and registered to have a particular default zone. Thus, for example, if the detection signal D received in step 810 was from an interface 120 in zone 2, the synchronization manager 542 may identify all second screen devices 302 that have been set with zone 2 as their default zone. In this example, the synchronization manager 542 may identify second screen devices 302 whether they are turned on or off and/or regardless of what they are currently being used for (e.g., the second screen devices 302 may be identified even though they are not being used to consume second screen content). In other cases, the synchronization manager 542 may identify only those second screen devices 302 that are currently operating within the same zone as the interface 120 that sent the detection signal D. As discussed above, a user may manipulate a second screen device 302 to specify the location (e.g., indicate the zone) in which the second screen device 302 is currently operating. This location information (e.g., the current zone of the second screen device 302) may be sent to the synchronization manager 542.
Thus, when the synchronization manager 542 receives a detection signal D, the synchronization manager 542 may use the information it has received, specifying which second screen devices 302 are in which zones, to identify the second screen devices 302 that are currently operating in the same zone associated with the detection signal D. Further, the synchronization manager 542 may also keep track of what second screen content SSC is being requested from which second screen devices 302. Thus, in some examples, the second screen devices 302 identified by the synchronization manager 542 may be those second screen devices 302 that have requested the second screen content SSC associated with the same primary content C1, C2 that the detection signal D is associated with.
Accordingly, for example, if a detection signal D was transmitted from zone 2, instead of identifying all second screen devices 302 in zone 2, the synchronization manager 542 might only identify those second screen devices 302 in zone 2 that have requested second screen content SSC
associated with the same primary content C1, C2 that the detection signal D is associated with.
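The narrower identification just described (only devices in the detection signal's zone that have requested the matching second screen content) can be sketched as a simple filter. The registry layout and all names are hypothetical.

```python
# Hypothetical sketch of step 811 in its narrower form: select only the
# second screen devices registered in the detection signal's zone that
# have requested the SSC associated with the same primary content.

def identify_devices(registry, zone, content_id):
    return [device for device, info in registry.items()
            if info["zone"] == zone and info["requested"] == content_id]

registry = {
    "tablet-a": {"zone": 2, "requested": "C1"},
    "phone-b":  {"zone": 2, "requested": "C2"},
    "tablet-c": {"zone": 3, "requested": "C1"},
}
# A detection signal from zone 2 associated with C1 targets only tablet-a.
```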
[81] Regardless of which second screen devices 302 are identified or how they are identified, the identified set of second screen devices 302 may be sent a synchronization signal in step 812. The synchronization signal may serve to notify second screen devices 302 of the portion or point in time of the primary content presented on the first screen device 301. Based on this information, the second screen device 302 may determine which portion of the second screen content SSC, which it has previously received, should be presented. Thus, the second screen device 302 may synchronize the second screen content SSC with the primary content C1, C2 presented on the first screen device 301 at step 813. For example, the second screen device 302 may use a segment number included within the synchronization signal S1, S2 to identify a portion of the primary content C1, C2 that is being presented on the first screen device 301 and to determine a corresponding portion of the second screen content SSC to present.
Alternatively, the second screen device 302 may use time information included in the synchronization signal S1, S2 to identify a point in time of the primary content C1, C2 and to determine a corresponding portion of the second screen content SSC to present. Notably, the second screen content SSC may be indexed by time points of the primary content C1, C2 and/or segment numbers of the primary content C1, C2 so that the corresponding portions of the second screen content SSC can be determined from the information in the synchronization signals S1, S2.
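Indexing the second screen content by time points of the primary content, as just described, might look like the following sketch. The index data and names are illustrative; a synchronization signal's time value selects the portion whose start time most recently precedes it.

```python
import bisect

# Hypothetical sketch of step 813 with a time-indexed SSC table.
SSC_INDEX = [            # (primary content time in seconds, SSC portion)
    (0,   "intro trivia"),
    (300, "scene 2 poll"),
    (600, "cast interview clip"),
]

def portion_for(sync_time):
    starts = [t for t, _ in SSC_INDEX]
    # bisect_right finds the first entry starting after sync_time;
    # the entry before it is the portion currently in effect.
    i = bisect.bisect_right(starts, sync_time) - 1
    return SSC_INDEX[max(i, 0)][1]
```

An index keyed by segment numbers rather than time points would work the same way, with segment numbers as the sorted keys.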
[82] In some examples, second screen content SSC may be indexed according to its own time points and/or segment numbers. In such examples, the synchronization manager 542 may translate the detection signals D into information indicating the time points and/or segment numbers of the second screen content SSC. The synchronization manager may then generate and transmit synchronization signals S1, S2 including this information so that the second screen device 302 may use this information to extract the appropriate portion of second screen content SSC to present. In light of this disclosure, it should be understood that the second screen content SSC and primary content C1, C2 may be correlated with one another in various manners, and therefore, the synchronization signals S1, S2 may include various types of information that can be used by second screen devices 302 to synchronize the second screen content SSC with the primary content C1, C2.
[83] FIG. 9A illustrates a process that may be performed by an interface 120 in accordance with an aspect of the disclosure. More specifically, FIG. 9A shows steps that an interface 120 may perform to determine whether it should report a detected trigger. In other words, the steps of FIG. 9A allow an interface 120 to determine whether the interface 120 belongs to a subset of interfaces 120 that should report a detected trigger embedded within content. In some embodiments, the process of FIG. 9A may be performed in place of steps 807-809 of FIG. 8.
The process of FIG. 9A may begin with step 901a in which a trigger may be detected.
Specifically, the interface 120 may monitor the primary content C1, C2 presented by a first screen device 301 connected to the interface 120 to detect one or more triggers embedded within the content C1, C2. The triggers may have a standard format so that the interface 120 may be configured to identify the standard format. For example, the interface 120 may be configured to identify EBIF triggers embedded within the primary content C1, C2.
[85] Once a trigger is detected in step 901a, the interface 120 may determine time information at step 902a. The time information may indicate a time of day (e.g., 8:07 am) or a time period (e.g., 8:00am to 8:30am). The interface 120 may include an internal clock for this purpose. Alternatively, the interface 120 may extract time of day information from another signal or the trigger itself. Further, at step 903a, the interface 120 may determine a channel (e.g., television channel) or other identifier indicating the stream of primary content C1, C2 being presented on the first screen device 301. Based on the time information determined in step 902a and the channel determined in step 903a, the interface 120 may retrieve a value indicating a percentage of interfaces 120 from which the system would like to receive detection signals D. This percentage may be extracted from statistical information previously sent to the interface 120 or retrieved in response to detecting the trigger at step 901a. Where statistics (e.g., television ratings) demonstrate that the channel determined in step 903a at the time determined in step 902a receives low viewership, the percentage retrieved at step 904a may be relatively high so that the synchronization manager 542 can be guaranteed to receive a detection signal D.
In contrast, where statistics (e.g., television ratings) demonstrate that the channel determined in step 903a at the time determined in step 902a receives high viewership, the percentage retrieved at step 904a may be relatively low so that the synchronization manager 542 can be guaranteed to receive a detection signal D without being overwhelmed with a high volume of detection signals D.
[86] Additionally, at step 905a, an interface 120 may retrieve a multiplier. The interface 120 may receive multipliers relatively frequently, and therefore, the multiplier retrieved at step 905a may be the most recently received multiplier. The retrieved multiplier may be specific to the zone in which the interface 120 resides. The multiplier may be used to adjust the statistics on a zone-by-zone basis in realization that users in different zones may have different preferences. Moreover, the multiplier may provide a lightweight (in terms of payload) means for adjusting the statistics as time passes. The multiplier may offer a way to adjust the statistics when it is determined that the statistics are not accurately representing the present viewership. For example, a spike in viewership of a particular television program may result due to uncommon circumstances (e.g., a current event involving an actor may increase viewership of a television program featuring that actor), and the multiplier may allow for a real-time adjustment of the statistics.
[87] Using the percentage and multiplier, the interface may execute an algorithm at step 906a.
The algorithm may be predetermined or provided to the interface 120 from an external source. The results of executing the algorithm may dictate whether the interface 120 sends a detection signal D. Various algorithms may be used in step 906a. The algorithms may take into account a random number, an identifier of the trigger, a MAC
address of the interface 120, epoch time, and/or other factors in addition to the multiplier and percentage described above. FIG. 9B described below provides an example algorithm.
[88] Regardless of the algorithm, step 907a may be performed to determine whether, based on the results of the algorithm, the interface 120 should send a detection signal D. If the interface 120 determines that it should not send a detection signal D, the process may return to step 901a to detect the next trigger within the primary content C1, C2.
However, if the interface 120 determines that it should send a detection signal D, the interface 120 may generate the detection signal D. Generating the detection signal D
may encompass packaging an identifier of the trigger detected in step 901a in an IP
packet having a header addressed to the synchronization manager 542. The interface 120 may then transmit the detection signal D, via the network 530, to the synchronization manager 542 at step 908a.
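Steps 907a–908a might be sketched as follows, assuming a JSON payload carried in a UDP datagram. The patent only requires an IP packet addressed to the synchronization manager 542, so the payload fields and transport here are illustrative assumptions.

```python
import json
import socket

def send_detection_signal(trigger_id, content_id, zone, manager_addr):
    """Generate and transmit a detection signal D (sketch of steps
    907a-908a). The JSON payload and UDP transport are illustrative
    assumptions; the patent only requires an IP packet addressed to the
    synchronization manager 542."""
    payload = json.dumps({
        "trigger_id": trigger_id,  # identifier detected in step 901a
        "content_id": content_id,  # primary content being presented
        "zone": zone,              # zone in which this interface resides
    }).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, manager_addr)  # step 908a: send via network 530
    return payload
```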
[89] FIG. 9B illustrates an example process for determining whether to transmit a detection signal D. Step 904b illustrates that a percentage P may be retrieved from statistical information. Meanwhile, step 905b illustrates that a multiplier M may be retrieved. In an example algorithm of the disclosure, step 906b may include generating a random number X, where X is a value between 0 and 1. For example, step 906b may determine that X
equals 0.065. Any means for generating a random number may be used. Although represented as a decimal number in this disclosure, the random number may be represented by a binary number, hexadecimal number, etc., and the algorithm may be modified accordingly.
[90] After the random number is generated, the algorithm may be executed by the interface 120 at step 906b. In the example of FIG. 9B, executing the algorithm may include computing M*P, where M is the multiplier and P is the percentage. The result of the computation may then be compared against the random number X at step 907b. If the random number X is less than or equal to the result, the interface 120 may determine that it should transmit a detection signal D. In contrast, if the random number X is greater than the result, the interface 120 might not transmit the detection signal D.
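The FIG. 9B comparison can be sketched compactly. This is a minimal illustration, not the patent's implementation: the function name is invented, P is assumed to be expressed as a fraction, and the random source is injectable so the decision can be checked deterministically.

```python
import random

def should_send_detection_signal(percentage, multiplier, rng=random.random):
    """Decide whether this interface 120 reports a detected trigger
    (sketch of steps 906b-907b). `percentage` is P as a fraction,
    `multiplier` is the zone's M, and `rng` returns X in [0, 1)."""
    x = rng()                          # step 906b: generate random number X
    result = multiplier * percentage   # compute M * P
    return x <= result                 # step 907b: transmit only if X <= M * P

# With P = 0.10 and M = 1.0, roughly one interface in ten reports each
# trigger; raising M raises the reporting rate without resending statistics.
```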
[91] FIG. 10 illustrates a process of managing the detection signal traffic. As described above, an aspect of the disclosure includes a process of selecting a subset of interfaces 120 to report detection signals D so that a plurality of second screen devices 302 may present second screen content SSC in synchronization with primary content C1, C2.
Another aspect of the disclosure includes synchronizing second screen content SSC while optimizing network traffic. Specifically, multipliers may be used to optimize the size of the subset so that the number of interfaces 120 transmitting detection signals D is enough to provide a reliable synchronized experience, but not so many that network congestion results. FIG. 10 provides a process that monitors the number of detection signals D
received, and adjusts multipliers in order to manage network traffic, and in particular upstream network traffic (e.g., data that flows from interfaces 120 upstream to the local office 103 and/or other devices on the network 530). The process of FIG. 10 may be performed by the synchronization manager 542 or another computing device (e.g., server) configured to receive detection signals D.
[92] As shown in FIG. 10, the process of FIG. 10 may be initiated in response to receipt of a detection signal in step 810 of FIG. 8. The process of FIG. 10 begins with step 1001 in which a computing device (e.g., the synchronization manager 542) detects that a detection signal D has been received. In some examples, the process in FIG. 10 may be performed every time a detection signal D is received. In other examples, the process of FIG. 10 may only be executed after a certain number of detection signals D is received and/or after a certain point in time has elapsed, so that the synchronization manager 542 may have time to collect a sufficient sample of detection signals D. By changing the certain number or certain point in time for triggering the performance of the process of FIG. 10, the degree of sensitivity for monitoring the network traffic may be modified.
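The count-or-elapsed-time gating just described could be sketched as a small gate object; the class name, defaults, and injectable clock are illustrative assumptions.

```python
import time

class MonitorGate:
    """Run the FIG. 10 monitoring pass only after enough detection
    signals D or enough elapsed time (sketch of the batching in [92])."""

    def __init__(self, min_count=100, min_interval=30.0, clock=time.monotonic):
        self.min_count = min_count        # signals needed before a pass
        self.min_interval = min_interval  # seconds between forced passes
        self.clock = clock
        self.pending = 0
        self.last_run = clock()

    def on_detection_signal(self):
        """Return True when the monitoring process should run now."""
        self.pending += 1
        due = (self.pending >= self.min_count
               or self.clock() - self.last_run >= self.min_interval)
        if due:
            self.pending = 0
            self.last_run = self.clock()
        return due
```

Tightening `min_count` or `min_interval` makes the traffic monitoring more sensitive, as the paragraph above notes.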
[93] After receiving a detection signal D, the synchronization manager 542 may decode the detection signal D to determine an identifier of the detection signal D at step 1002. This identifier may correspond to an identifier of the trigger that prompted the detection signal D to be sent in the first place. From the identifier, the synchronization manager 542 may determine a zone in which the detection signal D originated. That is, the synchronization manager 542 may determine which zone the interface 120 that sent the detection signal D
is located in. The synchronization manager 542 may also determine the identity of the primary content C1, C2 (and/or a portion thereof) containing the trigger that prompted the detection signal D to be transmitted. Additionally, or alternatively, the synchronization manager 542 may decode the detection signal D to identify the MAC address or IP
address associated with the interface 120 that sent the detection signal D, and determine the zone of the interface 120 from this information. Further, the synchronization manager 542 may track MAC addresses and IP addresses of the interfaces 120 sending the detection signals D to verify that the system is working properly. If it is determined, based on collected MAC addresses or IP addresses, that a particular interface 120 is sending too many detection signals D, the synchronization manager 542 may alert an operator (or other administrative entity) that the system or the particular interface 120 might be malfunctioning.
[94] The decoded information may be stored at step 1003. Specifically, the synchronization manager 542 may store, or cause another computing device (e.g., a database) to store, information identifying the primary content C1, C2 having the trigger that caused the detection signal D to be sent, the zone from which the detection signal D was sent (or other location information, e.g., a MAC address), and/or a time that the detection signal D was transmitted (or received). Because many detection signals D may be received at the same time or within a short period of time from one another, the decoded information may be temporarily stored and discarded after a set period of time. In some examples, temporarily storing the decoded information may be sufficient because the decoded information might only be used to determine how many interfaces 120 are responding to the same trigger. Since the detection signals D received in response to the same trigger are expected to be received within a relatively short time period of one another, after a certain time period, the decoded information may be deleted from memory because it may be assumed that all of the detection signals D for that trigger should have been received.
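The temporary storage and discard behaviour might look like the following sketch, where decoded signals are kept in memory and purged once older than a configurable time-to-live; the names and defaults are assumptions, not from the patent.

```python
import time

class DetectionSignalStore:
    """Temporarily hold decoded detection signal information (sketch of
    step 1003 and the discard behaviour in [94]). Entries older than
    `ttl` seconds are purged; names and the 60-second default are
    illustrative assumptions."""

    def __init__(self, ttl=60.0, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock
        self._events = []  # (timestamp, zone, trigger_id)

    def record(self, zone, trigger_id):
        self._events.append((self.clock(), zone, trigger_id))

    def count(self, zone, trigger_id):
        """Recent signals from `zone` for `trigger_id` (used in step 1004)."""
        self._purge()
        return sum(1 for _, z, t in self._events
                   if z == zone and t == trigger_id)

    def _purge(self):
        cutoff = self.clock() - self.ttl
        self._events = [e for e in self._events if e[0] >= cutoff]
```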
[95] In step 1004, the synchronization manager 542 may compare the stored information with desired results. For example, the synchronization manager 542 may evaluate the information stored in step 1003 to determine how many similar detection signals D are being received. Specifically, the synchronization manager 542 may count the number of detection signals D that were received from the same zone in response to the same trigger as the detection signal D received in step 1001. The synchronization manager 542 may then compare this sum to a predetermined value representing the desired results to determine whether the sum is greater than, less than, or equal to the predetermined value.
In step 1004 of FIG. 10, only one sum may be calculated. However, because different detection signals D (e.g., detection signals D from different zones and in response to different triggers in the primary content C1, C2) may be received at step 1001, other sums may be calculated in other iterations of step 1004.
[96] In step 1005, the synchronization manager 542 may determine whether too many responses to triggers (e.g., detection signals) are being received.
Specifically, the synchronization manager 542 may determine whether the number of detection signals D
received from a particular zone in response to a particular trigger exceeds the desired results based on the comparison in step 1004. An upper threshold may be used to determine what number is too many. For example, it may be acceptable to receive a certain number of detection signals D (e.g., 5 signals) over the desired results (e.g., 20 signals), but it may be considered too many if an upper threshold (e.g., 50 total signals) is exceeded. If too many responses are being received (Yes at step 1005), step 1006 may be performed to decrease a multiplier. By decreasing the multiplier, which interfaces 120 use to determine whether to send the detection signals D, the number of detection signals transmitted from the interfaces 120 may be reduced. As a result, the upstream bandwidth of the links 101 and the network 530 may be conserved and network congestion may be reduced. After the multiplier is decreased, the multiplier may be transmitted at step 1007 to each of the interfaces 120 in the zone identified as responding with too many detection signals D. Step 1007 may be performed periodically (e.g., once a day) or soon after the multiplier is decreased. By controlling when the multiplier is transmitted in step 1007, the system may be fine-tuned to balance reliability with network traffic. For example, by controlling step 1007 to transmit the multiplier once a day, the system may avoid over-reacting to uncommon circumstances.
[97] If the synchronization manager 542 determines that there are not an excess number of responses (No at step 1005), step 1008 may be performed to determine whether too few responses are received. Specifically, the synchronization manager 542 may determine whether the number of detection signals D received from a particular zone in response to a particular trigger is less than the desired results based on the comparison in step 1004.
A lower threshold may be used to determine what number of detection signals D
is too few. For example, it may be acceptable to receive a certain number of detection signals D (e.g., 10 signals) below the desired results (e.g., 20 signals), but it may be considered too few if a lower threshold (e.g., 5 total signals) is not met. If too few responses are being received (Yes at step 1008), step 1009 may be performed to increase a multiplier.
By increasing the multiplier, which interfaces 120 use to determine whether to send the detection signals D, the number of detection signals D transmitted from the interfaces 120 may be increased. As a result, the synchronization manager 542 may ensure that it receives enough detection signals D so that it can send a synchronization signal to provide a reliable synchronized experience for users of the second screen devices 302.
After the multiplier is increased, the multiplier may be transmitted at step 1010 to each of the interfaces 120 in the zone identified as responding with too few detection signals D.
Step 1010 may be performed periodically (e.g., once a day) or soon after the multiplier is increased. By controlling when the multiplier is transmitted in step 1010, the system may be fine-tuned to balance reliability with network traffic.
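Steps 1005–1010 amount to a simple feedback loop on each zone's multiplier. The sketch below uses an invented fixed step size, since the patent does not specify how much the multiplier changes on each adjustment.

```python
def adjust_multiplier(multiplier, observed, upper, lower, step=0.1):
    """One adjustment pass for a zone's multiplier (sketch of steps
    1005-1009). `observed` is the count from step 1004; `upper` and
    `lower` are the thresholds described above; the 0.1 step size is an
    illustrative assumption."""
    if observed > upper:                    # Yes at step 1005: too many
        return max(0.0, multiplier - step)  # step 1006: decrease multiplier
    if observed < lower:                    # Yes at step 1008: too few
        return multiplier + step            # step 1009: increase multiplier
    return multiplier                       # acceptable: leave unchanged

# With desired results of 20 signals, an upper threshold of 50, and a
# lower threshold of 5, a zone reporting 60 signals has its multiplier
# decreased, and a zone reporting 3 has it increased.
```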
[98] If it is determined at step 1008 that a sufficient number of detection signals D is being received, the process may return to step 1001 to continue to monitor subsequently received detection signals D. The process may also return to step 1001 after transmitting the adjusted multiplier in either step 1007 or 1010.
[99] Although example embodiments are described above, the various features and steps may be combined, divided, omitted, and/or augmented in any desired manner, depending on the specific process desired. For example, the process of FIG. 8 may be modified so that step 802 is performed before or at the same time that step 801 is performed.
Additionally, although second screen experiences are contemplated as being implemented on two separate devices (a first screen device 301 and a second screen device 302), aspects of the disclosure may be enjoyed/implemented on one device having two viewing areas. For example, a second screen experience may be implemented on a single device (e.g., a television) using picture-in-picture to present both primary content and supplemental content simultaneously. This patent should not be limited to the example embodiments described, but rather should have its scope determined by the claims that follow.
[73] A multiplier may be determined for each zone. Therefore, if fewer detection signals than expected are received from interfaces 120 in a first zone, while more detection signals than expected are received from interfaces 120 in a second zone, the multiplier for the first zone may be higher than that of the second zone. Multipliers may include a single value and may be determined at shorter intervals than the sampling information. Thus, transmission of the multipliers may use less bandwidth than transmission of updated sampling information.
[74] Step 803 may include transmitting the sampling information and a multiplier from a local office 103, synchronization manager 542, or other computing device on the network 530 to one or more interfaces 120. The transmission at step 803 may be a multicast transmission in which the sampling information and multiplier are pushed to a plurality of interfaces 120. Further, this transmission may take place at a predetermined time. For example, the sampling information and multiplier may be transmitted downstream from the local office 103 to each interface 120 at a set time once a day. While transmission of the sampling information may be limited to conserve bandwidth, updated multipliers may be transmitted more frequently.
[75] At step 804, the sampling information and multiplier may be received by one or more interfaces 120. The interfaces 120 may store the sampling information and multiplier in a local cache.
[76] Step 805 illustrates that the local office 103 may also transmit the primary content C1, C2 having embedded triggers downstream to the interfaces 120. This primary content C1, C2 may originate at content providers. The primary content C1, C2 along with the embedded triggers may be received by the interfaces 120 at step 806. The interface 120 may include a receiver configured to tune to a particular frequency to pick up a particular channel among the primary content C1, C2.
[77] Upon receiving the primary content C1, C2, the interfaces 120 may analyze the primary content C1, C2 to detect triggers embedded within the content at step 807. The triggers may be embedded within primary content C1, C2 at periodic intervals or at random intervals. Once a trigger is detected, the interface 120 may proceed to step 808. At step 808, each interface 120 that detects a trigger may determine whether it should send a detection signal D in response to detecting that particular trigger. In order to reduce upstream bandwidth, it may be desirable that not all interfaces 120 transmit a detection signal D in response to every trigger. For each trigger, only some interfaces 120 within a zone might send a detection signal D. Each interface 120 may make the determination on a trigger-by-trigger basis as to whether it should transmit a detection signal D. If the interface 120 determines not to send the detection signal D (No at step 808), the interface 120 may continue to monitor the primary content C1, C2 to detect another trigger. That is, the process may return to step 807 to detect the next trigger when the interface 120 determines not to transmit a detection signal D in response to the most recently detected trigger. Further details regarding the determination at step 808 will be described below.
[78] If the interface 120 determines that it will transmit a detection signal D (Yes at step 808), the detection signal D may be generated and transmitted at step 809. The transmitted detection signal D may include an identifier to identify the primary content being presented on the first screen device 301, an identifier to identify the trigger that the detection signal D is being transmitted in response to, and/or time information indicating a time at which the trigger was detected (or a time at which the detection signal D is generated). The detection signal D may be transmitted to the synchronization manager 542 via the network 530 (e.g., the Internet). The detection signal D may be an IP packet (e.g., IPv4 or IPv6 packet) addressed to the synchronization manager 542.
[79] The detection signal D may be received by the synchronization manager 542 at step 810.
The synchronization manager 542 may decode the detection signal D to determine the zone that the interface 120, which transmitted the detection signal D, is located within.
Further, the synchronization manager 542 may decode the detection signal D to determine which trigger the detection signal D was sent in response to. For example, the synchronization manager 542 may determine an identifier from within the detection signal D that identifies a trigger. From this identifier, the synchronization manager 542 may determine the zone that the detection signal D came from and/or the content that the trigger was embedded within. As shown in FIG. 8, receipt of the detection signal may trigger a monitoring process of FIG. 10 described in more detail below.
[80] After receiving a detection signal D, the synchronization manager 542 may identify one or more second screen devices 302 to which a synchronization signal Si, S2 should be transmitted at step 811. Specifically, the synchronization manager 542 may determine which second screen devices 302 are in the same zone as the zone from which the detection signal D, received in step 810, is sent. This determination may be based on information stored in a database of the synchronization manager 542. The information may include a listing of all second screen devices 302 registered with a particular service and registered to have a particular default zone. Thus, for example, if the detection signal D received in step 810 was from an interface 120 in zone 2, the synchronization manager 542 may identify all second screen devices 302 that have been set with zone 2 as their default zone. In this example, the synchronization manager 542 may identify second screen devices 302 whether or not they are turned on or off and/or regardless of what they are currently being used for (e.g., the second screen devices 302 may be identified even though they are not being used to consume second screen content). In other cases, the synchronization manager 542 may identify only those second screen devices 302 that are currently operating within the same zone as the interface 120 that sent the detection signal D. As discussed above, a user may manipulate a second screen device 302 to specify the location (e.g., indicate the zone) in which the second screen device 302 is currently operating. This location information (e.g., the current zone of the second screen device 302) may be sent to the synchronization manager 542. 
Thus, when the synchronization manager 542 receives a detection signal D, the synchronization manager 542 may use the information it has received, specifying which second screen devices 302 are in which zones, to identify the second screen devices 302 that are currently operating in the same zone associated with the detection signal D. Further, the synchronization manager 542 may also keep track of what second screen content SSC is being requested from which second screen devices 302. Thus, in some examples, the second screen devices 302 identified by the synchronization manager 542 may be those second screen devices 302 that have requested the second screen content SSC associated with the same primary content C1, C2 that the detection signal D is associated with. Accordingly, for example, if a detection signal D was transmitted from zone 2, instead of identifying all second screen devices 302 in zone 2, the synchronization manager 542 might only identify those second screen devices 302 in zone 2 that have requested second screen content SSC associated with the same primary content C1, C2 that the detection signal D is associated with.
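Step 811's device selection, in the narrower variant just described, might be sketched as a filter over a device registry; the registry layout and field names are illustrative assumptions, not from the patent.

```python
def identify_target_devices(registry, zone, content_id):
    """Select the second screen devices 302 that should receive a
    synchronization signal (sketch of step 811). `registry` is an
    illustrative list of per-device records; the field names are
    assumptions."""
    return [
        device["device_id"] for device in registry
        if device["zone"] == zone
        and device.get("requested_content") == content_id
    ]

# Example registry: only the device in zone 2 that requested SSC for
# content "c1" is selected.
registry = [
    {"device_id": "tablet-a", "zone": 2, "requested_content": "c1"},
    {"device_id": "phone-b", "zone": 2, "requested_content": "c2"},
    {"device_id": "tablet-c", "zone": 1, "requested_content": "c1"},
]
```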
[81] Regardless of which second screen devices 302 are identified or how they are identified, the identified set of second screen devices 302 may be sent a synchronization signal in step 812. The synchronization signal may serve to notify second screen devices 302 of the portion or point in time of the primary content presented on the first screen device 301. Based on this information, the second screen device 302 may determine which portion of the second screen content SSC, which it has previously received, should be presented. Thus, the second screen device 302 may synchronize the second screen content SSC with the primary content C1, C2 presented on the first screen device 301 at step 813. For example, the second screen device 302 may use a segment number included within the synchronization signal S1, S2 to identify a portion of the primary content C1, C2 that is being presented on the first screen device 301 and to determine a corresponding portion of the second screen content SSC to present. Alternatively, the second screen device 302 may use time information included in the synchronization signal S1, S2 to identify a point in time of the primary content C1, C2 and to determine a corresponding portion of the second screen content SSC to present. Notably, the second screen content SSC may be indexed by time points of the primary content C1, C2 and/or segment numbers of the primary content C1, C2 so that the corresponding portions of the second screen content SSC can be determined from the information in the synchronization signals S1, S2.
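Step 813's lookup, for time-indexed second screen content, might be sketched with a sorted timeline and a binary search; the timeline layout is an illustrative assumption, since the patent only requires SSC to be indexed by time points or segment numbers of the primary content.

```python
import bisect

def select_ssc_portion(ssc_timeline, time_point):
    """Choose the portion of second screen content SSC matching a point
    in time of the primary content (sketch of step 813). `ssc_timeline`
    is an illustrative list of (start_seconds, portion) pairs sorted by
    start time."""
    starts = [start for start, _ in ssc_timeline]
    # Latest portion whose start does not exceed the signalled time point.
    i = bisect.bisect_right(starts, time_point) - 1
    return ssc_timeline[i][1] if i >= 0 else None

timeline = [(0, "opening poll"), (300, "scene-two trivia"), (900, "cast bios")]
```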
[82] In some examples, second screen content SSC may be indexed according to its own time points and/or segment numbers. In such examples, the synchronization manager 542 may translate the detection signals D into information indicating the time points and/or segment numbers of the second screen content SSC. The synchronization manager may then generate and transmit synchronization signals S1, S2 including this information so that the second screen device 302 may use this information to extract the appropriate portion of second screen content SSC to present. In light of this disclosure, it should be understood that the second screen content SSC and primary content C1, C2 may be correlated with one another in various manners, and therefore, the synchronization signals S1, S2 may include various types of information that can be used by second screen devices 302 to synchronize the second screen content SSC with the primary content C1, C2.
[83] FIG. 9A illustrates a process that may be performed by an interface 120 in accordance with an aspect of the disclosure. More specifically, FIG. 9A shows steps that an interface 120 may perform to determine whether it should report a detected trigger. In other words, the steps of FIG. 9A allow an interface 120 to determine whether the interface 120 belongs to a subset of interfaces 120 that should report a detected trigger embedded within content. In some embodiments, the process of FIG. 9A may be performed in place of steps 807-809 of FIG. 8.
[85] The process of FIG. 9A may begin with step 901a in which a trigger may be detected.
Specifically, the interface 120 may monitor the primary content C1, C2 presented by a first screen device 301 connected to the interface 120 to detect one or more triggers embedded within the content C1, C2. The triggers may have a standard format so that the interface 120 may be configured to identify the standard format. For example, the interface 120 may be configured to identify EBIF triggers embedded within the primary content C1, C2.
[85] Once a trigger is detected in step 901a, the interface 120 may determine time information at step 902a. The time information may indicate a time of day (e.g., 8:07 am) or a time period (e.g., 8:00am to 8:30am). The interface 120 may include an internal clock for this purpose. Alternatively, the interface 120 may extract time of day information from another signal or the trigger itself. Further, at step 903a, the interface 120 may determine a channel (e.g., television channel) or other identifier indicating the stream of primary content C1, C2 being presented on the first screen device 301. Based on the time information determined in step 902a and the channel determined in step 903a, the interface 120 may retrieve a value indicating a percentage of interfaces 120 that the system would like to report detection signals D. This percentage may be extracted from statistical information previously sent to the interface 120 or retrieved in response to detecting the trigger at step 901a. Where statistics (e.g., television ratings) demonstrate that the channel determined in step 903a at the time determined in step 902a receives low viewership, the percentage retrieved at step 904a may be relatively high so that the synchronization manager 542 can be guaranteed to receive a detection signal D.
In contrast, where statistics (e.g., television ratings) demonstrate that the channel determined in step 903a at the time determined in step 902a receives high viewership, the percentage retrieved at step 904a may be relatively low so that the synchronization manager 542 can be guaranteed to receive a detection signal D without being overwhelmed with a high volume of detection signals D.
[86] Additionally, at step 905a, an interface 120 may retrieve a multiplier. The interface 120 may receive multipliers relatively frequently, and therefore, the multiplier retrieved at step 905a may be the most recently received multiplier. The retrieved multiplier may be specific to the zone in which the interface 120 resides. The multiplier may be used to adjust the statistics on a zone-by-zone basis in realization that users in different zones may have different preferences. Moreover, the multiplier may provide a lightweight (in terms of payload) means for adjusting the statistics as time passes. The multiplier may offer a way to adjust the statistics when it is determined that the statistics are not accurately representing the present viewership. For example, a spike in viewership of a particular television program may result due to uncommon circumstances (e.g., a current event involving an actor may increase viewership of a television program featuring that actor), and the multiplier may allow for a real-time adjustment of the statistics.
[87] Using the percentage and multiplier, the interface may execute an algorithm at step 906a.
The algorithm may be predetermined or provided to the interface 120 from an external source. The results of executing the algorithm may dictate whether the interface 120 sends a detection signal D. Various algorithms may be used in step 906a. The . , algorithms may take into account a random number, an identifier of the trigger, a MAC
address of the interface 120, epoch time, and/or other factors in addition to the multiplier and percentage described above. Fig. 9B described below provides an example algorithm.
[88] Regardless of the algorithm, step 907a may be performed to determine whether based on the results of the algorithm, the interface 120 should send a detection signal D. If the interface 120 determines that it should not send a detection signal D, the process may return to step 901a to detect the next trigger within the primary content Cl, C2.
However, if the interface 120 determines that it should send a detection signal D, the interface 120 may generate the detection signal D. Generating the detection signal D
may encompass packaging an identifier of the trigger detected in step 901a in an IP
packet having a header addressed to the synchronization manager 542. The interface 120 may then transmit the detection signal D, via the network 530, to the synchronization manager 542 at step 908a.
[89] FIG. 98 illustrates an example process for determining whether to transmit a detection signal D. Step 904b illustrates that a percentage P may be retrieved from statistical information. Meanwhile, step 905b illustrates that a multiplier M may be retrieved. In an example algorithm of the disclosure, step 906b may include generating a random number X, where X is a value between 0 and 1. For example, step 906b may determine that X
equals 0.065. Any means for generating a random number may be used. Although represented as a decimal number in this disclosure, the random number may be represented by a binary number, hexadecimal number, etc., and the algorithm may be modified accordingly.
[90] After the random number is generated, the algorithm may be executed by the interface 120 at step 906b. In the example of FIG. 9B executing the algorithm may include computing M*P, where M is the multiplier and P is the percentage. The result of the . .
computation may then be compared against the random number X at step 907b. If the random number X is less than or equal to the result, the interface 120 may determine that is should transmit a detection signal D. In contrast, if the random number X
is greater than the result, the interface 120 might not transmit the detection signal D.
[91] FIG. 10 illustrates a process of managing the detection signal traffic. As described above, an aspect of the disclosure includes a process of selecting a subset of interfaces 120 to report detection signals D so that a plurality of second screen devices 302 may present second screen content SSC in synchronization with primary content Cl, C2.
Another aspect of the disclosure includes synchronizing second screen content SSC while optimizing network traffic. Specifically, multipliers may be used to optimize the size of the subset so that the number of interfaces 120 transmitting detection signals D is enough to provide a reliable synchronized experience, but not so many that network congestion results. FIG. 10 provides a process that monitors the number of detection signals D
received, and adjusts multipliers in order to manage network traffic, and in particular upstream network traffic (e.g., data that flows from interfaces 120 upstream to the local office 103 and/or other devices on the network 530). The process of FIG. 10 may be performed by the synchronization manager 542 or another computing device (e.g., server) configured to receive detection signals D.
[92] As shown in FIG. 10, the process of FIG. 10 may be initiated in response to receipt of a detection signal in step 810 of FIG. 8. The process of FIG. 10 begins with step 1001 in which a computing device (e.g., the synchronization manager 542 detects that a detection signal D has been received. In some examples, the process in FIG. 10 may be performed every time a detection signal D is received. In other examples, the process of FIG. 10 may only be executed after a certain number of detection signals D is received and/or after a certain point in time has elapsed, so that the synchronization manager 542 may have time to collect a sufficient sample of detection signals D. By changing the certain number or certain point in time for triggering the performance of the process of FIG. 10, the degree of sensitivity for monitoring the network traffic may be modified.
[93] After receiving a detection signal D, the synchronization manager 542 may decode the detection signal D to determine an identifier of the detection signal D at step 1002. This identifier may correspond to an identifier of the trigger that prompted the detection signal D to be sent in the first place. From the identifier, the synchronization manager 542 may determine a zone in which the detection signal D originated. That is, the synchronization manager 542 may determine which zone the interface 120 that sent the detection signal D
is located in. The synchronization manager 542 may also determine the identity of the primary content C1, C2 (and/or a portion thereof) containing the trigger that prompted the detection signal D to be transmitted. Additionally, or alternatively, the synchronization manager 542 may decode the detection signal D to identify the MAC address or IP
address associated with the interface 120 that sent the detection signal D, and determine the zone of the interface 120 from this information. Further, the synchronization manager 542 may track MAC addresses and IP addresses of the interfaces 120 sending the detection signals D to verify that the system is working properly. If it is determined, based on collected MAC addresses or IP addresses, that a particular interface 120 is sending too many detection signals D, the synchronization manager 542 may alert an operator (or other administrative entity) that the system or the particular interface 120 might be malfunctioning.
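The disclosure does not fix a wire format for the detection signals, so the following Python sketch simply assumes the decoded signal already carries a trigger identifier, a zone, and the sender's MAC address (all field names are illustrative). It shows the decoding of paragraph [93] together with the per-interface tracking used to flag a possibly malfunctioning interface 120:

```python
from collections import Counter

def decode_detection_signal(signal: dict) -> dict:
    """Extract the fields of interest from a received detection signal D.
    The wire format is an assumption; the patent leaves it open."""
    return {
        "trigger_id": signal["trigger_id"],  # identifies the trigger in C1, C2
        "zone": signal["zone"],              # zone of the sending interface 120
        "mac": signal["mac"],                # used to spot misbehaving senders
    }

class MalfunctionMonitor:
    """Count detection signals per interface and flag any interface that
    sends more than an assumed per-window limit."""

    def __init__(self, max_signals_per_window=10):
        self.max_signals = max_signals_per_window
        self.per_mac = Counter()

    def record(self, mac: str) -> bool:
        """Return True if this interface should be reported to an operator."""
        self.per_mac[mac] += 1
        return self.per_mac[mac] > self.max_signals
```

In a deployment, the counter would be reset per monitoring window so that a long-lived, well-behaved interface is not eventually flagged.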
[94] The decoded information may be stored at step 1003. Specifically, the synchronization manager 542 may store, or cause another computing device (e.g., a database server) to store, information identifying the primary content C1, C2 having the trigger that caused the detection signal D to be sent, the zone from which the detection signal D was sent (or other location information, e.g., a MAC address), and/or a time that the detection signal D was transmitted (or received). Because many detection signals D may be received at the same time or within a short period of time from one another, the decoded information
may be temporarily stored and discarded after a set period of time. In some examples, temporarily storing the decoded information may be sufficient because the decoded information might only be used to determine how many interfaces 120 are responding to the same trigger. Since the detection signals D received in response to the same trigger are expected to be received within a relatively short time period of one another, after a certain time period, the decoded information may be deleted from memory because it may be assumed that all of the detection signals D for that trigger should have been received.
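The temporary storage of paragraph [94] amounts to a time-to-live store: records are kept only long enough for all responses to a trigger to arrive. A minimal sketch, assuming a fixed TTL (the patent only says "a set period of time") and an injectable clock for testability:

```python
import time

class TtlStore:
    """Hold decoded detection-signal records and discard them once the
    set period of time has passed. The TTL value is an assumption."""

    def __init__(self, ttl_s=30.0, clock=time.monotonic):
        self.ttl_s = ttl_s
        self.clock = clock
        self.records = []  # list of (timestamp, record) pairs

    def add(self, record):
        """Store a decoded record with the current timestamp."""
        self.records.append((self.clock(), record))

    def expire(self):
        """Drop records older than the TTL; by then all detection signals
        for the corresponding trigger should have been received."""
        cutoff = self.clock() - self.ttl_s
        self.records = [(t, r) for t, r in self.records if t >= cutoff]

    def __len__(self):
        return len(self.records)
```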
[95] In step 1004, the synchronization manager 542 may compare the stored information with desired results. For example, the synchronization manager 542 may evaluate the information stored in step 1003 to determine how many similar detection signals D are being received. Specifically, the synchronization manager 542 may count the number of detection signals D that were received from the same zone in response to the same trigger as the detection signal D received in step 1001. The synchronization manager 542 may then compare this sum to a predetermined value representing the desired results to determine whether the sum is greater than, less than, or equal to the predetermined value.
In step 1004 of FIG. 10, only one sum may be calculated. However, because different detection signals D (e.g., detection signals D from different zones and in response to different triggers in the primary content C1, C2) may be received at step 1001, other sums may be calculated in other iterations of step 1004.
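The counting in step 1004 reduces to tallying stored records that share a zone and a trigger with the newly received detection signal D. A sketch (record field names are the same illustrative assumptions used above):

```python
def count_similar(records, zone, trigger_id):
    """Count stored detection signals that came from the given zone in
    response to the given trigger (step 1004 of FIG. 10). This sum is
    then compared against the predetermined desired value."""
    return sum(1 for r in records
               if r["zone"] == zone and r["trigger_id"] == trigger_id)
```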
[96] In step 1005, the synchronization manager 542 may determine whether too many responses to triggers (e.g., detection signals) are being received.
Specifically, the synchronization manager 542 may determine whether the number of detection signals D
received from a particular zone in response to a particular trigger exceeds the desired results based on the comparison in step 1004. An upper threshold may be used to determine what number is too many. For example, it may be acceptable to receive a certain number of detection signals D (e.g., 5 signals) over the desired results (e.g., 20 signals), but it may be considered too many if an upper threshold (e.g., 50 total signals) is exceeded. If too many responses are being received (Yes at step 1005), step 1006 may be performed to decrease a multiplier. By decreasing the multiplier, which interfaces 120 use to determine whether to send the detection signals D, the number of detection signals transmitted from the interfaces 120 may be reduced. As a result, the upstream bandwidth of the links 101 and the network 530 may be conserved and network congestion may be reduced. After the multiplier is decreased, the multiplier may be transmitted at step 1007 to each of the interfaces 120 in the zone identified as responding with too many detection signals D. Step 1007 may be performed periodically (e.g., once a day) or soon after the multiplier is decreased. By controlling when the multiplier is transmitted in step 1007, the system may be fine-tuned to balance reliability with network traffic. For example, by controlling step 1007 to transmit the multiplier once a day, the system may avoid over-reacting to uncommon circumstances.
[97] If the synchronization manager 542 determines that there are not an excess number of responses (No at step 1005), step 1008 may be performed to determine whether too few responses are received. Specifically, the synchronization manager 542 may determine whether the number of detection signals D received from a particular zone in response to a particular trigger is less than the desired results based on the comparison in step 1004.
A lower threshold may be used to determine what number of detection signals D
is too few. For example, it may be acceptable to receive a certain number of detection signals D (e.g., 10 signals) below the desired results (e.g., 20 signals), but it may be considered too few if a lower threshold (e.g., 5 total signals) is not met. If too few responses are being received (Yes at step 1008), step 1009 may be performed to increase a multiplier.
By increasing the multiplier, which interfaces 120 use to determine whether to send the detection signals D, the number of detection signals D transmitted from the interfaces 120 may be increased. As a result, the synchronization manager 542 may ensure that it receives enough detection signals D so that it can send a synchronization signal to provide a reliable synchronized experience for users of the second screen devices 302.
After the multiplier is increased, the multiplier may be transmitted at step 1010 to each of the interfaces 120 in the zone identified as responding with too few detection signals D.
Step 1010 may be performed periodically (e.g., once a day) or soon after the multiplier is increased. By controlling when the multiplier is transmitted in step 1010, the system may be fine-tuned to balance reliability with network traffic.
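The threshold logic of steps 1005 through 1010 can be summarized in a few lines. This is a minimal sketch: the default numbers mirror the examples in paragraphs [96] and [97] (desired 20, upper threshold 50, lower threshold 5), but the adjustment step size is an assumption, since the patent does not specify how far the multiplier moves:

```python
def adjust_multiplier(count, multiplier,
                      desired=20, upper=50, lower=5, step=0.1):
    """Given the count of detection signals for one zone/trigger pair,
    return the (possibly adjusted) multiplier.

    count > upper  -> too many responses (Yes at step 1005): decrease.
    count < lower  -> too few responses (Yes at step 1008): increase.
    Otherwise the count is close enough to `desired`: leave unchanged.
    """
    if count > upper:
        return max(0.0, multiplier - step)  # conserve upstream bandwidth
    if count < lower:
        return multiplier + step            # improve synchronization reliability
    return multiplier
```

The adjusted value would then be transmitted (steps 1007/1010) to the interfaces 120 in the affected zone, which use it to decide whether to send future detection signals.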
[98] If it is determined at step 1008 that a sufficient number of detection signals D are received, the process may return to step 1001 to continue to monitor subsequently received detection signals D. The process may also return to step 1001 after transmitting the adjusted multiplier in either step 1007 or step 1010.
[99] Although example embodiments are described above, the various features and steps may be combined, divided, omitted, and/or augmented in any desired manner, depending on the specific process desired. For example, the process of FIG. 8 may be modified so that step 802 is performed before or at the same time that step 801 is performed.
Additionally, although second screen experiences are contemplated as being implemented on two separate devices (a first screen device 301 and a second screen device 302), aspects of the disclosure may be enjoyed/implemented on one device having two viewing areas. For example, a second screen experience may be implemented on a single device (e.g., a television) using picture-in-picture to present both primary content and supplemental content simultaneously. This patent should not be limited to the example embodiments described, but rather should have its scope determined by the claims that follow.
Claims (20)
1. A method, comprising:
transmitting sampling information to a plurality of interfaces;
receiving one or more trigger detection signals indicating receipt of particular triggers at a first subset of the interfaces, the first subset of the interfaces determined based on the sampling information; and
transmitting synchronization signals to a plurality of second user devices in response to the one or more trigger detection signals.
2. The method of claim 1, further comprising:
providing content to the plurality of interfaces, wherein the content includes triggers that cause the first subset of the interfaces to transmit the one or more trigger detection signals.
3. The method of claim 1, wherein the sampling information is derived from statistical data representing consumption of content received by the plurality of interfaces.
4. The method of claim 1, wherein the sampling information indicates an ideal size of the first subset of the interfaces for a particular channel at a particular time.
5. The method of claim 1, further comprising transmitting a multiplier to adjust the sampling information.
6. The method of claim 5, wherein the multiplier is determined based on a number of the received trigger detection signals.
7. The method of claim 1, further comprising:
decoding at least one of the trigger detection signals to determine a zone associated with the at least one trigger detection signal; and
selecting the plurality of second user devices by identifying them as being associated with the determined zone.
8. The method of claim 1, wherein the synchronization signals indicate a particular portion of supplemental content to be presented by the plurality of second user devices.
9. A method, comprising:
receiving a supplemental content request from a second user device;
determining a group associated with the second user device;
identifying supplemental content corresponding to the supplemental content request;
transmitting the supplemental content to the second user device;
generating a synchronization signal specific to the group that is associated with the second user device; and
transmitting the synchronization signal to the second user device.
10. The method of claim 9, further comprising:
receiving a new supplemental content request; and
determining whether the group associated with the second user device has changed based on the new supplemental content request.
11. The method of claim 10, further comprising:
if the group has changed, generating a new synchronization signal specific to a new group of the second user device; and
transmitting the new synchronization signal to the second user device.
12. The method of claim 9, wherein the synchronization signal is a multicast signal sent to additional second user devices associated with the group of the second user device.
13. The method of claim 9, wherein the group associated with the second user device is associated with a group of interfaces in a zone that receives common primary content.
14. The method of claim 9, wherein the determining of the group associated with the second user device is based on information within the supplemental content request.
15. The method of claim 9, wherein the determining of the group associated with the second user device is based on information stored in a profile associated with the second user device from which the supplemental content request is received.
16. A method, comprising:
receiving, by a device, a detection signal;
identifying a trigger in the detection signal that caused the detection signal to be transmitted to the device;
comparing a count of detection signals received in association with the identified trigger with a desired number; and
determining whether to adjust a multiplier based on the comparison.
17. The method of claim 16, wherein the comparing comprises:
counting the signals received in response to the identified trigger to determine the count; and
calculating a difference between the count and the desired number.
18. The method of claim 17, further comprising:
determining whether the calculated difference exceeds an upper threshold; and
decreasing the multiplier if the calculated difference exceeds the upper threshold.
19. The method of claim 17, further comprising:
determining whether the calculated difference is below a lower threshold; and
increasing the multiplier if the calculated difference is below the lower threshold.
20. The method of claim 16, further comprising, when the multiplier is adjusted, transmitting the adjusted multiplier to user devices associated with a zone that is associated with the detection signal, wherein the identified trigger is embedded within the content, and wherein the multiplier impacts whether additional signals are subsequently sent in response to other triggers embedded within the content.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/798,843 US9553927B2 (en) | 2013-03-13 | 2013-03-13 | Synchronizing multiple transmissions of content |
US13/798,843 | 2013-03-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2845465A1 true CA2845465A1 (en) | 2014-09-13 |
CA2845465C CA2845465C (en) | 2019-11-26 |
Family
ID=50342163
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2845465A Active CA2845465C (en) | 2013-03-13 | 2014-03-11 | Synchronizing multiple transmissions of content |
Country Status (3)
Country | Link |
---|---|
US (1) | US9553927B2 (en) |
EP (1) | EP2779664A3 (en) |
CA (1) | CA2845465C (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9451196B2 (en) | 2002-03-15 | 2016-09-20 | Comcast Cable Communications, Llc | System and method for construction, delivery and display of iTV content |
US9516253B2 (en) | 2002-09-19 | 2016-12-06 | Tvworks, Llc | Prioritized placement of content elements for iTV applications |
US11115722B2 (en) | 2012-11-08 | 2021-09-07 | Comcast Cable Communications, Llc | Crowdsourcing supplemental content |
US11832024B2 (en) | 2008-11-20 | 2023-11-28 | Comcast Cable Communications, Llc | Method and apparatus for delivering video and video-related content at sub-asset level |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8365230B2 (en) | 2001-09-19 | 2013-01-29 | Tvworks, Llc | Interactive user interface for television applications |
US8413205B2 (en) | 2001-09-19 | 2013-04-02 | Tvworks, Llc | System and method for construction, delivery and display of iTV content |
US11388451B2 (en) | 2001-11-27 | 2022-07-12 | Comcast Cable Communications Management, Llc | Method and system for enabling data-rich interactive television using broadcast database |
US7703116B1 (en) | 2003-07-11 | 2010-04-20 | Tvworks, Llc | System and method for construction, delivery and display of iTV applications that blend programming information of on-demand and broadcast service offerings |
US11381875B2 (en) | 2003-03-14 | 2022-07-05 | Comcast Cable Communications Management, Llc | Causing display of user-selectable content types |
US8578411B1 (en) | 2003-03-14 | 2013-11-05 | Tvworks, Llc | System and method for controlling iTV application behaviors through the use of application profile filters |
US10664138B2 (en) | 2003-03-14 | 2020-05-26 | Comcast Cable Communications, Llc | Providing supplemental content for a second screen experience |
US8819734B2 (en) | 2003-09-16 | 2014-08-26 | Tvworks, Llc | Contextual navigational control for digital television |
US7818667B2 (en) | 2005-05-03 | 2010-10-19 | Tv Works Llc | Verification of semantic constraints in multimedia data and in its announcement, signaling and interchange |
US9973582B2 (en) | 2009-10-19 | 2018-05-15 | Tritan Software International | Method and apparatus for bi-directional communication and data replication between multiple locations during intermittent connectivity |
US9774702B2 (en) * | 2009-10-19 | 2017-09-26 | Tritan Software Corporation | System and method of employing a client side device to access local and remote data during communication disruptions |
US20140093219A1 (en) * | 2012-09-28 | 2014-04-03 | NoiseToys Inc. | Multiple Data Source Aggregation for Efficient Synchronous Multi-Device Media Consumption |
TWI505698B (en) * | 2012-12-06 | 2015-10-21 | Inst Information Industry | Synchronous displaying system for displaying multi-view frame and method for synchronously displaying muti-view frame |
US9167278B2 (en) * | 2012-12-28 | 2015-10-20 | Turner Broadcasting System, Inc. | Method and system for automatic content recognition (ACR) based broadcast synchronization |
US9179185B2 (en) * | 2013-03-14 | 2015-11-03 | The Nielsen Company (Us), Llc | Methods and apparatus to determine a number of people in an area |
US10880609B2 (en) | 2013-03-14 | 2020-12-29 | Comcast Cable Communications, Llc | Content event messaging |
US10291942B2 (en) * | 2013-03-14 | 2019-05-14 | NBCUniversal Medial, LLC | Interactive broadcast system and method |
US9681189B2 (en) * | 2013-06-20 | 2017-06-13 | Microsoft Technology Licensing, Llc | Paired devices |
WO2016048344A1 (en) * | 2014-09-26 | 2016-03-31 | Hewlett Packard Enterprise Development Lp | Caching nodes |
US11783382B2 (en) * | 2014-10-22 | 2023-10-10 | Comcast Cable Communications, Llc | Systems and methods for curating content metadata |
US10860622B1 (en) | 2015-04-06 | 2020-12-08 | EMC IP Holding Company LLC | Scalable recursive computation for pattern identification across distributed data processing nodes |
US10776404B2 (en) | 2015-04-06 | 2020-09-15 | EMC IP Holding Company LLC | Scalable distributed computations utilizing multiple distinct computational frameworks |
US10277668B1 (en) | 2015-04-06 | 2019-04-30 | EMC IP Holding Company LLC | Beacon-based distributed data processing platform |
US10812341B1 (en) * | 2015-04-06 | 2020-10-20 | EMC IP Holding Company LLC | Scalable recursive computation across distributed data processing nodes |
US10791063B1 (en) | 2015-04-06 | 2020-09-29 | EMC IP Holding Company LLC | Scalable edge computing using devices with limited resources |
US10425350B1 (en) | 2015-04-06 | 2019-09-24 | EMC IP Holding Company LLC | Distributed catalog service for data processing platform |
US10706970B1 (en) | 2015-04-06 | 2020-07-07 | EMC IP Holding Company LLC | Distributed data analytics |
US10656861B1 (en) | 2015-12-29 | 2020-05-19 | EMC IP Holding Company LLC | Scalable distributed in-memory computation |
US10838613B2 (en) | 2016-02-17 | 2020-11-17 | Trufan Llc | Consumer electronic entertainment and display system |
US20170238041A1 (en) * | 2016-02-17 | 2017-08-17 | Christopher Alsante | Consumer electronic entertainment and display system |
US11451865B1 (en) * | 2019-04-11 | 2022-09-20 | CUE Audio, LLC | Relay of audio and/or video steganography to improve second screen experiences |
US11303943B2 (en) * | 2019-10-16 | 2022-04-12 | Dish Network L.L.C. | Systems and methods for facilitating adaptive content items for delivery in a packet stream |
CN114503600A (en) * | 2019-10-31 | 2022-05-13 | 六科股份有限公司 | Content modification system with delay buffer feature |
US10795864B1 (en) | 2019-12-30 | 2020-10-06 | Tritan Software Corporation | Method and apparatus for bi-directional communication and data replication between local and remote databases during intermittent connectivity |
US11245946B2 (en) | 2020-01-21 | 2022-02-08 | Dish Network L.L.C. | Systems and methods for adapting content items to secured endpoint media device data |
Family Cites Families (314)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5321750A (en) | 1989-02-07 | 1994-06-14 | Market Data Corporation | Restricted information distribution system apparatus and methods |
US5353121A (en) | 1989-10-30 | 1994-10-04 | Starsight Telecast, Inc. | Television schedule system |
US5287489A (en) | 1990-10-30 | 1994-02-15 | Hughes Training, Inc. | Method and system for authoring, editing and testing instructional materials for use in simulated trailing systems |
US5592551A (en) | 1992-12-01 | 1997-01-07 | Scientific-Atlanta, Inc. | Method and apparatus for providing interactive electronic programming guide |
ATE219615T1 (en) | 1992-12-09 | 2002-07-15 | Discovery Communicat Inc | NETWORK CONTROL FOR CABLE TELEVISION DISTRIBUTION SYSTEMS |
US7721307B2 (en) | 1992-12-09 | 2010-05-18 | Comcast Ip Holdings I, Llc | Method and apparatus for targeting of interactive virtual objects |
US5539449A (en) | 1993-05-03 | 1996-07-23 | At&T Corp. | Integrated television services system |
US6239794B1 (en) | 1994-08-31 | 2001-05-29 | E Guide, Inc. | Method and system for simultaneously displaying a television program and information about the program |
US5485221A (en) | 1993-06-07 | 1996-01-16 | Scientific-Atlanta, Inc. | Subscription television system and terminal for enabling simultaneous display of multiple services |
US5594509A (en) | 1993-06-22 | 1997-01-14 | Apple Computer, Inc. | Method and apparatus for audio-visual interface for the display of multiple levels of information on a display |
US5621456A (en) | 1993-06-22 | 1997-04-15 | Apple Computer, Inc. | Methods and apparatus for audio-visual interface for the display of multiple program categories |
US5589892A (en) | 1993-09-09 | 1996-12-31 | Knee; Robert A. | Electronic television program guide schedule system and method with data feed access |
EP0663639A1 (en) | 1994-01-14 | 1995-07-19 | International Business Machines Corporation | Method for creating a multimedia application |
AU2594595A (en) | 1994-05-16 | 1995-12-05 | Apple Computer, Inc. | Pattern and color abstraction in a graphical user interface |
WO1995031773A1 (en) | 1994-05-16 | 1995-11-23 | Apple Computer, Inc. | Switching between appearance/behavior themes in graphical user interfaces |
US5675752A (en) | 1994-09-15 | 1997-10-07 | Sony Corporation | Interactive applications generator for an interactive presentation environment |
US20050086172A1 (en) | 1994-11-23 | 2005-04-21 | Contentguard Holdings, Inc. | Method, system and device for providing educational content |
US5758257A (en) | 1994-11-29 | 1998-05-26 | Herz; Frederick | System and method for scheduling broadcast of and access to video programs and other data using customer profiles |
US6008803A (en) | 1994-11-29 | 1999-12-28 | Microsoft Corporation | System for displaying programming information |
US6005561A (en) | 1994-12-14 | 1999-12-21 | The 3Do Company | Interactive information delivery system |
US5826102A (en) | 1994-12-22 | 1998-10-20 | Bell Atlantic Network Services, Inc. | Network arrangement for development delivery and presentation of multimedia applications using timelines to integrate multimedia objects and program objects |
US5659793A (en) | 1994-12-22 | 1997-08-19 | Bell Atlantic Video Services, Inc. | Authoring tools for multimedia application development and network delivery |
US6426779B1 (en) | 1995-01-04 | 2002-07-30 | Sony Electronics, Inc. | Method and apparatus for providing favorite station and programming information in a multiple station broadcast system |
US5583563A (en) | 1995-01-12 | 1996-12-10 | Us West Marketing Resources Group, Inc. | Method and system for delivering an application in an interactive television network |
JPH08314979A (en) | 1995-03-13 | 1996-11-29 | Matsushita Electric Ind Co Ltd | Method and device for displaying program information on display |
US5666645A (en) | 1995-04-26 | 1997-09-09 | News America Publications, Inc. | Data management and distribution system and method for an electronic television program guide |
FR2736783B1 (en) | 1995-07-13 | 1997-08-14 | Thomson Multimedia Sa | METHOD AND APPARATUS FOR RECORDING AND PLAYBACK WITH LARGE CAPACITY RECORDING MEDIUM |
US5860073A (en) | 1995-07-17 | 1999-01-12 | Microsoft Corporation | Style sheets for publishing system |
US5801753A (en) | 1995-08-11 | 1998-09-01 | General Instrument Corporation Of Delaware | Method and apparatus for providing an interactive guide to events available on an information network |
US6002394A (en) | 1995-10-02 | 1999-12-14 | Starsight Telecast, Inc. | Systems and methods for linking television viewers with advertisers and broadcasters |
US6049823A (en) | 1995-10-04 | 2000-04-11 | Hwang; Ivan Chung-Shung | Multi server, interactive, video-on-demand television system utilizing a direct-access-on-demand workgroup |
US5802284A (en) | 1995-12-13 | 1998-09-01 | Silicon Graphics, Inc. | System and method using cover bundles to provide immediate feedback to a user in an interactive television environment |
US5694176A (en) | 1996-02-29 | 1997-12-02 | Hughes Electronics | Method and apparatus for generating television program guides with category selection overlay |
US20020049832A1 (en) | 1996-03-08 | 2002-04-25 | Craig Ullman | Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments |
US6025837A (en) | 1996-03-29 | 2000-02-15 | Micrsoft Corporation | Electronic program guide with hyperlinks to target resources |
US6240555B1 (en) | 1996-03-29 | 2001-05-29 | Microsoft Corporation | Interactive entertainment system for presenting supplemental interactive content together with continuous video programs |
US6314420B1 (en) | 1996-04-04 | 2001-11-06 | Lycos, Inc. | Collaborative/adaptive search engine |
US5657072A (en) | 1996-04-10 | 1997-08-12 | Microsoft Corporation | Interactive entertainment network system and method for providing program listings during non-peak times |
US5852435A (en) | 1996-04-12 | 1998-12-22 | Avid Technology, Inc. | Digital multimedia editing and data management system |
US5929849A (en) | 1996-05-02 | 1999-07-27 | Phoenix Technologies, Ltd. | Integration of dynamic universal resource locators with television presentations |
JPH09307827A (en) | 1996-05-16 | 1997-11-28 | Sharp Corp | Channel selection device |
US6008836A (en) | 1996-06-03 | 1999-12-28 | Webtv Networks, Inc. | Method and apparatus for adjusting television display control using a browser |
US6016144A (en) | 1996-08-14 | 2000-01-18 | Samsung Electronics Co., Ltd. | Multi-layered television graphical user interface |
US6191781B1 (en) | 1996-08-14 | 2001-02-20 | Samsung Electronics, Ltd. | Television graphical user interface that combines electronic program guide with graphical channel changer |
US5892902A (en) | 1996-09-05 | 1999-04-06 | Clark; Paul C. | Intelligent token protected system with network authentication |
US20030093790A1 (en) | 2000-03-28 | 2003-05-15 | Logan James D. | Audio and video program recording, editing and playback systems using metadata |
US6172677B1 (en) | 1996-10-07 | 2001-01-09 | Compaq Computer Corporation | Integrated content guide for interactive selection of content and services on personal computer systems with multiple sources and multiple media presentation |
US5905492A (en) | 1996-12-06 | 1999-05-18 | Microsoft Corporation | Dynamically updating themes for an operating system shell |
US6061695A (en) | 1996-12-06 | 2000-05-09 | Microsoft Corporation | Operating system shell having a windowing graphical user interface with a desktop displayed as a hypertext multimedia document |
US6405239B1 (en) | 1996-12-09 | 2002-06-11 | Scientific-Atlanta, Inc. | Using a hierarchical file system for indexing data broadcast to a client from a network of servers |
US6067108A (en) | 1996-12-12 | 2000-05-23 | Trw Inc. | Solid-state mass storage data stream generator |
US6177931B1 (en) | 1996-12-19 | 2001-01-23 | Index Systems, Inc. | Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information |
US5892905A (en) | 1996-12-23 | 1999-04-06 | International Business Machines Corporation | Computer apparatus and method for providing a common user interface for software applications accessed via the world-wide web |
US5850218A (en) | 1997-02-19 | 1998-12-15 | Time Warner Entertainment Company L.P. | Inter-active program guide with default selection control |
US6141003A (en) | 1997-03-18 | 2000-10-31 | Microsoft Corporation | Channel bar user interface for an entertainment system |
US6072483A (en) | 1997-06-02 | 2000-06-06 | Sony Corporation | Active frame scroll interface |
US6195692B1 (en) | 1997-06-02 | 2001-02-27 | Sony Corporation | Television/internet system having multiple data stream connections |
US6292827B1 (en) | 1997-06-20 | 2001-09-18 | Shore Technologies (1999) Inc. | Information transfer systems and method with dynamic distribution of data, control and management of information |
US6317885B1 (en) | 1997-06-26 | 2001-11-13 | Microsoft Corporation | Interactive entertainment and information system using television set-top box |
US5990890A (en) | 1997-08-25 | 1999-11-23 | Liberate Technologies | System for data entry and navigation in a user interface |
US6108711A (en) | 1998-09-11 | 2000-08-22 | Genesys Telecommunications Laboratories, Inc. | Operating system having external media layer, workflow layer, internal media layer, and knowledge base for routing media events between transactions |
US5996025A (en) | 1997-10-31 | 1999-11-30 | International Business Machines Corp. | Network transparent access framework for multimedia serving |
US6205582B1 (en) | 1997-12-09 | 2001-03-20 | Ictv, Inc. | Interactive cable television system with frame server |
JPH11187324A (en) | 1997-12-19 | 1999-07-09 | Matsushita Electric Ind Co Ltd | Program information preparing device, its method and receiver |
US7152236B1 (en) | 1998-01-05 | 2006-12-19 | Gateway Inc. | Integration of internet sources into an electronic program database list |
US20030056216A1 (en) | 1998-01-05 | 2003-03-20 | Theodore D. Wugofski | System for managing favorite channels |
EP2076033A3 (en) | 1998-03-04 | 2009-09-30 | United Video Properties, Inc. | Program guide system with targeted advertising |
US6459427B1 (en) | 1998-04-01 | 2002-10-01 | Liberate Technologies | Apparatus and method for web-casting over digital broadcast TV network |
US6564379B1 (en) | 1998-04-30 | 2003-05-13 | United Video Properties, Inc. | Program guide system with flip and browse advertisements |
US6219839B1 (en) | 1998-05-12 | 2001-04-17 | Sharp Laboratories Of America, Inc. | On-screen electronic resources guide |
US6148081A (en) | 1998-05-29 | 2000-11-14 | Opentv, Inc. | Security model for interactive television applications |
US6314573B1 (en) | 1998-05-29 | 2001-11-06 | Diva Systems Corporation | Method and apparatus for providing subscription-on-demand services for an interactive information distribution system |
US6427238B1 (en) | 1998-05-29 | 2002-07-30 | Opentv, Inc. | Module manager for interactive television system |
US6636887B1 (en) | 1998-06-02 | 2003-10-21 | Mark A. Augeri | Tele-jam system and method for real-time musical interaction |
EP0963115A1 (en) | 1998-06-05 | 1999-12-08 | THOMSON multimedia | Apparatus and method for selecting viewers' profile in interactive TV |
US6763522B1 (en) | 1998-06-30 | 2004-07-13 | Sony Corporation | System and method for a digital television electronic program guide |
US6442755B1 (en) | 1998-07-07 | 2002-08-27 | United Video Properties, Inc. | Electronic program guide using markup language |
CN1867068A (en) | 1998-07-14 | 2006-11-22 | 联合视频制品公司 | Client-server based interactive television program guide system with remote server recording |
AR020608A1 (en) | 1998-07-17 | 2002-05-22 | United Video Properties Inc | A METHOD AND A PROVISION TO SUPPLY A USER REMOTE ACCESS TO AN INTERACTIVE PROGRAMMING GUIDE BY A REMOTE ACCESS LINK |
US6754905B2 (en) | 1998-07-23 | 2004-06-22 | Diva Systems Corporation | Data structure and methods for providing an interactive program guide |
AR019458A1 (en) | 1998-07-23 | 2002-02-20 | United Video Properties Inc | AN INTERACTIVE TELEVISION PROGRAMMING GUIDE PROVISION THAT SERVES AS AN ENTRY |
US7254823B2 (en) | 1998-08-21 | 2007-08-07 | United Video Properties, Inc. | Apparatus and method for constrained selection of favorite channels |
US6898762B2 (en) | 1998-08-21 | 2005-05-24 | United Video Properties, Inc. | Client-server electronic program guide |
TW463503B (en) | 1998-08-26 | 2001-11-11 | United Video Properties Inc | Television chat system |
US6162697A (en) | 1998-10-13 | 2000-12-19 | Institute Of Microelectronics | High Q inductor realization for use in MMIC circuits |
US7313806B1 (en) | 1998-10-30 | 2007-12-25 | Intel Corporation | Method and apparatus for channel surfing through multiple sources based on user-definable preferences |
US6678891B1 (en) | 1998-11-19 | 2004-01-13 | Prasara Technologies, Inc. | Navigational user interface for interactive television |
US6314569B1 (en) | 1998-11-25 | 2001-11-06 | International Business Machines Corporation | System for video, audio, and graphic presentation in tandem with video/audio play |
US6804825B1 (en) | 1998-11-30 | 2004-10-12 | Microsoft Corporation | Video on demand methods and systems |
US6766526B1 (en) | 1998-12-03 | 2004-07-20 | United Video Properties, Inc. | Smart channel entry system |
EP1131953B1 (en) | 1998-12-04 | 2005-04-20 | Index Systems Inc. | System and method for providing news, sports, and local guide services through an electronic program guide |
US6564263B1 (en) | 1998-12-04 | 2003-05-13 | International Business Machines Corporation | Multimedia content description framework |
US20030001880A1 (en) | 2001-04-18 | 2003-01-02 | Parkervision, Inc. | Method, system, and computer program product for producing and distributing enhanced media |
US6169543B1 (en) | 1998-12-28 | 2001-01-02 | Thomson Licensing S.A. | System and method for customizing program guide information to include reminder item or local identifier |
US6621509B1 (en) | 1999-01-08 | 2003-09-16 | Ati International Srl | Method and apparatus for providing a three dimensional graphical user interface |
US6591292B1 (en) | 1999-01-08 | 2003-07-08 | Thomson Licensing S.A. | Method and interface for incorporating program information into an electronic message |
US6522342B1 (en) | 1999-01-27 | 2003-02-18 | Hughes Electronics Corporation | Graphical tuning bar for a multi-program data stream |
BR0008646A (en) | 1999-02-08 | 2002-09-03 | United Video Properties Inc | Electronic program guide with support for rich program content |
US6532589B1 (en) | 1999-03-25 | 2003-03-11 | Sony Corp. | Method and apparatus for providing a calendar-based planner in an electronic program guide for broadcast events |
US6754906B1 (en) | 1999-03-29 | 2004-06-22 | The Directv Group, Inc. | Categorical electronic program guide |
US6658661B1 (en) | 1999-03-29 | 2003-12-02 | Hughes Electronics Corporation | Carousel bit mask system and method |
US6281940B1 (en) | 1999-03-31 | 2001-08-28 | Sony Corporation | Display of previewed channels with rotation of multiple previewed channels along an arc |
US6938270B2 (en) | 1999-04-07 | 2005-08-30 | Microsoft Corporation | Communicating scripts in a data service channel of a video signal |
US6904610B1 (en) | 1999-04-15 | 2005-06-07 | Sedna Patent Services, Llc | Server-centric customized interactive program guide in an interactive television environment |
US6571392B1 (en) | 1999-04-20 | 2003-05-27 | Webtv Networks, Inc. | Receiving an information resource from the internet if it is not received from a broadcast channel |
US6567104B1 (en) | 1999-05-20 | 2003-05-20 | Microsoft Corporation | Time-based dynamic user interface elements |
US7065785B1 (en) | 1999-06-15 | 2006-06-20 | Siemens Communications, Inc. | Apparatus and method for TOL client boundary protection |
US6529950B1 (en) | 1999-06-17 | 2003-03-04 | International Business Machines Corporation | Policy-based multivariate application-level QoS negotiation for multimedia services |
JP2003503907A (en) | 1999-06-28 | 2003-01-28 | ユナイテッド ビデオ プロパティーズ, インコーポレイテッド | Interactive television program guide system and method with niche hub |
US7103904B1 (en) | 1999-06-30 | 2006-09-05 | Microsoft Corporation | Methods and apparatus for broadcasting interactive advertising using remote advertising templates |
US6415438B1 (en) | 1999-10-05 | 2002-07-02 | Webtv Networks, Inc. | Trigger having a time attribute |
AU6756000A (en) | 1999-08-03 | 2001-02-19 | America Online, Inc. | Varying electronic content based on local context |
US6292187B1 (en) | 1999-09-27 | 2001-09-18 | Sony Electronics, Inc. | Method and system for modifying the visual presentation and response to user action of a broadcast application's user interface |
US7086002B2 (en) | 1999-09-27 | 2006-08-01 | International Business Machines Corporation | System and method for creating and editing, an on-line publication |
US7134072B1 (en) | 1999-10-13 | 2006-11-07 | Microsoft Corporation | Methods and systems for processing XML documents |
US7213005B2 (en) | 1999-12-09 | 2007-05-01 | International Business Machines Corporation | Digital content distribution using web broadcasting services |
US20020124255A1 (en) | 1999-12-10 | 2002-09-05 | United Video Properties, Inc. | Systems and methods for coordinating interactive and passive advertisement and merchandising opportunities |
AU2071601A (en) | 1999-12-10 | 2001-06-18 | United Video Properties, Inc. | Features for use with advanced set-top applications on interactive television systems |
US20060059525A1 (en) | 1999-12-13 | 2006-03-16 | Jerding Dean F | Media services window configuration system |
US20020026642A1 (en) | 1999-12-15 | 2002-02-28 | Augenbraun Joseph E. | System and method for broadcasting web pages and other information |
US7228556B2 (en) | 1999-12-21 | 2007-06-05 | Tivo Inc. | Distributed, interactive television program guide; system and method |
JP2003518683A (en) | 1999-12-24 | 2003-06-10 | ラヴェンパック アクチェンゲゼルシャフト | Method and apparatus for presenting data to a user |
WO2001052554A1 (en) | 2000-01-10 | 2001-07-19 | Koninklijke Philips Electronics N.V. | Method of setting a system time clock at the start of an mpeg sequence |
US6421067B1 (en) | 2000-01-16 | 2002-07-16 | Isurftv | Electronic programming guide |
US8413185B2 (en) | 2000-02-01 | 2013-04-02 | United Video Properties, Inc. | Interactive television application with navigable cells and regions |
US7028327B1 (en) | 2000-02-02 | 2006-04-11 | Wink Communications | Using the electronic program guide to synchronize interactivity with broadcast programs |
US20020016969A1 (en) | 2000-02-03 | 2002-02-07 | International Business Machines Corporation | Media on demand system and method |
AU2001241459A1 (en) | 2000-02-08 | 2001-08-20 | Kovač, Mario | System and method for advertisement sponsored content distribution |
US6857128B1 (en) | 2000-02-14 | 2005-02-15 | Sharp Laboratories Of America | Electronic programming guide browsing system |
US7337457B2 (en) | 2000-04-12 | 2008-02-26 | Lg Electronics Inc. | Apparatus and method for providing and obtaining product information through a broadcast signal |
US7305696B2 (en) | 2000-04-17 | 2007-12-04 | Triveni Digital, Inc. | Three part architecture for digital television data broadcasting |
US7979880B2 (en) | 2000-04-21 | 2011-07-12 | Cox Communications, Inc. | Method and system for profiling iTV users and for providing selective content delivery |
US20020059586A1 (en) | 2000-04-24 | 2002-05-16 | John Carney | Method and system for personalization and authorization of interactive television content |
US20020010928A1 (en) | 2000-04-24 | 2002-01-24 | Ranjit Sahota | Method and system for integrating internet advertising with television commercials |
US7523180B1 (en) | 2000-04-28 | 2009-04-21 | Microsoft Corporation | System and method for service chain management in a client management tool |
US8296805B2 (en) | 2000-05-30 | 2012-10-23 | Sony Corporation | Command description scheme providing for dynamic update of instance documents and their associated schema |
GB0013045D0 (en) | 2000-05-31 | 2000-07-19 | Pace Micro Tech Plc | Television system |
BR0111653A (en) | 2000-05-31 | 2004-01-13 | Prediware Corp | Universal stb control architectures and methods |
US7743330B1 (en) | 2000-06-19 | 2010-06-22 | Comcast Ip Holdings I, Llc | Method and apparatus for placing virtual objects |
US8495679B2 (en) | 2000-06-30 | 2013-07-23 | Thomson Licensing | Method and apparatus for delivery of television programs and targeted de-coupled advertising |
US6918131B1 (en) | 2000-07-10 | 2005-07-12 | Nokia Corporation | Systems and methods for characterizing television preferences over a wireless network |
US20020070978A1 (en) | 2000-07-13 | 2002-06-13 | Clayton Wishoff | Dynamically configurable graphical user environment |
AU2001284365A1 (en) | 2000-07-31 | 2002-02-13 | The Consumer Media Company Inc. | Improved user-driven data network communication system and method |
US7464344B1 (en) | 2000-08-14 | 2008-12-09 | Connie Carmichael | Systems and methods for immersive advertising |
CA2419409A1 (en) | 2000-08-21 | 2002-02-28 | Intellocity Usa, Inc. | System and method for television enhancement |
US20020059629A1 (en) | 2000-08-21 | 2002-05-16 | Markel Steven O. | Detection and recognition of data receiver to facilitate proper transmission of enhanced data |
US6760043B2 (en) | 2000-08-21 | 2004-07-06 | Intellocity Usa, Inc. | System and method for web based enhanced interactive television content page layout |
AU2001290546A1 (en) | 2000-08-22 | 2002-03-04 | Akamai Technologies, Inc. | Dynamic content assembly on edge-of-network servers in a content delivery network |
US7225456B2 (en) | 2001-04-23 | 2007-05-29 | Sony Corporation | Gateway screen for interactive television |
US20030097657A1 (en) | 2000-09-14 | 2003-05-22 | Yiming Zhou | Method and system for delivery of targeted programming |
US8302127B2 (en) | 2000-09-25 | 2012-10-30 | Thomson Licensing | System and method for personalized TV |
US20020042915A1 (en) | 2000-10-06 | 2002-04-11 | Kubischta Raymond L. | Interactive, off-screen entertainment guide for program selection and control |
US6497449B2 (en) | 2000-10-11 | 2002-12-24 | Actuant Corporation | Surface mount slide-out system |
PT1947858E (en) | 2000-10-11 | 2014-07-28 | United Video Properties Inc | Systems and methods for supplementing on-demand media |
US7516468B1 (en) | 2000-10-12 | 2009-04-07 | Oracle International Corporation | Interactive media presentation system for presenting business data over a digital television network |
DE60143848D1 (en) | 2000-10-15 | 2011-02-24 | Directv Group Inc | METHOD AND SYSTEM FOR ADVERTISING DURING A PAUSE |
US8023421B2 (en) * | 2002-07-25 | 2011-09-20 | Avaya Inc. | Method and apparatus for the assessment and optimization of network traffic |
US7913286B2 (en) | 2000-10-20 | 2011-03-22 | Ericsson Television, Inc. | System and method for describing presentation and behavior information in an ITV application |
US20020156839A1 (en) | 2000-10-26 | 2002-10-24 | Scott Peterson | System for providing localized content information via wireless personal communication devices |
US8516047B2 (en) | 2000-11-06 | 2013-08-20 | Rick Castanho | System and method for service specific notification |
US7099946B2 (en) | 2000-11-13 | 2006-08-29 | Canon Kabushiki Kaisha | Transferring a media browsing session from one device to a second device by transferring a session identifier and a session key to the second device |
US7207057B1 (en) | 2000-11-16 | 2007-04-17 | Rowe Lynn T | System and method for collaborative, peer-to-peer creation, management & synchronous, multi-platform distribution of profile-specified media objects |
US7124424B2 (en) | 2000-11-27 | 2006-10-17 | Sedna Patent Services, Llc | Method and apparatus for providing interactive program guide (IPG) and video-on-demand (VOD) user interfaces |
US8046799B2 (en) | 2000-11-27 | 2011-10-25 | The Directv Group, Inc. | Daypart based navigation paradigm |
US20020069407A1 (en) | 2000-11-28 | 2002-06-06 | Navic Systems, Incorporated | System and method for reporting counted impressions |
US20020083450A1 (en) | 2000-12-01 | 2002-06-27 | Yakov Kamen | Method and system for content-based broadcasted program selection |
US7174512B2 (en) | 2000-12-01 | 2007-02-06 | Thomson Licensing S.A. | Portal for a communications system |
US20030023970A1 (en) | 2000-12-11 | 2003-01-30 | Ruston Panabaker | Interactive television schema |
US20020078444A1 (en) | 2000-12-15 | 2002-06-20 | William Krewin | System and method for the scaleable delivery of targeted commercials |
US20080060011A1 (en) | 2000-12-22 | 2008-03-06 | Hillcrest Laboratories, Inc. | Zoomable user interfaces for television |
JP4765182B2 (en) | 2001-01-19 | 2011-09-07 | ソニー株式会社 | Interactive television communication method and interactive television communication client device |
EP1356381B1 (en) | 2001-02-02 | 2015-11-11 | Opentv, Inc. | A method and apparatus for the compilation of an interpretative language for interactive television |
US7017175B2 (en) | 2001-02-02 | 2006-03-21 | Opentv, Inc. | Digital television application protocol for interactive television |
WO2002063878A2 (en) | 2001-02-02 | 2002-08-15 | Opentv, Inc. | A method and apparatus for reformatting of content for display on interactive television |
US7114170B2 (en) | 2001-02-07 | 2006-09-26 | Neoris Usa, Inc. | Method and apparatus for providing interactive media presentation |
US7162694B2 (en) | 2001-02-13 | 2007-01-09 | Microsoft Corporation | Method for entering text |
US8769566B2 (en) | 2001-03-02 | 2014-07-01 | Jlb Ventures Llc | Method and system for advertising based on the content of selected channels or broadcasted programs |
US6886029B1 (en) | 2001-03-13 | 2005-04-26 | Panamsat Corporation | End to end simulation of a content delivery system |
US20030018755A1 (en) | 2001-03-30 | 2003-01-23 | Masterson Robert J. | Online system that facilitates configuration and administration of residential electronic devices |
US20020144269A1 (en) | 2001-03-30 | 2002-10-03 | Connelly Jay H. | Apparatus and method for a dynamic electronic program guide enabling billing broadcast services per EPG line item |
GB0108354D0 (en) | 2001-04-03 | 2001-05-23 | Thirdspace Living Ltd | System and method for providing a user with access to a plurality of services and content from a broadband television service |
US6806887B2 (en) | 2001-04-04 | 2004-10-19 | International Business Machines Corporation | System for integrating personalized data with visual content |
US8060906B2 (en) | 2001-04-06 | 2011-11-15 | At&T Intellectual Property Ii, L.P. | Method and apparatus for interactively retrieving content related to previous query results |
US20050005288A1 (en) | 2001-04-13 | 2005-01-06 | Digeo, Inc. | System and method for personalized remote control of an interactive television system |
US8566873B2 (en) | 2001-04-23 | 2013-10-22 | Starz Entertainment, Llc | Program guide enhancements |
WO2002086591A1 (en) | 2001-04-23 | 2002-10-31 | Reveo, Inc. | Image display system and electrically actuatable image combiner therefor |
US20020162120A1 (en) | 2001-04-25 | 2002-10-31 | Slade Mitchell | Apparatus and method to provide supplemental content from an interactive television system to a remote device |
US7584491B2 (en) | 2001-04-25 | 2009-09-01 | Sony Corporation | System and method for managing interactive programming and advertisements in interactive broadcast systems |
US6727930B2 (en) | 2001-05-18 | 2004-04-27 | Hewlett-Packard Development Company, L.P. | Personal digital assistant with streaming information display |
US8291457B2 (en) | 2001-05-24 | 2012-10-16 | Vixs Systems, Inc. | Channel selection in a multimedia system |
US20040031015A1 (en) | 2001-05-24 | 2004-02-12 | Conexant Systems, Inc. | System and method for manipulation of software |
US7873972B2 (en) | 2001-06-01 | 2011-01-18 | Jlb Ventures Llc | Method and apparatus for generating a mosaic style electronic program guide |
US7076734B2 (en) | 2001-06-22 | 2006-07-11 | Microsoft Corporation | Systems and methods for providing a dynamically controllable user interface that embraces a variety of media |
US6760918B2 (en) | 2001-06-29 | 2004-07-06 | Scientific-Atlanta, Inc. | Method and apparatus for recordable media content distribution |
US7406705B2 (en) | 2001-06-29 | 2008-07-29 | Intel Corporation | Carousel exhibiting multiple occurrences of a module |
US7886003B2 (en) | 2001-07-06 | 2011-02-08 | Ericsson Television, Inc. | System and method for creating interactive events |
US20100175084A1 (en) | 2001-07-12 | 2010-07-08 | Ellis Michael D | Interactive television system with messaging and related promotions |
WO2003009126A1 (en) | 2001-07-19 | 2003-01-30 | Digeo, Inc. | System and method for managing television programs within an entertainment system |
US20030066081A1 (en) | 2001-07-27 | 2003-04-03 | Barone Samuel T. | Command protocol for interactive TV production tools |
DE60239067D1 (en) | 2001-08-02 | 2011-03-10 | Intellocity Usa Inc | PREPARATION OF DISPLAY CHANGES |
US7908628B2 (en) | 2001-08-03 | 2011-03-15 | Comcast Ip Holdings I, Llc | Video and digital multimedia aggregator content coding and formatting |
JP4201706B2 (en) | 2001-08-06 | 2008-12-24 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | System and method for combining several EPG sources into one reliable EPG |
AU2002355602A1 (en) | 2001-08-06 | 2003-02-24 | Digeo, Inc. | System and method to provide local content and corresponding applications via carousel transmission |
CA2456984C (en) | 2001-08-16 | 2013-07-16 | Goldpocket Interactive, Inc. | Interactive television tracking system |
US7155675B2 (en) | 2001-08-29 | 2006-12-26 | Digeo, Inc. | System and method for focused navigation within a user interface |
US20030070170A1 (en) | 2001-09-07 | 2003-04-10 | Eric Lennon | Method and apparatus providing an improved electronic program guide in a cable television system |
US8413205B2 (en) | 2001-09-19 | 2013-04-02 | Tvworks, Llc | System and method for construction, delivery and display of iTV content |
US8365230B2 (en) | 2001-09-19 | 2013-01-29 | Tvworks, Llc | Interactive user interface for television applications |
US20030079226A1 (en) | 2001-10-19 | 2003-04-24 | Barrett Peter T. | Video segment targeting using remotely issued instructions and localized state and behavior information |
US20030110503A1 (en) | 2001-10-25 | 2003-06-12 | Perkes Ronald M. | System, method and computer program product for presenting media to a user in a media on demand framework |
US20030084443A1 (en) | 2001-11-01 | 2003-05-01 | Commerce Tv Corporation, Inc. | System and method for creating program enhancements for use in an interactive broadcast network |
US6910191B2 (en) | 2001-11-02 | 2005-06-21 | Nokia Corporation | Program guide data selection device |
US20030086694A1 (en) | 2001-11-07 | 2003-05-08 | Nokia Corporation | Recording program scheduling information in an electronic calendar |
US20030110500A1 (en) | 2001-12-06 | 2003-06-12 | Rodriguez Arturo A. | Prediction-based adaptative control of television viewing functionality |
US7149750B2 (en) | 2001-12-19 | 2006-12-12 | International Business Machines Corporation | Method, system and program product for extracting essence from a multimedia file received in a first format, creating a metadata file in a second file format and using a unique identifier assigned to the essence to access the essence and metadata file |
US20030126601A1 (en) | 2001-12-31 | 2003-07-03 | Koninklijke Philips Electronics N.V. | Visualization of entertainment content |
US7293275B1 (en) | 2002-02-08 | 2007-11-06 | Microsoft Corporation | Enhanced video content information associated with video programs |
US7363612B2 (en) | 2002-03-06 | 2008-04-22 | Sun Microsystems, Inc. | Application programs with dynamic components |
US20050125835A1 (en) | 2002-03-11 | 2005-06-09 | Koninklijke Philips Electronics N.V. | Service/channel installation |
US7703116B1 (en) | 2003-07-11 | 2010-04-20 | Tvworks, Llc | System and method for construction, delivery and display of iTV applications that blend programming information of on-demand and broadcast service offerings |
US20030182663A1 (en) | 2002-03-25 | 2003-09-25 | Sony Corporation | System and method for sharing user comments on TV screens |
US7197715B1 (en) | 2002-03-29 | 2007-03-27 | Digeo, Inc. | System and method to provide customized graphical user interfaces via an interactive video casting network |
US20040078814A1 (en) | 2002-03-29 | 2004-04-22 | Digeo, Inc. | Module-based interactive television ticker |
US8555313B2 (en) | 2002-04-09 | 2013-10-08 | Ericsson Television Inc. | System and method for coordinating interactive television programs |
DE10218812A1 (en) | 2002-04-26 | 2003-11-20 | Siemens Ag | Generic stream description |
US7155674B2 (en) | 2002-04-29 | 2006-12-26 | Seachange International, Inc. | Accessing television services |
US8832754B2 (en) | 2002-05-03 | 2014-09-09 | Tvworks, Llc | System and method for providing synchronized events to a television application |
US7177658B2 (en) * | 2002-05-06 | 2007-02-13 | Qualcomm, Incorporated | Multi-media broadcast and multicast service (MBMS) in a wireless communications system |
US7899915B2 (en) * | 2002-05-10 | 2011-03-01 | Richard Reisman | Method and apparatus for browsing using multiple coordinated device sets |
WO2003102821A1 (en) | 2002-05-31 | 2003-12-11 | Context Media, Inc. | Cataloging and managing the distribution of distributed digital assets |
US20030226141A1 (en) | 2002-06-03 | 2003-12-04 | Krasnow Genessa L. | Advertisement data store |
KR100994949B1 (en) | 2002-06-11 | 2010-11-18 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Method of filtering a bitstream according to user specifications |
US20040003402A1 (en) | 2002-06-27 | 2004-01-01 | Digeo, Inc. | Method and apparatus for automatic ticker generation based on implicit or explicit profiling |
US7237252B2 (en) | 2002-06-27 | 2007-06-26 | Digeo, Inc. | Method and apparatus to invoke a shopping ticker |
US9445133B2 (en) | 2002-07-10 | 2016-09-13 | Arris Enterprises, Inc. | DVD conversion for on demand |
US20040019900A1 (en) | 2002-07-23 | 2004-01-29 | Philip Knightbridge | Integration platform for interactive communications and management of video on demand services |
WO2004019602A2 (en) | 2002-08-21 | 2004-03-04 | Disney Enterprises, Inc. | Digital home movie library |
AU2003268273B2 (en) | 2002-08-30 | 2007-07-26 | Opentv, Inc | Carousel proxy |
US7092966B2 (en) | 2002-09-13 | 2006-08-15 | Eastman Kodak Company | Method software program for creating an image product having predefined criteria |
US7702315B2 (en) | 2002-10-15 | 2010-04-20 | Varia Holdings Llc | Unified communication thread for wireless mobile communication devices |
KR100513736B1 (en) | 2002-12-05 | 2005-09-08 | 삼성전자주식회사 | Method and system for generation input file using meta language regarding graphic data compression |
US20040172648A1 (en) | 2003-02-28 | 2004-09-02 | Shan Xu | Channel navigation based on channel attribute information |
US7039655B2 (en) | 2003-04-07 | 2006-05-02 | Mesoft Partners, Llc | System and method for providing a digital media supply chain operation system and suite of applications |
KR100518825B1 (en) | 2003-04-30 | 2005-10-06 | 삼성전자주식회사 | Real time channel grouping method and the apparatus thereof |
WO2005008993A1 (en) | 2003-07-23 | 2005-01-27 | Canon Kabushiki Kaisha | Description document for a service offered by a server in a communication network and method of validating a multimedia document |
JP2006033795A (en) | 2004-06-15 | 2006-02-02 | Sanyo Electric Co Ltd | Remote control system, controller, program for imparting function of controller to computer, storage medium with the program stored thereon, and server |
US8190680B2 (en) * | 2004-07-01 | 2012-05-29 | Netgear, Inc. | Method and system for synchronization of digital media playback |
EP1641261A2 (en) * | 2004-09-28 | 2006-03-29 | T.P.G. Podium Israel Ltd. | Method and means for interaction of viewers with television programmes via cellular mobile terminals |
US7440967B2 (en) | 2004-11-10 | 2008-10-21 | Xerox Corporation | System and method for transforming legacy documents into XML documents |
US20060105793A1 (en) | 2004-11-12 | 2006-05-18 | Gutowski Gerald J | Broadcast message services for communication devices engaged in push-to-talk communication |
US20060200842A1 (en) | 2005-03-01 | 2006-09-07 | Microsoft Corporation | Picture-in-picture (PIP) alerts |
US7587415B2 (en) | 2005-03-14 | 2009-09-08 | Microsoft Corporation | Single-pass translation of flat-file documents into XML format including validation, ambiguity resolution, and acknowledgement generation |
WO2007002820A2 (en) | 2005-06-28 | 2007-01-04 | Yahoo! Inc. | Search engine with augmented relevance ranking by community participation |
US8705195B2 (en) * | 2006-04-12 | 2014-04-22 | Winview, Inc. | Synchronized gaming and programming |
US20070220016A1 (en) | 2005-12-16 | 2007-09-20 | Antonio Estrada | Secured content syndication on a collaborative place |
US8572182B2 (en) | 2006-07-21 | 2013-10-29 | Blackberry Limited | Handling notifications in instant messaging systems |
US7624416B1 (en) | 2006-07-21 | 2009-11-24 | Aol Llc | Identifying events of interest within video content |
US8064894B1 (en) | 2006-08-07 | 2011-11-22 | Aol Inc. | Exchanging digital content |
US20080071770A1 (en) | 2006-09-18 | 2008-03-20 | Nokia Corporation | Method, Apparatus and Computer Program Product for Viewing a Virtual Database Using Portable Devices |
US7870267B2 (en) | 2007-05-16 | 2011-01-11 | International Business Machines Corporation | Creating global sessions across converged protocol applications |
EP2111014B1 (en) | 2007-06-20 | 2012-01-11 | Alcatel Lucent | Method and apparatuses of setting up a call-back by a user receiving a media stream |
JP2009025871A (en) | 2007-07-17 | 2009-02-05 | Hewlett-Packard Development Co Lp | Access restriction device and its method |
US20090094651A1 (en) * | 2007-10-09 | 2009-04-09 | Alcatel Lucent | Ethernet-Level Measurement of Multicast Group Delay Performance |
US9535988B2 (en) | 2007-12-21 | 2017-01-03 | Yahoo! Inc. | Blog-based video summarization |
US8739233B2 (en) | 2008-02-29 | 2014-05-27 | General Instrument Corporation | Method and system for providing different formats of encoded content in a switched digital video (SDV) system |
US20090228441A1 (en) | 2008-03-07 | 2009-09-10 | Bjornar Sandvik | Collaborative internet image-searching techniques |
US20090249427A1 (en) | 2008-03-25 | 2009-10-01 | Fuji Xerox Co., Ltd. | System, method and computer program product for interacting with unaltered media |
US8793256B2 (en) | 2008-03-26 | 2014-07-29 | Tout Industries, Inc. | Method and apparatus for selecting related content for display in conjunction with a media |
US9106801B2 (en) | 2008-04-25 | 2015-08-11 | Sony Corporation | Terminals, servers, and methods that find a media server to replace a sensed broadcast program/movie |
US8185405B2 (en) | 2008-05-20 | 2012-05-22 | Chuck Van Court | Method, system, and program product for information editorial controls |
US20100077057A1 (en) | 2008-09-23 | 2010-03-25 | Telefonaktiebolaget Lm Ericsson (Publ) | File Transfer in Conference Services |
US20100079670A1 (en) | 2008-09-30 | 2010-04-01 | Verizon Data Services, Llc | Multi-view content casting systems and methods |
US8175847B2 (en) | 2009-03-31 | 2012-05-08 | Microsoft Corporation | Tag ranking |
US8938675B2 (en) * | 2009-06-16 | 2015-01-20 | Harman International Industries, Incorporated | System for automated generation of audio/video control interfaces |
US8352506B2 (en) | 2009-08-31 | 2013-01-08 | Pod Poster LLC | Automatic submission of audiovisual content to desired destinations |
US9973821B2 (en) | 2009-09-03 | 2018-05-15 | Fox Broadcasting Company | Method and apparatus for concurrent broadcast of media program and social networking derived information exchange |
US8407303B2 (en) | 2009-10-13 | 2013-03-26 | Sony Corporation | Remote email or SMS control of CE device such as TV |
US8266652B2 (en) | 2009-10-15 | 2012-09-11 | At&T Intellectual Property I, L.P. | Apparatus and method for transmitting media content |
JP2013509803A (en) | 2009-10-29 | 2013-03-14 | トムソン ライセンシング | Multi-screen interactive screen architecture |
US20110131204A1 (en) | 2009-12-02 | 2011-06-02 | International Business Machines Corporation | Deriving Asset Popularity by Number of Launches |
US8660545B1 (en) | 2010-01-06 | 2014-02-25 | ILook Corporation | Responding to a video request by displaying information on a TV remote and video on the TV |
US20110214143A1 (en) | 2010-03-01 | 2011-09-01 | Rits Susan K | Mobile device application |
US9264785B2 (en) | 2010-04-01 | 2016-02-16 | Sony Computer Entertainment Inc. | Media fingerprinting for content determination and retrieval |
US8560583B2 (en) | 2010-04-01 | 2013-10-15 | Sony Computer Entertainment Inc. | Media fingerprinting for social networking |
US8516528B2 (en) * | 2010-06-30 | 2013-08-20 | Cable Television Laboratories, Inc. | Synchronization of 2nd screen applications |
US8850495B2 (en) | 2010-08-14 | 2014-09-30 | Yang Pan | Advertisement delivering system based on digital television system and mobile communication device |
US9432746B2 (en) | 2010-08-25 | 2016-08-30 | Ipar, Llc | Method and system for delivery of immersive content over communication networks |
CA2814197C (en) | 2010-11-24 | 2016-11-01 | Lg Electronics Inc. | Video display device and method of controlling the same |
US8863196B2 (en) | 2010-11-30 | 2014-10-14 | Sony Corporation | Enhanced information on mobile device for viewed program and control of internet TV device using mobile device |
GB2486002A (en) | 2010-11-30 | 2012-06-06 | Youview Tv Ltd | Media Content Provision |
EP2656294A4 (en) | 2010-12-20 | 2014-12-10 | Intel Corp | Techniques for management and presentation of content |
CN103283254B (en) | 2011-01-05 | 2018-04-06 | 汤姆逊许可公司 | Multi-screen interactive |
US20120324002A1 (en) | 2011-02-03 | 2012-12-20 | Afolio Inc. | Media Sharing |
US9674576B2 (en) | 2011-03-01 | 2017-06-06 | Ebay Inc. | Methods and systems of providing a supplemental experience based on concurrently viewed content |
US20120233646A1 (en) | 2011-03-11 | 2012-09-13 | Coniglio Straker J | Synchronous multi-platform content consumption |
WO2012154541A1 (en) | 2011-05-06 | 2012-11-15 | Thomson Licensing | Broadcast-initiated delivery of auxiliary content using triggers |
US20120324495A1 (en) | 2011-06-17 | 2012-12-20 | Microsoft Corporation | Detecting and distributing video content identities |
WO2013040533A1 (en) | 2011-09-16 | 2013-03-21 | Umami Co. | Second screen interactive platform |
JP5911262B2 (en) | 2011-10-27 | 2016-04-27 | Canon Inc. | Information processing apparatus, information processing apparatus control method, and program |
US9729942B2 (en) | 2011-11-28 | 2017-08-08 | Discovery Communications, Llc | Methods and apparatus for enhancing a digital content experience |
US8646023B2 (en) * | 2012-01-05 | 2014-02-04 | Dijit Media, Inc. | Authentication and synchronous interaction between a secondary device and a multi-perspective audiovisual data stream broadcast on a primary device geospatially proximate to the secondary device |
KR20140121395A (en) | 2012-01-06 | 2014-10-15 | 톰슨 라이센싱 | Method and system for synchronising social messages with a content timeline |
US20130298084A1 (en) | 2012-01-27 | 2013-11-07 | Bottlenose, Inc. | Targeted advertising based on trending of aggregated personalized information streams |
US20130262997A1 (en) | 2012-03-27 | 2013-10-03 | Roku, Inc. | Method and Apparatus for Displaying Information on a Secondary Screen |
WO2013162599A1 (en) * | 2012-04-27 | 2013-10-31 | Empire Technology Development Llc | Virtual machine switching based on measured network delay |
US9699513B2 (en) | 2012-06-01 | 2017-07-04 | Google Inc. | Methods and apparatus for providing access to content |
US9648369B2 (en) | 2012-06-11 | 2017-05-09 | Verizon Patent And Licensing Inc. | Cross-platform schedule management interface |
US20130347018A1 (en) | 2012-06-21 | 2013-12-26 | Amazon Technologies, Inc. | Providing supplemental content with active media |
WO2014003394A1 (en) * | 2012-06-25 | 2014-01-03 | Lg Electronics Inc. | Apparatus and method for processing an interactive service |
US20140032473A1 (en) | 2012-07-24 | 2014-01-30 | International Business Machines Corporation | Estimating potential message viewing rates of tweets |
US20140089423A1 (en) | 2012-09-27 | 2014-03-27 | United Video Properties, Inc. | Systems and methods for identifying objects displayed in a media asset |
RU2594295C1 (en) * | 2012-10-18 | 2016-08-10 | LG Electronics Inc. | Device and method for processing of interactive service |
MX345034B (en) | 2012-11-28 | 2017-01-16 | LG Electronics Inc. | Apparatus and method for processing an interactive service |
US8699862B1 (en) | 2013-02-06 | 2014-04-15 | Google Inc. | Synchronized content playback related to content recognition |
2013
- 2013-03-13 US US13/798,843 patent/US9553927B2/en active Active

2014
- 2014-03-11 CA CA2845465A patent/CA2845465C/en active Active
- 2014-03-12 EP EP14159227.9A patent/EP2779664A3/en not_active Ceased
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9451196B2 (en) | 2002-03-15 | 2016-09-20 | Comcast Cable Communications, Llc | System and method for construction, delivery and display of iTV content |
US9516253B2 (en) | 2002-09-19 | 2016-12-06 | Tvworks, Llc | Prioritized placement of content elements for iTV applications |
US11832024B2 (en) | 2008-11-20 | 2023-11-28 | Comcast Cable Communications, Llc | Method and apparatus for delivering video and video-related content at sub-asset level |
US11115722B2 (en) | 2012-11-08 | 2021-09-07 | Comcast Cable Communications, Llc | Crowdsourcing supplemental content |
Also Published As
Publication number | Publication date |
---|---|
US20140280695A1 (en) | 2014-09-18 |
EP2779664A3 (en) | 2014-10-01 |
EP2779664A2 (en) | 2014-09-17 |
US9553927B2 (en) | 2017-01-24 |
CA2845465C (en) | 2019-11-26 |
Similar Documents
Publication | Title |
---|---|
CA2845465C (en) | Synchronizing multiple transmissions of content |
US11418823B2 (en) | Delivering content | |
USRE47774E1 (en) | Synchronized viewing of media content | |
EP2868109B1 (en) | Generating a sequence of audio fingerprints at a set top box | |
US9363545B2 (en) | Apparatus and method for television | |
US20230300424A1 (en) | Policy based transcoding | |
US20160302166A1 (en) | Methods and apparatus for synchronized viewing experience across multiple devices | |
US20140282759A1 (en) | Buffering Content | |
US20120116883A1 (en) | Methods and systems for use in incorporating targeted advertising into multimedia content streams | |
JP2018521601A (en) | Automatic content recognition fingerprint sequence verification | |
US20220150293A1 (en) | Determining Location Within Video Content for Presentation to a User | |
US20230269413A1 (en) | Methods and systems for low latency streaming | |
US8713602B2 (en) | Alternate source programming | |
US9325756B2 (en) | Transmission of content fragments | |
US20210390210A1 (en) | Privacy-aware content recommendations | |
US20140278904A1 (en) | Interaction with primary and second screen content | |
US11777871B2 (en) | Delivery of multimedia components according to user activity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | Effective date: 20190311 |