WO2001015359A1 - Virtual hybrid interactive multicasting system and method - Google Patents

Virtual hybrid interactive multicasting system and method

Info

Publication number
WO2001015359A1
WO2001015359A1 (PCT/US2000/040717)
Authority
WO
WIPO (PCT)
Prior art keywords
digital resources
user terminal
receiving
communication channel
visible components
Prior art date
Application number
PCT/US2000/040717
Other languages
French (fr)
Other versions
WO2001015359A8 (en)
Inventor
Jack Bell
Original Assignee
Jack Bell
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jack Bell filed Critical Jack Bell
Priority to EP00969022A (EP1212858A1)
Priority to AU78852/00A (AU7885200A)
Publication of WO2001015359A1
Publication of WO2001015359A8

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04H BROADCAST COMMUNICATION
          • H04H20/00 Arrangements for broadcast or for distribution combined with broadcast
            • H04H20/65 Arrangements characterised by transmission systems for broadcast
              • H04H20/76 Wired systems
                • H04H20/82 Wired systems using signals not modulated onto a carrier
          • H04H60/00 Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
            • H04H60/02 Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information
              • H04H60/07 Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information characterised by processes or methods for the generation
            • H04H60/29 Arrangements for monitoring broadcast services or broadcast-related services
              • H04H60/33 Arrangements for monitoring the users' behaviour or opinions
            • H04H60/61 Arrangements for services using the result of monitoring, identification or recognition covered by groups H04H60/29-H04H60/54
              • H04H60/66 Arrangements for services using the result of monitoring, identification or recognition covered by groups H04H60/29-H04H60/54 for using the result on distributors' side
            • H04H60/76 Arrangements characterised by transmission systems other than for broadcast, e.g. the Internet
              • H04H60/81 Arrangements characterised by transmission systems other than for broadcast, e.g. the Internet characterised by the transmission system itself
                • H04H60/82 Arrangements characterised by transmission systems other than for broadcast, e.g. the Internet characterised by the transmission system itself the transmission system being the Internet
        • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
          • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
            • H04L65/60 Network streaming of media packets
              • H04L65/61 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
                • H04L65/611 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for multicast or broadcast
                • H04L65/613 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for the control of the source by the destination
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N7/00 Television systems
            • H04N7/08 Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
              • H04N7/087 Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
                • H04N7/088 Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
            • H04N7/16 Analogue secrecy systems; Analogue subscription systems
              • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
                • H04N7/17309 Transmission or handling of upstream communications
                  • H04N7/17318 Direct or substantially direct transmission and handling of requests
                • H04N2007/1739 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal the upstream communication being transmitted via a separate link, e.g. telephone line
          • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
              • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
                • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
                  • H04N21/23412 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs for generating or manipulating the scene composition of objects, e.g. MPEG-4 objects
                • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
              • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
                • H04N21/251 Learning process for intelligent management, e.g. learning user preferences for recommending movies
                  • H04N21/252 Processing of multiple end-users' preferences to derive collaborative data
            • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
              • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
                • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
                  • H04N21/4622 Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
              • H04N21/47 End-user applications
                • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
                  • H04N21/4722 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
                    • H04N21/4725 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
                • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
                  • H04N21/4782 Web browsing, e.g. WebTV

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Television Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention provides a method and system for interactive broadcasting. Video content (15), typically also including an audio component, is transmitted via a communication channel (14). Visible portions of interactive content (16) are also transmitted via this communication channel (14). Non-visible portions of the interactive content (11) are transmitted via a communication channel (12) which may differ from the channel used for the video content (15) and the visible portions of the interactive content (16). A typical system will include collector/dispatcher, control, and rendering subsystems. The collector/dispatcher subsystem (20) may retrieve interactive content as prescribed by scheduling information accessible to the control subsystem and in some cases parse the visible and non-visible portions of the retrieved content. Such retrieved content may be forwarded to the rendering subsystem for transmission along with received video content.

Description

VIRTUAL HYBRID INTERACTIVE MULTICASTING SYSTEM AND METHOD
CROSS-REFERENCE TO RELATED PATENT APPLICATION This application claims the benefit, pursuant to 35 U.S.C. § 119(e), of applicant's provisional U.S. Patent Applications Serial No. 60/150,214, filed August 23, 1999, entitled "Virtual Hybrid Interactive Multicasting," Serial No. 60/195,027, filed April 6, 2000, entitled "Methods and Systems for Engaging in Commerce Using Hybrid Interactive Multicasting Methods and Products, and Related Methods and Products," and Serial No. 60/195,054, filed April 6, 2000, entitled "Virtual Hybrid Interactive Broadcasting." These provisional applications are incorporated by reference herein in their entirety.
BACKGROUND OF INVENTION 1. FIELD OF INVENTION
The present invention relates to a system and method for broadcasting video content and interactive content to user interface terminals, such as television set-top boxes, personal computers, televisions, and other terminals. More particularly, the invention relates to a system and method for broadcasting to a user interface terminal video content and visible portions of interactive content via the same communications channel while supplying the non-visible portions of the interactive content via either the same or an alternative communications channel.
2. DESCRIPTION OF RELATED ART The Internet is a global network of connected computer networks. Over the last several years, the Internet has grown significantly. A large number of computers on the Internet provide information in various forms. Anyone with a computer connected to the Internet can potentially tap into this vast pool of information. The most widespread method of providing information over the Internet is via the World Wide Web (the Web). The Web consists of a subset of the computers connected to the Internet; the computers in this subset run Hypertext Transfer Protocol (HTTP) servers (Web servers). The information available via the Internet also encompasses information available via other types of information servers such as GOPHER and FTP.
Information on the Internet can be accessed through the use of a Uniform Resource Locator (URL). A URL uniquely specifies the location of a particular piece of information on the Internet. A URL will typically be composed of several components. The first component typically designates the protocol by which the addressed piece of information is accessed (e.g., HTTP, GOPHER, etc.). This first component is separated from the remainder of the URL by a colon (':'). The remainder of the URL will depend upon the protocol component. Typically, the remainder designates a computer on the Internet by name, or by IP number, as well as a more specific designation of the location of the resource on the designated computer. For instance, a typical URL for an HTTP resource might be: http://www.server.com/dir1/dir2/resource.htm where http is the protocol, www.server.com is the designated computer and /dir1/dir2/resource.htm designates the location of the resource on the designated computer.
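For readers more comfortable with code than prose, the URL anatomy described above can be restated with a short, purely illustrative Python sketch (not part of the original disclosure); it simply decomposes the example URL into the components named in the preceding paragraph.

    from urllib.parse import urlparse

    # Example URL from the discussion above.
    url = "http://www.server.com/dir1/dir2/resource.htm"

    parts = urlparse(url)
    print(parts.scheme)   # "http" -> the protocol component before the colon
    print(parts.netloc)   # "www.server.com" -> the designated computer
    print(parts.path)     # "/dir1/dir2/resource.htm" -> location of the resource on that computer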
Web servers host information in the form of Web pages; collectively the server and the information hosted are referred to as a Web site. A significant number of Web pages are encoded using the Hypertext Markup Language (HTML), although other encodings using the eXtensible Markup Language (XML) or the Standard Generalized Markup Language (SGML) are becoming increasingly common. The published specifications for these languages are incorporated by reference herein. Web pages in these formatting languages may include links to other Web pages on the same Web site or on another. As will be known to those skilled in the art, Web pages may be generated dynamically by a server by integrating a variety of elements into a formatted page prior to transmission to a Web client. Web servers and information servers of other types await requests from Internet clients for the information that they host.
Client software has evolved that allows users of computers connected to the Internet to access this information. Advanced clients such as Netscape's Navigator and Microsoft's Internet Explorer allow users to access information provided via a variety of information servers in a unified client environment. Typically, such client software is referred to as browser software.
The commercialization of the Internet, most notably the Web aspect of the Internet, and the resulting successes have made the provision of interactive content in various media more important and desirable. For example, it is highly desirable to provide interactive content to cable and wireless television customers. An example of the provision of interactive content is the provision of Web-based content, such as all or part of a Web page.
Broadcasters of television and other video content encounter many complications in transmitting interactive content to multiple cable or wireless customers who use varying home communications terminals, e.g., set-top boxes. One of those complications is the relatively low bandwidth generally available to transmit Web content to customers via the Internet, thus limiting the amount and type of Web content sent to customers. Another complication is the diversity of the set-top box market in that various customers use set-top boxes of differing manufacturers that are not generally compatible. Examples of set-top boxes include those provided by WebTV, Open TV, and Power TV. Another complication is that set-top box providers generally do not have direct access to the customer through a cable operator's relatively high-bandwidth connection to the customer. Another complication with conventional systems is that applications must be written to support multiple types of set-top boxes. Moreover, in conventional systems, it is often prohibitively expensive to provide interactive elements in a single program because of the need to reprogram the interactive portion due to date sensitivity. Moreover, in conventional systems, there is no generally compatible, centrally controlled system that allows for effective management of interactive content on a limited-time basis (e.g., minute to minute).
A conventional approach to interactive television is to transmit traditional (e.g., video) content via the video path while delivering Web graphics and interactivity commands via the Internet path. These two paths (or "pipes") are between the broadcast source and the viewer (e.g., to the viewer's set-top box). The Internet pipe in a conventional approach carries Web graphics components and interactive components (or interactivity commands, e.g., "hot spots"), and the video pipe carries video content in digital or analog format. The Web graphics component comprises commands regarding matters that the user sees - images, text, and input controls (e.g., buttons). The interactive component is invisible to the user but controls interactive elements of the display, such as what areas of the screen are sensitive to mouse clicks (called "hot spots"). In a conventional approach, Web graphics and Web interactive commands are sent to a customer's set-top box via the Internet pipe. The Web graphics component can be very large, especially when high-quality graphics are used. Because the discussed conventional approach employs the Internet path to transmit such graphics components, and such a path is normally of low bandwidth, a system using this conventional approach is not robust and cannot effectively support high-volume, dynamic, high-quality transmissions. Another complication with such conventional approaches is that adding interactivity to video is complicated.
Various other approaches to providing interactivity and Web content using television are disclosed in United States patents. For example, in one known system, a computer-based system receives a video program and uniform resource locators (URLs), and then retrieves Web content from the Web pages indicated by the URLs received. Another known system includes equipment for inserting associated data (in a format such as HTML) into the vertical blanking interval of a television signal on the supplier side of the system. A further known system downloads data (e.g., Internet data) during off-peak hours on a low bandwidth communications channel. In yet another known system, hardware includes a head end server, upstream server, and blanking interval inserter associated with head end distribution equipment. A still further example is United States Patent No. 6,018,764 to Field et al., the abstract of which states in part as follows: "Web pages and other Internet information resources are retrieved from a one-way broadcast signal such as a broadcast television signal. A user selects from a range of available information, including ... HTML ... pages, which is carried in the broadcast stream by invoking a command which is defined according to a ... URL ... format. Table mapping data is carried in the broadcast signal and provides a broadcast address corresponding to the URL ...." Such systems have several of the complications described above, in addition to other complications. SUMMARY OF THE INVENTION The present invention is directed to a system and method for broadcasting video content and interactive content to a user terminal. As an example, a summary of one embodiment of the present invention is as follows: Traditional content (e.g., video) and digital graphic elements (e.g., Web graphics) are transmitted to viewers (e.g., to their set-top boxes or other suitable terminals) via a first communications channel, which in some embodiments may be referred to as the video pipe, and interactivity commands (e.g., Web interactive components) are transmitted via a second communications channel, which in some embodiments may be referred to as the Internet pipe. Examples of the Internet pipe include a dial-up Internet connection, a digital subscriber line, a cable-system Internet service, or other connection to a computer network such as the Internet. In some embodiments, the second communications channel may be the same as the first communications channel. Embodiments of the present invention provide several advantages. One advantage is that the amount of content that must be transferred over the second communications channel or in the vertical blanking interval of a video signal can be reduced, sometimes drastically. Another advantage is that embodiments can provide much greater quality of content and far more complex and powerful "hot spots."
Moreover, embodiments of the present invention can be installed in any existing broadcast operation, analog or digital, in order to implement interactivity. Embodiments can provide interactive functionality to existing broadcasts by providing the broadcaster with the ability to add input graphic images and to associate input-only interactive elements with those controls via, for example, Web pages. Embodiments are versatile in that they may be installed in any home, on any equipment which has WebTV, a personal computer, a simple set-top box, or a similar device, and a connection to the channel providing the non-visible interactive components, such as a connection to a computer network such as the Internet. Another advantage is that in a system including an embodiment of the present invention, the control graphics that are usually drawn by the Web device at the user terminal are drawn by the system. Thus there is no wait time for screen refreshes, since the visible portions of the interactive content, the graphics controls, are not sent via the second, oftentimes lower-bandwidth, communications channel.
Another advantage is that bandwidth is conserved since, in embodiments, no graphics are transmitted via the second communications channel, which may also be referred to in some embodiments as the data pipe (e.g., an Internet pipe, VBI, MPEG3 carrier, an intranet, or any lossless data carrier that can be reasonably synchronized with the video broadcast). Moreover, graphics are much more dynamic since embodiments do not depend on relatively low-bandwidth pipes to transmit graphics. A further advantage is that, in embodiments of the present invention, the addition of interactivity to video is less complicated than in conventional approaches. Another advantage is that an on-line editor may allow real-time and dynamic creation of interactive overlays for video content.
Still another advantage is that embodiments of the present invention are compatible with current and emerging set-top box standards and implementations. A still further advantage is that embodiments of the present invention provide for localization of interactive content, allowing local broadcasters to provide nationally broadcast video content along with interactive content linked to geographically-local service and product providers. Another advantage of such localization is that local broadcasters may seek partnerships and other business relationships with local businesses and provide nationally-produced video content in association with such relationships.
Additional advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed. BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate one or more embodiments of the invention and together with the description, serve to explain the principles of the invention. FIG. 1 shows a graphical depiction of an embodiment of the present invention, including showing a broadcast pipe having a broadcast data pipe and a broadcast video pipe.
FIG. 2 shows a graphical depiction of an embodiment of the present invention, including showing the broadcast of Web interactive components via an Internet pipe and the broadcast of video and Web graphics components via a video pipe.
FIG. 3 shows a graphical depiction of a cable television system according to the present invention.
FIG. 4 is a more detailed graphical depiction of the local broadcaster 10 of an embodiment as displayed in FIG. 3. FIGs. 5-7 are flow charts depicting typical embodiments of delivery processes according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION A preferred embodiment of the invention is now described in detail. Referring to the drawings, like numbers indicate like parts throughout the views. As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise. Ranges may be expressed herein as from "about" one particular value, and/or to "about" another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent "about," it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint. With reference to FIG. 1, a summary of one embodiment of the present invention is as follows: A broadcaster 10 is in communication with a broadcast pipe 1. The broadcast pipe 1 includes a broadcast data pipe 2 and a broadcast video pipe 3. The broadcast data pipe 2 shown transmits lossless or lossy data, e.g., interactive command components 4. The broadcast video pipe shown transmits data, including video broadcast 5 and interactive graphics components 6. Examples of such data include VBI, customer data, MPEG data, and other lossless data. The broadcast pipe 1 (including the broadcast data pipe 2 and the broadcast video pipe 3) is connected to an end-user set-top box 8 and television 9 (the set-top box 8 is in communication with the television 9). The set-top box is in communication with a third pipe (or channel), the back channel 7, which is discussed further below. The interactive graphics component 6 and the video component 5 are transmitted via the broadcast video pipe 3, while the interactive command components 4 are transmitted via the broadcast data pipe 2. The set-top box 8 is instructed by the interactive command components 4 as to the locations and sizes of areas of interactivity and the target of that interactivity (e.g., hyperlinks) should the user select the interactive area, or "hot spot." In the embodiment shown, upon selection of a "hot spot," the user establishes a separate, two-way interactive session with a third channel 7, called the back channel. The back channel 7 may comprise a connection to a computer network such as an Ethernet or the Internet, or another communications channel. In an embodiment, the broadcast data pipe (e.g., an Internet channel) may be used as a broadcast pipe when the broadcast video pipe 3 cannot or will not accommodate lossless data transmissions.
The broadcast pipe 1 includes at least two sub-pipes: a broadcast data pipe 2 and a broadcast video pipe 3. The broadcast data pipe 2 typically contains lossless or lossy graphics while the broadcast video pipe 3 typically contains lossless data such as VBI, custom data and MPEG data.
The broadcast video pipe 3 may include sub-pipes for video 5 and for visible components of interactive graphics 6; the video sub-pipe 5 transmits traditional video content from broadcaster 10, while the graphics sub-pipe 6 carries visible portions of interactive graphics, which may in certain embodiments be visible components of Web pages. The broadcast video pipe 3 may include additional sub-pipes, such as for one or more audio channels or for additional streams of video or graphics.
The data pipe 2 may include sub-pipes as well, such as an interactive command components sub-pipe 4. In one embodiment, the interactive command components sub-pipe 4 may transmit interactive command elements corresponding to the interactive graphics components transmitted via the graphics sub-pipe 6. In one embodiment, the interactive commands may be the non-visible portion of Web pages. These interactive commands may be transmitted as an image map or in another suitable interactivity designation format such as XML, SGML, etc. The set-top box 8 typically receives the transmission from the broadcast data pipe 2. This transmission instructs the set-top box 8 as to the locations, sizes and shapes of areas of interactivity to be synchronized with the transmissions via the broadcast video pipe 3, and the appropriate response if a user triggers such an area. In other embodiments, a suitably equipped television or other appropriate hardware may receive the transmission from the broadcast data pipe 2.
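As a purely illustrative sketch (the disclosure leaves the wire format open, naming image maps, XML and SGML as options), the following Python fragment shows how a hot-spot descriptor of the kind described above might be expressed in XML and read at a receiving terminal. The element and attribute names are invented for illustration and are not taken from the disclosure.

    import xml.etree.ElementTree as ET

    # Hypothetical hot-spot descriptor as it might travel over the broadcast
    # data pipe 2; element and attribute names are illustrative only.
    trigger_xml = """
    <interactivity start="2000-08-23T20:15:00" duration="10">
      <hotspot shape="rect" x="40" y="360" width="120" height="48"
               href="http://www.example.com/order" label="Order now"/>
    </interactivity>
    """

    root = ET.fromstring(trigger_xml)
    for spot in root.findall("hotspot"):
        # The set-top box would register this region as clickable and remember
        # the hyperlink to launch via the back channel when it is selected.
        region = {k: spot.get(k) for k in ("shape", "x", "y", "width", "height")}
        print("hot spot", region, "->", spot.get("href"))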
The television 9 typically receives the transmission from the broadcast video pipe 3 as shown; however, in other embodiments, a set-top box 8 or other suitable hardware may receive the transmission and forward it as received or in some modified format to the television 9. The transmission received via the sub-pipes of the broadcast video pipe 3 may be integrated upon arrival at the television 9, or at the broadcaster 10. Upon indication that a user has triggered an area of interactivity, a two-way connection is established via back channel 7. The back channel 7 may be a communication back to broadcaster 10 via a dedicated line, the Internet or other suitable communications channel. Alternatively, or in addition to communicating back to the broadcaster 10, the back channel may be a connection to the Internet, an Intranet or other connection to an appropriate data repository having two-way interactivity facilities.
Examples of the broadcast data pipe include an Internet pipe, VBI, MPEG3 carrier, an Ethernet and an Intranet. A data pipe may be any lossless data carrier that can be reasonably synchronized with video broadcast. Embodiments of the present invention are described further below as examples. In the examples below with reference to FIGs. 1 and 2, the broadcast data pipe 2 comprises an Internet pipe 12. Those of ordinary skill in the art will recognize that an Internet pipe 12 is shown as the broadcast data pipe 2 by way of example, and that, as described above, other pipes (or channels) may be used as the broadcast data pipe 2 in accordance with the present invention. Thus, the descriptions below will use the Internet pipe 12 to describe embodiments, but those of ordinary skill in the art will recognize that other types of pipes (or channels) may be used.
Thus, by way of example, a summary of one embodiment of the present invention is as follows: Traditional content (e.g., video) and Web graphics are transmitted to viewers (e.g., to their set-top boxes) via the video pipe, and interactivity commands are transmitted via the Internet pipe. Examples of the Internet pipe include a dial-up Internet connection, a digital subscriber line, a cable-system Internet service, or other connection to the Internet. Viewers view the video and Web graphics on their television, and can point and click on graphics associated with hot spots to interact with associated content (e.g., a Web page) using their television.
It is useful first to provide a basic overview of FIG. 2 and FIG. 3. Referring to FIG. 2, an embodiment of the present invention is shown. The broadcasting hardware of a local cable television broadcaster 10 is connected to an Internet pipe 12 and a video pipe 14. The Internet pipe 12 is connected to a set-top box 20 at an end-user's location, and the video pipe 14 is connected to a television 22 at the end-user's location. The video pipe 14 may be connected to the television 22 directly, through the set-top box, or through other hardware. One such arrangement is shown in FIG. 2 as an example. In other embodiments, multiple end-users are connected in this manner to the broadcaster hardware 10.
In the embodiment shown in FIG. 2, the broadcaster sends the Web interactive component 11 (e.g., the image map of a Web page comprising, for example, HTML, XML or other suitably formatted code that instructs a device that a button or picture displayed constitutes a "hot spot") to the set-top box using the Internet pipe 12. The broadcaster 10 sends video content 15 (e.g., a television broadcast signal) and Web graphics component 16 (e.g., the graphics portion of a Web page) to the end-user's television using the video pipe 14. In an embodiment, the set-top box 20 is in communication with the local broadcaster 10 via the Internet (a back channel 7 as shown in FIG. 1; not shown in FIG. 2).
FIGs. 5-7 depict process diagrams exemplifying several embodiments of processes according to the present invention. Referring to FIG. 5, video programming, or other traditional broadcast content, is received in step 510 from an appropriate source such as a national (intermediate) broadcaster (or other broadcaster upstream in the broadcast chain), a live video feed, a computer network such as the Internet, a video recording play-back device, a content server system or other equivalent source. Digital resources are received 520 from a source, which may in certain embodiments be the same source as for the received video programming, or from other sources such as a bus connection, a local or remote data store, a wireless connection, a dial-up connection, a direct line connection or equivalent source. The digital resources may be received as the result of transmitting a request for them. The digital resources will typically include both visible components and non-visible components; however, in certain instances, the digital resources may include only visible or only non-visible components. Those of skill in the art will recognize that visible components are not limited to those components that, once rendered, would be seen by a user through a user terminal; visible as used herein encompasses components that, when transmitted, are rendered at the user terminal in a form perceptible to a user. For example, visible components may include audio components, tactile components, olfactory components, etc. that may be perceived by a user through appropriate user terminal hardware and software. Non-visible components will typically designate interactivity and commands related to the visible components. In some embodiments, such as seen in FIG. 6, the digital resources arrive 520 with the visible and non-visible components already segregated, such as in steps 610 and 620. In other embodiments, the digital resources may be received as combined visible and non-visible components, or in some mixture of segregated and combined formats, in which case a parsing step 530 may be necessary to segregate the components. In some embodiments, the received video programming and/or digital resources may be displaced in whole or in part, as seen in step 710 of FIG. 7. For instance, a downstream broadcaster may wish to supplant both the video programming and the digital resources supplied by an upstream broadcaster; for example, a local broadcaster might wish to supply a completely different broadcast than supplied by a national broadcaster for a particular time slot (e.g., a local news segment with suitable local digital resources). Alternatively, a downstream broadcaster might only wish to supplant the digital resources supplied by an upstream broadcaster, or just the visible or non-visible portions of the upstream-supplied digital resources. For example, a local broadcaster might want to supplant national broadcaster digital resources with alternative digital resources such as local advertising. Or, a local broadcaster might want to supplant the non-visible components of digital resources such as national advertising supplied by a national broadcaster with interactive elements linking to information on local affiliates of the sponsor of the national advertising. Substitution of programming may also result from the selection of a particular user terminal as described more fully below; in this case, the substitution would be more in the nature of a personalization of the broadcast to a particular user terminal, or set of user terminals.
In some embodiments, as seen in FIG. 5, the visible components of the digital resources may be combined with the video programming into an integrated signal 540. This approach reduces the amount of work that the user terminal is required to perform by supplying a combined signal for display. However, such an approach may be less suitable in embodiments where the "user terminal" serving as the destination of the broadcast is a downstream broadcaster who intends to make a substitution for the visible components of the digital resources, or in embodiments where the user terminal performs its own substitution and integration. In such cases, a combination step may not be performed, as seen in FIG. 6.
In all cases, the video programming and the visible components of the digital resources are transmitted to a user terminal via a first communication channel 550. Typically, this communication channel will be a high bandwidth channel as described in greater detail above with respect to FIGs. 1-2. The first communication channel may in various embodiments be an over air transmission, a cable television network, a satellite broadcast network, a direct line connection, a computer network such as the Internet or other suitable channel as would be known to those of ordinary skill in the art.
The non-visible components, typically including interactivity elements and commands, are transmitted to the user terminal via a second communication channel 560, which in certain embodiments may be lower in bandwidth than the first communication channel. In some embodiments, the same communication channel may be used as the pipe for transmitting the video programming and visible components and the pipe for transmitting the non-visible components. The second communication channel may in various embodiments be an over air transmission, a cable television network, a satellite broadcast network, a direct line connection, a dial-up connection, a wireless connection, a bus connection, a computer network such as the Internet or other suitable channel as would be known to those of ordinary skill in the art.
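The following Python sketch, not part of the original disclosure, restates the FIG. 5 flow just described in code form: digital resources are parsed into visible and non-visible components, the visible components are combined with the video programming, and the two resulting payloads go out over the first and second communication channels. Every function and class here is a placeholder standing in for broadcaster equipment, not a real API.

    # Minimal sketch of the FIG. 5 delivery flow; all names are illustrative.

    def parse_digital_resources(resources):
        """Step 530: split retrieved resources into visible and non-visible parts."""
        visible = [r for r in resources if r.get("visible")]
        non_visible = [r for r in resources if not r.get("visible")]
        return visible, non_visible

    class Channel:
        def __init__(self, name):
            self.name = name
        def send(self, payload):
            print(f"{self.name} <- {payload}")

    def deliver(video_frames, resources, video_channel, data_channel):
        visible, non_visible = parse_digital_resources(resources)      # step 530
        integrated = {"video": video_frames, "graphics": visible}      # step 540
        video_channel.send(integrated)                                 # step 550
        data_channel.send(non_visible)                                 # step 560

    deliver(
        video_frames=["frame-1", "frame-2"],                           # step 510
        resources=[{"id": "button.gif", "visible": True},              # step 520
                   {"id": "hotspot-map", "visible": False}],
        video_channel=Channel("first channel (video pipe)"),
        data_channel=Channel("second channel (data pipe)"),
    )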
In some embodiments, the transmissions, and possibly the content combination in embodiments utilizing such a step, may be coordinated based upon received scheduling information including time and placement information indicating when and where the digital resources are to be presented with respect to the video programming. Receipt of the scheduling information may result from the reception of either the video programming signal, the digital resources or both; alternatively, receipt of the scheduling information may result in the receipt of the video programming, the digital resources or both. In the former situation, the video programming and/or the digital resources may be received, and a request for scheduling information associated with the received video programming may be transmitted. In the latter situation, the receipt of the scheduling information may trigger a request for the video programming and/or the digital resources. Different scheduling information may exist for each separate potential combination of video programming, digital resources and user terminal (or user terminal set including all user terminals meeting particular criteria). The scheduling information may be received via any suitable source including a bus connection, a computer network such as the Internet, a dial-up connection, a cable television network, a direct line connection, a wireless connection, a satellite broadcast network or a local or remote data store. In some embodiments, a particular user terminal may be selected as the recipient for particular video programming and/or digital resources. Alternatively, the video programming and/or digital resources may be selected, and in some embodiments requested, according to the selected user terminal. The selection process may involve a matching process between profile information associated with the user terminal and profile information associated with the video programming and/or the digital resources. In a programming on-demand environment, a user terminal may be selected on the basis of the receipt of a request from the user terminal for programming.
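As a brief illustrative sketch of the profile-matching selection mentioned above, and assuming purely for illustration profiles with region and interest fields (which the disclosure does not specify), a broadcaster might select recipient terminals as follows:

    # Illustrative only; profile fields and terminal identifiers are invented.
    terminal_profiles = {
        "stb-001": {"region": "Atlanta", "interests": {"jewelry", "sports"}},
        "stb-002": {"region": "Seattle", "interests": {"news"}},
    }

    resource_profile = {"region": "Atlanta", "interests": {"jewelry"}}

    def matching_terminals(terminals, resource):
        """Yield terminals whose profile overlaps the profile of the digital resources."""
        for terminal_id, profile in terminals.items():
            if (profile["region"] == resource["region"]
                    and profile["interests"] & resource["interests"]):
                yield terminal_id

    print(list(matching_terminals(terminal_profiles, resource_profile)))
    # -> ['stb-001']; only the matching terminal set receives these digital resources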
In some embodiments, a data store may be utilized either as a permanent or temporary storage location for the above-described data including video programming, digital resources, and schedule and profile information and/or data. In different embodiments, the data store may be either primary or secondary storage, or in some instances a combination of both.
A variety of systems may be utilized to perform the methods according to the present invention. Exemplary system embodiments are described in greater detail in the foregoing discussion.
Referring to FIG. 3, another depiction of the embodiment of FIG. 2 is shown, in addition to further aspects of an embodiment of the present invention. In FIG. 3, a national broadcaster (e.g., MSNBC) is shown in communication with the local broadcaster 10. The hardware at the local broadcaster 10 includes a control process system 30, a rendering process system 32, and a collector / dispatcher process system 34. The collector / dispatcher system 34 is connected to the Internet (not shown). The local broadcaster 10 is connected to multiple end-users via a cable television network 42. The network 42 shown includes both an Internet pipe and a video pipe as in FIG. 2. In other embodiments, the Internet pipe may be via the Internet itself or via another connection. Also, although the three subsystems 30, 32, 34 are shown as separate structures in FIG. 3, the subsystems may be provided using a different arrangement, such as a single computer.
The set-top box and television of one end-user (or customer) 20 are shown. Also, a node 40 in the network 42 is shown. The node 40 connects the local broadcaster 10 to multiple end-users, including the end-user shown 20. Similarly, the national broadcaster 8 is shown as connecting to multiple local broadcasters, including the one shown 10.
Embodiments of the present invention include methods and systems that allow broadcasters, such as the local broadcaster 10 shown in FIG. 2 and FIG. 3, to provide interactive graphics in a broadcast, such as a television broadcast or a broadcast of other video programming, in an automated manner from a central location. The Web graphics component 16 and the video component 15 are sent via one pipe (or channel), called the video pipe 14. The Web graphics component 16 sent via the video pipe 14 may be included in the vertical blanking interval (VBI) if desired, or sent using an MPEG3 data carrier or other carrier or format. The component may be sent outside of the VBI if desired. The Web interactive component 11 is sent via another pipe, the Internet pipe 12. The Internet pipe 12 may be part of the cable system 42 or, in another embodiment, may be via the Internet itself.
The broadcast of the various components 11, 15, 16 is coordinated by the broadcaster 10. The broadcaster's transmission of the Web interactive component 11, video component 15, and Web graphics component 16 may be to all end-users in the cable network 42, or to selected ones of the users, as will be described further herein. Those of ordinary skill in the art will recognize that a cable network is just one arrangement in which embodiments of the present invention may be used. Satellite broadcast networks and other networks may also incorporate the present invention.
The subsystems 30, 32, 34 at the broadcaster 10 comprise a typical system. Any of these subsystems may in certain embodiments be ultimately responsible for transmitting the video programming, the visible components of the digital resources and/or the non-visible components of the digital resources. The control subsystem 30 generally supports managing scheduling information related to positioning, timing and controlling transmissions of the received video signal and the received digital resources to the user terminal. The rendering subsystem 32 generally supports managing the rendering of transmissions to the user terminal. The dispatcher/collector subsystem 34 generally supports accumulating digital resources and video programming for transmission to the user terminal. The subsystems 30, 32, 34 interface with the cable television network via the cable head end. FIG. 4 provides a more detailed view of one example of broadcaster system 10.
The control process system 30 provides the Web interactive component 11, comprising triggers, or commands, to the set-top box 20 using the broadcast data pipe, which, in the embodiment shown, comprises the Internet pipe 12. Examples of triggers, or commands, include commands relating to providing links or "hot spots" at designated locations on the television screen and in designated formats (e.g., rectangle), commands associating such hot spots with designated hyperlinks, commands relating to loading and displaying designated Web pages or graphics, and similar commands. The commands may comprise an image map, which provides instructions to the set-top box for providing links or "hot spots" on the television screen for interaction with the end user. These commands may be sent by the control process system 30 via multiple Internet pipes (not shown). For example, some set-top boxes in the system may receive such commands in the Internet channel in the cable system, while others may receive such commands via the Internet itself (e.g., via a dial-up Internet connection connecting the set-top box to the Internet). The commands may use various standards. For example, with an analog communication, the ATVEF standard, an enhanced HTML standard, or other analog-usable standards may be used. With a digital communication, the MPEG2 data carrier standard or another standard may be used. Those of ordinary skill in the art will recognize that a variety of standards may be used in communicating the triggers, or commands, to the set-top boxes in accordance with the present invention. Preferably, such triggers, or commands, are communicated to the set-top box via a path that may be reasonably synchronized with the video pipe broadcast, such as the broadcast data pipe (e.g., the Internet pipe 12).
Preferably, one set of coordinating commands is provided by the administration of the broadcaster to the control process system 30, which converts the coordinating commands provided to the appropriate communication standard for the broadcast of triggers to the set-top boxes. The coordinating commands are provided using a graphical user interface (GUI) or on-line editor associated with the control process system 30 at the local broadcaster 10, which may be located at the cable head end. Such a GUI provides real-time creation of overlays using graphics. An XML structure may be used to support a touch screen displaying groups of interactive graphics (icons) on one side of the GUI screen and a scheduling block on the other side of the screen. The icons are associated with particular Web graphics that may be broadcast to the end users (e.g., the end user 20 shown in FIG. 2 and FIG. 3). The administrator at the broadcaster 10 may point and drag the interactive icons to the date and time on the scheduling block side of the screen to indicate that the graphics associated with that icon should be broadcast to the end users at that particular time. The control process system 30 generates an XML file, comprising a schedule file. The control process system 30, using software, reads the XML file and sends the appropriate command triggers to the end users at the appropriate time. Such triggers may be broadcast by the broadcaster 10 using multiple standards to accommodate various set-top boxes. These broadcasts of triggers are coordinated with broadcasts of video and Web graphics components. Schedule information in the form of XML files or other suitable format may be stored in a data store specifically provided for such use, such as the data store 410 depicted in FIG. 4. In other embodiments, the schedule information may be stored remotely, for example, at a location controlled by a national broadcaster 8 or a digital resource provider 420 (a provider of Web content). In such embodiments, the local broadcaster 10 would have access to such schedule information via a suitable communications architecture such as a cable television network, a direct dedicated connection or a computer network. In one embodiment, the schedule information would be accessible via the Internet 400.
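Purely as an illustration of the schedule file described above, the following sketch shows one possible shape such an XML file could take and how it might be read back; the tag and attribute names are assumptions, since the disclosure states only that an XML schedule file is generated and then read by the control process system.

    import xml.etree.ElementTree as ET

    # Hypothetical shape of the schedule file the control process system 30 might
    # generate from the GUI; names and values are illustrative only.
    schedule_xml = """
    <schedule>
      <event time="2000-08-23T20:15:00" graphic="jewelry_buttons.gif"
             trigger="hotspot-map-17" duration="10"/>
      <event time="2000-08-23T20:15:10" graphic="ring_closeup.gif"
             trigger="hotspot-map-18" duration="10"/>
    </schedule>
    """

    events = [
        {"time": e.get("time"),
         "graphic": e.get("graphic"),
         "trigger": e.get("trigger"),
         "duration": int(e.get("duration"))}
        for e in ET.fromstring(schedule_xml).findall("event")
    ]

    for event in sorted(events, key=lambda e: e["time"]):
        # At each scheduled time the control process would broadcast the trigger
        # via the data pipe while the associated graphic goes out via the video pipe.
        print(event["time"], "->", event["graphic"], "+", event["trigger"])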
The rendering process system 32 (also called the interactive elements rendering system) receives the triggers and coordinates the rendering of the Web graphics with the Web graphics components. Note that in the embodiment shown, the control process system 30 and the rendering process system 32 are shown separately, but in other embodiments they may comprise, for example, a single computer having the software needed to carry out the functions of both subsystems 30, 32. The rendering process system 32 accesses the event schedule as well, and provides instructions to the end user's set-top box regarding the size, shape, and placement of hot spots on the user's television screen and the hyperlinks associated with the hot spots. The rendering process system 32 also provides commands to the set-top box as to the length of time (or duration) for which designated Web graphics and Web interactivity should be displayed and provided on the end user's screen (for example, a particular graphic and its associated hyperlink should be provided for 10 seconds because the video content associated with that graphic and hyperlink will be displayed for 10 seconds).
This schedule file reflects a schedule of events. In the embodiment shown, an event is associated with a time code having year, month, day, hour, minute, and second fields. When the time reflected in a time code occurs, the rendering process system 32 carries out the event associated with that time code in the schedule file. After an event associated with a time code is carried out, the system moves the event pointer to the next scheduled event, and carries out any tasks necessary to prepare for that event. Moreover, independent of time coding, if an event command is received from a system user or from a separate system or subsystem through a command-enabled interface (e.g., from the national broadcaster 8), the event reflected by that event command is carried out by the system. Such a command may be associated with video content, Web graphics, and Web interactive components provided by the system user or the separate system or subsystem.
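A minimal sketch of the event-pointer behaviour just described, assuming an invented EventScheduler class; carry_out() stands in for the actual rendering and broadcast actions and is not an API drawn from the disclosure.

    from datetime import datetime

    def carry_out(event):
        # Placeholder for the rendering/broadcast work performed per event.
        print("carrying out:", event)

    class EventScheduler:
        def __init__(self, events):
            # events: list of (time_code, description) tuples, ordered by time code
            self.queue = sorted(events)
            self.pointer = 0  # index of the next scheduled event

        def tick(self, now):
            # Carry out every event whose time code has been reached, then move
            # the event pointer to the next scheduled event.
            while self.pointer < len(self.queue) and self.queue[self.pointer][0] <= now:
                carry_out(self.queue[self.pointer][1])
                self.pointer += 1

        def external_command(self, event):
            # An event command received through a command-enabled interface is
            # carried out immediately, independent of the time coding.
            carry_out(event)

    sched = EventScheduler([
        (datetime(2000, 8, 23, 20, 15, 0), "show jewelry hot spots"),
        (datetime(2000, 8, 23, 20, 15, 10), "show ring close-up hot spots"),
    ])
    sched.tick(datetime(2000, 8, 23, 20, 15, 5))
    sched.external_command("override from upstream broadcaster")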
The event schedule may include an associated list of Web pages to be transmitted at specific times during that broadcast. The Web pages may be created at the local broadcaster location 10, downloaded from the Internet, or obtained from another source. The event schedule coordinates the broadcasting of the Web pages with segments of the video transmission. For example, if a home shopping network were broadcasting three like items, e.g., a jewelry set, Web graphics showing, for example, three buttons and other graphics related to the items could be shown on the screen, i.e., dynamically generated. As an example, such graphics could be transmitted via the broadcast pipe (video pipe). The user could then press the button that interested them and would be taken to the site for further information and ordering. The video content may be received from the national broadcaster 8, created and provided by the local broadcaster 10, or obtained from some other source and broadcast by the local broadcaster 10. One of ordinary skill in the art will recognize that, in an embodiment, each such product may be associated with data in a database, such as a database associated with an event schedule.
An example of an event is the broadcasting of Web graphics components, video content, and Web interactive components that, when combined and displayed at the end user, provide a display of video content and Web graphics with which the end user can interact by activating "hot spots" or hyperlinks associated with the Web graphics displayed. Examples of such displays include interactive product advertisements, interactive video magazines, interactive real estate advertisements, and many other applications. The hot spots are coordinated with graphics so that the user sees graphics in the area of the hot spot. By providing hot spots and associated hyperlinks with each hot spot, the system allows the end user to "click on" or otherwise activate the area as desired by the user, and prompted by the user's view of the graphics associated with the hot spot. The hyperlinks may be associated with another Web page, another device, or otherwise cause some action to take place.
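Purely as an illustration of how a set-top box might resolve such an activation, the following Java sketch checks a click position against the active hot spots and returns the associated hyperlink; it reuses the illustrative HotSpotTrigger class sketched above, and the dispatch of the resulting interactivity command is omitted:

    import java.util.List;

    public class HotSpotResolver {
        // Returns the hyperlink of the first hot spot containing the click, or null if none does.
        public String resolveClick(int clickX, int clickY, List<HotSpotTrigger> activeHotSpots) {
            for (HotSpotTrigger spot : activeHotSpots) {
                boolean inside = clickX >= spot.getX()
                        && clickX < spot.getX() + spot.getWidth()
                        && clickY >= spot.getY()
                        && clickY < spot.getY() + spot.getHeight();
                if (inside) {
                    return spot.getHyperlink();   // interactivity command to launch or send upstream
                }
            }
            return null;                           // click fell outside every hot spot
        }
    }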
Upon activation, the interactivity command associated with the activated area is launched. The launch command may be executed locally at the user terminal in certain embodiments. In other embodiments, the activation may result in the transmission of the interactivity command from the user terminal to the broadcaster 10. The transmission may be via a third communication channel, which in some instances may be the same as the first or second communication channel; typically, the third communication channel may be a bus connection, a computer network such as the Internet, a dial-up connection, a cable television network, a direct line connection, or a wireless connection. At the broadcaster 10, the command may be executed. In many cases, the execution may result in the retrieval of additional digital resources, typically in the form of a Web page. This additional set of digital resources may be communicated back to the user terminal; in certain embodiments, the visible portions of the additional set of digital resources may be transmitted via the first communication channel while the non-visible portions are transmitted via the second communication channel.

The embodiment shown in FIG. 2 and FIG. 3 also includes a collector / dispatcher process system 34. The collector / dispatcher process system 34 receives communication from the national broadcaster 8. Note that the national broadcaster sends communication to many local broadcasters. The collector / dispatcher process system 34 shown is connected to the Internet as well as the cable head end. The collector / dispatcher process system 34 collects Web content (including Web graphics and Web interactive content) and broadcasts that content, along with any video and audio content broadcast, to all connected or all designated end users. This collection and dispatching process is automated by the system 34. A wide variety of applications may be carried out with the assistance of the collector / dispatcher process system 34. Examples include a narrow-cast channel, an advertising channel, an information channel, and a topic-specific cable television magazine. For example, the collector / dispatcher system 34 may access designated Web pages by their uniform resource locators (URLs) at designated times, download those pages to the system 34, and then broadcast those pages at designated times to end users 20. The system 34 may separate the Web graphics content and the Web interactive content of downloaded Web pages, and then broadcast the Web graphics component 16 via the video pipe and the Web interactive component 11 via the Internet pipe. The broadcasting may be carried out by the collector / dispatcher system 34 or handed off to the control process system 30 or the rendering process system 32 (or some combination thereof). Moreover, the collector / dispatcher 34 may collect and dispatch other types of content from other types of sources (besides the Internet).
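A minimal sketch of the collection step follows, assuming a designated page is fetched over the Internet and then handed to hypothetical helpers that separate and dispatch its visible and non-visible components; the helper methods are placeholders for illustration, not a defined API:

    import java.io.InputStream;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class CollectorDispatcher {
        // Downloads a designated Web page and hands its two components to the two pipes.
        public void collectAndDispatch(String pageUrl) throws Exception {
            String webFile;
            try (InputStream in = new URL(pageUrl).openStream()) {
                webFile = new String(in.readAllBytes(), StandardCharsets.UTF_8);
            }
            byte[] webGraphics = renderGraphicsComponent(webFile);        // visible component (e.g., an image of the page)
            String webInteractive = extractInteractiveComponent(webFile); // non-visible component (e.g., an image map with hyperlinks)

            sendViaVideoPipe(webGraphics);         // first communication channel: broadcast / video pipe
            sendViaInternetPipe(webInteractive);   // second communication channel: Internet pipe
        }

        // Placeholder implementations; real rendering, parsing, and broadcasting are outside this sketch.
        private byte[] renderGraphicsComponent(String webFile) { return new byte[0]; }
        private String extractInteractiveComponent(String webFile) { return ""; }
        private void sendViaVideoPipe(byte[] graphics) { }
        private void sendViaInternetPipe(String interactive) { }
    }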
The subsystems 30, 32, 34 shown comprise rack-mount computers, using any of various conventional operating systems, such as Microsoft Windows. The collector / dispatcher system 34 and the rendering process system 32 shown also include a broadcast-quality graphics card (or graphics / movie card). An example is the Targa 3000 made by Pinnacle Systems. Another configuration includes three desktop PCs with appropriate communications facilities for the control path and a graphics card (such as the Targa 3000). Still another configuration includes all three subsystems running on a single computer (e.g., a PC having at least an 8088 or 8086 family processor) having the appropriate communications facilities and an appropriate graphics card. The software needed to carry out the functions described can be programmed in any of various programming languages (e.g., Visual Basic, Java, C, or C++), and those of ordinary skill in the art will recognize the coding necessary.
The embodiment shown in FIG. 2 and FIG. 3 also includes localization functionalities. Localization provides localized Internet or other content to pre-designated geographic areas. An interactive rendering process system is provided at the cable head end (or at the local broadcaster's studio). The national broadcaster 8 broadcasts video content, and the local broadcaster provides the interactive content (Web graphics and Web interactive content). A control signal is sent by the national broadcaster 8 to the local broadcaster 10 advising of the video content and of localized Web content that the local broadcaster 10 may wish to provide to the end users in the area of the local broadcaster (i.e., served by the local broadcaster) in association with the to-be-broadcast video content. For example, the national broadcaster 8 may provide travel-related video content to the local broadcaster 10. In addition to sending the travel-related video content to the end user 20, the local broadcaster 10 sends Web graphics 16 and Web interactive components 11, created (or stored) by the local broadcaster 10 and associated with local travel agencies and other local travel services, to the end user 20 to be displayed on the end user's television along with the video content sent by the national broadcaster. In this way, the end user receives nationally-produced video content and may interact with local, related service providers. For example, the interactive components provided may include hyperlinks to the Web pages of local travel agents. In the embodiment shown, the local broadcaster 10 decides which localized Web-based content to provide with the video content provided by the national broadcaster 8, but in other embodiments, the national broadcaster 8 may dictate or provide the localized interactive content as well. In other embodiments, the national broadcaster 8 may dictate the Web graphic content but allow the local broadcaster 10 to select the desired Web interactive controls and commands to be utilized with the graphic content supplied by the national broadcaster 8.
FIG. 4 depicts one possible arrangement of broadcaster system 10. Control processor 30, rendering processor 32 and dispatcher/collector processor 34 are subsystems within a broadcaster system 10. In other embodiments, the functionality performed by these subsystems may be executed on a single system or distributed across a cluster of suitably programmed systems. The arrangement shown in FIG. 4 is exemplary.
The subsystems or distributed systems performing the desired functionality communicate via a suitable communication architecture such as a bus connection, a direct line connection, a computer network connection (Ethernet, token ring, etc.), interprocess communication, or another similar architecture. In one embodiment, the communication architecture is the Internet 400. The systems performing the subsystem functionality will typically have access to a data store 410; the data store may provide simple storage, as in the case of one or more hard disk drives, or may provide its own access logic, such as a database system architecture. The access may be through any suitable communication architecture such as those described with respect to the communication of the systems supporting the subsystem functionality.
The data store 410 will typically provide temporary and permanent storage for data required to support the subsystem functionality. This data may include video programming, digital resources, schedule information, video programming profile data, digital resources profile data and user profile data.
Embodiments may also undertake hyper-localization. In a cable system having multiple head ends, hardware may be provided at various head ends or nodes that transmits different interactive content to those users connected to differing head ends or nodes.
In another embodiment, information is gathered about the interests and purchasing habits of various cable television users. Afterward, this information is used by the national or local broadcaster to create and provide interactive content tailored to the interests and purchasing habits of those users.
In another embodiment, a national or local broadcaster may broadcast individually-designated data and video streams. The streams may include interactive content for a particular individual or household. The individually-designated data streams may be coded to be sent to, or only read by, the designated individual or household.
In another embodiment, interactive graphics are provided that are associated with various video display elements, e.g., products, shown in the broadcast. The user may click on these interactive graphics and move the graphics on the screen by pointing and clicking. When the interactive graphics are moved by the user, the video display elements move along with the interactive graphics. Also, when selected, the system may display more information about the particular video display element (e.g., about the particular product displayed) for the user.
In still another embodiment, the Web graphics and interactive data may be recorded along with video content in a recording. When the recording is played (e.g., a video tape, DVD, laser disk, etc.), the Web graphics and interactive data are provided to the end user as described above. In this way, the video content, Web graphics, and interactive components may be recorded and used at a later date, rather than using an event schedule system as described above. Applications include purchasing or renting such tapes from a store, and sending to households tapes containing a video catalog with interactive elements. In such an embodiment, by capturing the one-way links as relative hyperlinks in the recorded format's data path (VBI, MPEG data path), the content can be played back at any time through an interactive device. The interactive "hot-spots" with links will be drawn just as if they were transmitted via the broadcast path, since this information was recorded along with the video.

As can be seen, one embodiment of the present invention provides methods for interactive broadcasting comprising sending interactive command components via a broadcast data pipe, sending interactive graphics components via a broadcast video pipe, and sending video via the broadcast video pipe. Also, one embodiment of the present invention comprises sending a Web interactive component in an Internet channel, sending a Web graphics component in a video broadcast channel, and sending a video component in the video broadcast channel. The Web interactive component may be provided, and the Web graphics component may be provided separately from the Web interactive component. Providing the Web interactive component may comprise extracting the Web interactive component from a Web file, or may comprise creating such a component. Likewise, providing the Web graphics component may comprise extracting the Web graphics component from a Web file, or may comprise creating such a component. Sending the Web interactive component in the Internet channel may comprise sending the Web interactive component to a television set-top box. Likewise, sending the Web graphics component in the broadcast channel may comprise sending the Web graphics component to a television set-top box, and sending a video component in the broadcast channel may comprise sending the video component to a television set-top box. The broadcast channel may comprise a television broadcast channel, such as a cable television broadcast channel, a satellite television broadcast channel, or another broadcast channel. Of course, the video content may include an associated audio component in the broadcast channel. Sending the Web interactive component in the Internet channel may comprise sending the Web interactive component from a cable television head end. Also, sending the Web graphics component in the broadcast channel may comprise sending the Web graphics component from a cable television head end, and sending a video component in the broadcast channel may comprise sending the video component from a cable television head end. Triggering signals may also be sent to the television set-top box, wherein the triggering signals comprise commands relating to the location of television interactive areas. Also, synchronizing signals may be sent to the set-top box or other system, wherein the synchronizing signals comprise data relating to the temporal display relationship between the Web graphics component, the Web interactive component, and the video component.
An interactive multicasting broadcast schedule may be provided comprising a temporal schedule for sending the Web interactive component in the Internet channel, sending the Web graphics component in the broadcast channel, and sending the video component in the broadcast channel. Such a schedule may include multiple entries. The Web interactive component may comprise an image map of a Web page.
A method for interactive broadcasting is also provided comprising extracting a Web interactive component and a Web graphics component from a Web file, sending the Web interactive component in an Internet channel, sending the Web graphics component in a broadcast channel, and sending a video component in the broadcast channel.

Methods and systems according to the present invention provide interactive multicasting. The destination for the broadcasts according to the present invention is a user terminal. A typical user terminal will include a television and a set-top box. Other suitable hardware and software may be utilized to receive the broadcast. Such hardware and software will include one or more display devices for rendering the received broadcast and one or more input devices allowing interaction with the received broadcast. The display and input devices may be of any suitable type as will be known to those of ordinary skill in the art. The most typical combination will be a monitor and speakers (or television) allowing visual and auditory display, and a mouse and keyboard for providing input (connected to a set-top box). A user terminal will also require at least one processor (typically within a set-top box) for receiving the broadcast, displaying it appropriately on the display device(s), and monitoring for user inputs via the input device(s).
Throughout this application, various publications may have been referenced. The disclosures of these publications in their entireties are hereby incorporated by reference into this application in order to more fully describe the state of the art to which this invention pertains.
The embodiments described above are given as illustrative examples only. It will be readily appreciated that many deviations may be made from the specific embodiments disclosed in this specification without departing from the invention. Accordingly, the scope of the invention is to be determined by the claims below rather than being limited to the specifically described embodiments above.

Claims

What is claimed is:
1. A method for delivering video programming and integrated, interactive digital resources to a user terminal, the method comprising the steps of: a) receiving a video programming signal; b) receiving a designated set of digital resources; c) transmitting the video programming signal to a user terminal via a first communication channel; d) transmitting a set of visible components based upon the received digital resources to the user terminal via the first communication channel; and e) transmitting a set of non-visible components based upon the received digital resources to the user terminal via a second communication channel.
2. The method of claim 1, and further comprising the step of parsing the designated set of digital resources into a set of visible components and a set of non-visible components.
3. The method of claim 1, and further comprising the step of receiving program scheduling information comprising time and placement information indicating when and where the designated set of digital resources are to be presented with respect to the video programming signal.
4. The method of claim 3, and further comprising the step of coordinating the transmission of the video programming signal, the set of visible components and the set of non-visible components based upon the received program scheduling information.
5. The method of claim 1, wherein the first communication channel and the second communication channel each utilize the same communication channel.
6. The method of claim 1, wherein the first communication channel is selected from the group comprising over air transmission, cable television network, satellite broadcast network, direct line connection, and a computer network.
7. The method of claim 6, wherein the first communication channel is the Internet.
8. The method of claim 1, wherein the second communication channel is selected from the group comprising a bus connection, a computer network, a dial-up connection, a cable television network, a direct line connection, and a wireless connection.
9. The method of claim 8, wherein the second communication channel is the Internet.
10. The method of claim 1, and further comprising the step of receiving an interactivity command from the user terminal triggered by an interaction between an end user and the transmitted video programming, the transmitted set of visible components and the transmitted set of non-visible components as rendered by the user terminal.
11. The method of claim 10, and further comprising the step of executing the received interactivity command.
12. The method of claim 11, and further comprising the step of transmitting results of the executed interactivity command to the user terminal.
13. The method of claim 11, wherein the step of executing the received interactivity command comprises retrieving an additional set of digital resources.
14. The method of claim 13, and further comprising the steps of transmitting visible components of the additional set of digital resources to the user terminal via the first communication channel and transmitting the non-visible components of the additional set of digital resources to the user terminal via the second communication channel.
15. The method of claim 10, wherein the step of receiving the interactivity command comprises receiving the interactivity command via a third communication channel.
16. The method of claim 15, wherein the third communication channel is selected from the group comprising a bus connection, a computer network, a dial-up connection, a cable television network, a direct line connection, and a wireless connection.
17. The method of claim 16, wherein the third communication channel is the Internet.
18. The method of claim 15, wherein the third communication channel is the same as the second communication channel.
19. The method of claim 1, and further comprising the step of selecting the user terminal.
20. The method of claim 19, wherein the step of selecting the user terminal comprises receiving a request for programming from the user terminal.
21. The method of claim 19, and further comprising the steps of receiving program profile information and receiving user profile information and wherein the step of selecting the user terminal comprises matching received user profile information with program profile information.
22. The method of claim 19, and further comprising the step of substituting an alternate set of non-visible components for the set of non-visible components prior to transmission to the user terminal via the second communication channel based upon the user terminal.
23. The method of claim 22, and further comprising the step of substituting an alternate set of visible components for the set of visible components prior to transmission to the user terminal via the first communication channel based upon the user terminal.
24. The method of claim 19, and further comprising the step of substituting an alternate set of visible components for the set of visible components prior to transmission to the user terminal via the first communication channel based upon the user terminal.
25. The method of claim 1, wherein the step of receiving a designated set of digital resources comprises the steps of receiving a set of visible components based upon the designated set of digital resources and receiving a set of non-visible components based upon the designated set of digital resources.
26. The method of claim 1, and further comprising the step of substituting an alternate set of the digital resources for the designated set of the digital resources prior to transmissions to the user terminal.
27. The method of claim 1, and further comprising the step of substituting an alternate set of non-visible components for the set of non-visible components based upon the designated digital resources prior to transmission to the user terminal via the second communication channel.
28. The method of claim 27, and further comprising the step of substituting an alternate set of visible components for the set of visible components based upon the designated digital resources prior to transmission to the user terminal via the first communication channel.
29. The method of claim 1, and further comprising the step of substituting an alternate set of visible components for the set of visible components based upon the designated digital resources prior to transmission to the user terminal via the first communication channel.
30. The method of claim 1, wherein the step of receiving the video programming signal comprises receiving the video programming signal from a source selected from the group comprising a cable television network, a video playback system playing previously recorded video, a live video feed, a computer network, a satellite broadcast network and a direct connect line from a broadcaster.
31. The method of claim 1, and further comprising the step of transmitting a request for the designated set of digital resources.
32. A method for delivering video programming and integrated, interactive digital resources to a user terminal, the method comprising the steps of: a) receiving a video programming signal; b) receiving a designated set of digital resources; c) combining visible portions of the received designated set of digital resources with the received video programming signal into a combined video signal; d) transmitting the combined video signal to a specified destination via a first communications channel; and e) transmitting a set of non-visible components based upon the received digital resources to the specified destination via a second communications channel.
33. The method of claim 32, and further comprising the step of parsing the designated set of digital resources into a set of visible components and a set of non-visible components.
34. The method of claim 32, and further comprising the step of receiving program scheduling information comprising time and placement information indicating when and where the designated set of digital resources are to be presented with respect to the video programming signal.
35. The method of claim 34, and further comprising the step of coordinating the combination of the video programming signal and the set of visible components, the transmission of the combined video signal and the transmission of the set of non-visible components based upon the received program scheduling information.
36. The method of claim 32, and further comprising the step of receiving an interactivity command from the user terminal triggered by an interaction between an end user and the transmitted video programming, the transmitted set of visible components and the transmitted set of non-visible components as rendered by the user terminal.
37. The method of claim 36, and further comprising the step of executing the received interactivity command.
38. The method of claim 32, and further comprising the step of selecting the user terminal.
39. The method of claim 38, and further comprising the steps of receiving program profile information and receiving user profile information and wherein the step of selecting the user terminal comprises matching received user profile information with program profile information.
40. The method of claim 32, and further comprising the step of transmitting a request for the designated set of digital resources.
41. A system for delivering video programming and integrated, interactive digital resources to a user terminal, the system comprising: a) a data store for storing video programming and digital resource related data; and b) a broadcast processing system comprising at least one processor in communication with the data store, the broadcast processing system's at least one processor for performing the steps of: i) receiving a video programming signal; ii) receiving a designated set of digital resources; iii) transmitting the video programming signal to a user terminal via a first communication channel; iv) transmitting a set of visible components based upon the received digital resources to the user terminal via the first communication channel; and v) transmitting a set of non-visible components based upon the received digital resources to the user terminal via a second communication channel.
42. The system of claim 41, wherein the broadcast processing system performs the further step of storing the designated set of digital resources in the data store.
43. The system of claim 41, wherein the broadcast processing system performs the further step of parsing the designated set of digital resources into a set of visible components and a set of non-visible components.
44. The system of claim 41, wherein the broadcast processing system comprises a control processor for managing scheduling information related to positioning, timing and controlling transmissions of the received video signal and the received designated set of digital resources to the user terminal.
45. The system of claim 44, wherein the control processor performs the transmitting steps of the broadcast processing system.
46. The system of claim 44, wherein the control processor performs the step of receiving schedule information from a schedule information source.
47. The system of claim 46, wherein the schedule information source is selected from the group comprising a bus connection, a computer network, a dial-up connection, a cable television network, a direct line connection, a wireless connection, a satellite broadcast network and the data store.
48. The system of claim 46, wherein the control processor's performance of the step of receiving scheduling information comprises receiving scheduling information from the Internet.
49. The system of claim 44, wherein the control processor performs the step of providing an interface by which a broadcast manager can create schedule information associated with a specified video programming signal.
50. The system of claim 44, wherein the control processor performs the step of providing an interface by which a broadcast manager can edit schedule information associated with a specified video programming signal.
51. The system of claim 44, wherein the control processor performs the step of storing scheduling information in the data store.
52. The system of claim 46, wherein the control processor performs the further step of transmitting a request for the scheduling information to the source.
53. The system of claim 41, wherein the broadcast processing system comprises a rendering processor for managing the rendering of transmissions to the user terminal.
54. The system of claim 53, wherein the rendering processor performs the transmitting steps of the broadcast processing system.
55. The system of claim 53, wherein the rendering processor performs the step of receiving scheduling information associated with the received video signal and the received set of digital resources.
56. The system of claim 55, wherein the rendering processor manages the rendering of the transmission to the user terminal according to the received scheduling information.
57. The system of claim 53, wherein the rendering processor performs the step of parsing the designated set of digital resources into a set of visible components and a set of non-visible components.
58. The system of claim 41, wherein the broadcast processing system comprises a dispatcher/collector processor for accumulating digital resources and video programming for transmission to the user terminal.
59. The system of claim 58, wherein the dispatcher/collector processor performs the receiving steps of the broadcast processing system.
60. The system of claim 59, wherein the dispatcher/collector processor receives the designated set of digital resources from a source selected from the group comprising a bus connection, a computer network, a dial-up connection, a cable television network, a direct line connection, a wireless connection, a satellite broadcast network and the data store.
61. The system of claim 60, wherein the dispatcher/collector processor receives the designated set of digital resources from the Internet.
62. The system of claim 59, wherein the dispatcher/collector processor performs the step of transmitting a request for the designated set of digital resources.
63. The system of claim 59, wherein the dispatcher/collector processor receives the video programming signal from a source selected from the group comprising over air transmission, cable television network, satellite broadcast network, direct line connection, and a computer network.
64. The system of claim 59, wherein the dispatcher/collector processor performs the step of parsing the designated set of digital resources into a set of visible components and a set of non-visible components.
65. The system of claim 58, wherein the dispatcher/collector processor performs the transmitting steps of the broadcast processing system.
66. A system for delivering video programming and integrated, interactive digital resources to a user terminal, the system comprising: a) first receiving means for receiving a video programming signal; b) second receiving means for receiving a designated set of digital resources; c) first transmitting means for transmitting the video programming signal and a set of visible components based upon the received digital resources to a user terminal via a first communication channel; and d) second transmitting means for transmitting a set of non-visible components based upon the received digital resources to the user terminal via a second communication channel.
67. A user terminal system for receiving video programming and integrated, interactive digital resources from a broadcaster, the system comprising: a) a display; b) an input device; and c) a broadcast reception system comprising at least one processor in communication with the display and the input device, the broadcast reception system's at least one processor for performing the steps of: i) receiving a combined signal comprising a video programming component and a component representing visible portions of a set of digital resources via a first communication channel; ii) receiving non-visible portions of the set of digital resources via a second communication channel; iii) presenting the combined signal on the display; and iv) monitoring for user interactivity from the input device based upon the received non-visible portions of the set of digital resources.
68. A user terminal system for receiving video programming and integrated, interactive digital resources from a broadcaster, the system comprising: a) a display; b) an input device; and c) a broadcast reception system comprising at least one processor in communication with the display and the input device, the broadcast reception system's at least one processor for performing the steps of: i) receiving a video programming signal via a first communication channel; ii) receiving visible portions of a set of digital resources via the first communication channel; iii) receiving non-visible portions of the set of digital resources via a second communication channel; iv) presenting the combined signal on the display; and v) monitoring for user interactivity from the input device based upon the received non-visible portions of the set of digital resources.
PCT/US2000/040717 1999-08-23 2000-08-22 Virtual hybrid interactive multicasting system and method WO2001015359A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP00969022A EP1212858A1 (en) 1999-08-23 2000-08-22 Virtual hybrid interactive multicasting system and method
AU78852/00A AU7885200A (en) 1999-08-23 2000-08-22 Virtual hybrid interactive multicasting system and method

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US15021499P 1999-08-23 1999-08-23
US60/150,214 1999-08-23
US19505400P 2000-04-06 2000-04-06
US19502700P 2000-04-06 2000-04-06
US60/195,054 2000-04-06
US60/195,027 2000-04-06

Publications (2)

Publication Number Publication Date
WO2001015359A1 true WO2001015359A1 (en) 2001-03-01
WO2001015359A8 WO2001015359A8 (en) 2001-07-19

Family

ID=27386934

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/040717 WO2001015359A1 (en) 1999-08-23 2000-08-22 Virtual hybrid interactive multicasting system and method

Country Status (3)

Country Link
EP (1) EP1212858A1 (en)
AU (1) AU7885200A (en)
WO (1) WO2001015359A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2389754A (en) * 2002-03-07 2003-12-17 Chello Broadband N V Interactive TV system where requests for services are transmitted to the provider over a packet network and responses incorporated in the TV broadcast signal
GB2441041A (en) * 2006-08-11 2008-02-20 Answerback Ltd Interactive Broadcasting
WO2009101609A1 (en) * 2008-02-14 2009-08-20 Alcatel Lucent A method for dynamically developing a programming schedule

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6049831A (en) * 1996-11-08 2000-04-11 Gte Laboratories Incorporated System for transmitting network-related information where requested network information is separately transmitted as definitions and display information
EP0849946A2 (en) * 1996-12-13 1998-06-24 Kabushiki Kaisha Toshiba Interactive TV broadcasting system and file access method applied thereto
US6097441A (en) * 1997-12-31 2000-08-01 Eremote, Inc. System for dual-display interaction with integrated television and internet content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Advanced television enhancement forum specification (ATVEF)", DRAFT VERSION 1.1R26, 2 February 1999 (1999-02-02), pages 1 - 37, XP002935716 *

Also Published As

Publication number Publication date
WO2001015359A8 (en) 2001-07-19
AU7885200A (en) 2001-03-19
EP1212858A1 (en) 2002-06-12

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: C1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

WR Later publication of a revised version of an international search report
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2000969022

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2000969022

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 2000969022

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP